What I’ve Learned From The Worst Computer Bugs

Be humble in the face of software complexity

Clive Thompson
7 min read · May 27



Weirdly, I kind of enjoy encountering software bugs.

My own bugs, to be clear! I’m an amateur self-taught programmer who writes little tools to assist in my journalism (or makes things just for fun, like my Weird Old Book Finder). Because I’m just a weekend hacker, I make a lot of mistakes while coding — so I’m constantly puzzling over the strange behavior of my software, and hunting down and fixing my own bugs.

My family is thus accustomed to seeing me sit there for hours, only to suddenly shout “aha!” and slap my forehead (followed by some gentle self-recrimination, like “ugh, how totally, totally obvious!”).

Bug-fixing is detective work, which makes it deeply satisfying. As with a parlor-room mystery by Agatha Christie, you typically have all the necessary clues laid before you. The challenge is spying the throughline that ties them together and solves the puzzle. The surge of pleasure that comes from fixing a bug is the same narcotic jolt of delight you get when you tease out the solution to a mystery novel before the author reveals it.

Bugs are also humbling. They teach humility in the face of complexity, because there are so damn many ways to screw things up in code.

I was thinking of this the other day while reading archives of historic bug reports. And since I love making lists — and in fact no longer even try to resist — here’s my roster of Six Historic Bugs And What They Taught Us:

1) The Y2K bug

This is maybe the most famous software flaw of all time, simply because of how widespread it was.

The problem began back in the 60s and 70s, when businesses and government agencies were beginning to seriously embrace software as the way to run their operations. Back then, memory was expensive, so developers tried to save space anywhere they could. When they wrote code to manage dates, they often truncated the year to the final two digits — so “1989” would be “89”. As one former COBOL programmer told me a couple of years ago, “back then, resources were so tight that saving those two digits really mattered. And nobody expected the code…
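To see why those saved digits backfired, here's a minimal sketch, in Python rather than the era's COBOL, of the implicit-century assumption that caused the trouble (the function name and values are illustrative, not from any real system):

```python
def parse_year(two_digit: str) -> int:
    # Two-digit years saved precious memory; programs quietly
    # assumed the missing century was always "19".
    return 1900 + int(two_digit)

# Fine throughout the twentieth century...
birth_year = parse_year("89")          # 1989
age_in_1999 = 1999 - birth_year        # 10

# ...but on January 1, 2000, the same logic misfires:
age_in_2000 = 2000 - parse_year("00")  # 100, for a newborn
```

Any calculation that spanned the century boundary, such as ages, interest periods, or expiry dates, could silently come out wrong by a hundred years.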
