AI Is Thirsty

Each chat with a large language model is like dumping a bottle of water on the ground

Clive Thompson
9 min read · Jul 30, 2023


Photo by Jacek Dylag on Unsplash

Today I used ChatGPT to get some help making a browser plugin. I posted my queries, then watched as the code and text spilled down the screen. This is the part of large language models that I dig! As a hobbyist developer, getting customized lines of code suggested to me can be a powerful way to learn.

But as it turns out, using ChatGPT consumes a lot of an unexpected resource:

Water.

The code wasn’t quite what I was looking for, so I chatted with ChatGPT for 15 minutes or so, slowly coaxing it to revise. By the time I was done, we’d gone back and forth about 20 times.

And during that exchange? Microsoft’s servers probably used about as much water as if I’d just bought a half-liter bottle … and spilled it on the ground.
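To put that in back-of-the-envelope terms: the researchers' estimate (discussed below) works out to roughly a half-liter bottle for every 20 or so prompts, or on the order of 25 ml per exchange at the high end. Here's a minimal sketch of that arithmetic in Python; the per-prompt figure and the helper name are illustrative assumptions, not measured values:

```python
# Back-of-envelope estimate of water consumed by a short ChatGPT session.
# Assumes roughly 500 ml per ~20 prompts (about 25 ml per prompt), the high
# end of the researchers' 20-50 prompts-per-half-liter estimate.

ML_PER_PROMPT = 500 / 20  # ~25 ml per prompt; an assumed rough average

def session_water_ml(num_prompts: int, ml_per_prompt: float = ML_PER_PROMPT) -> float:
    """Rough water footprint of a chat session, in milliliters."""
    return num_prompts * ml_per_prompt

if __name__ == "__main__":
    prompts = 20  # roughly the number of back-and-forth exchanges in my session
    print(f"{prompts} prompts ≈ {session_water_ml(prompts):.0f} ml of water")
    # -> 20 prompts ≈ 500 ml of water: about one half-liter bottle
```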

AI, it turns out, is incredibly thirsty tech — ploughing through torrents of fresh water every day. Given that we're likely to see large-language-model AI woven into ever more apps and appliances, it's worth pondering just how much water our booming use of AI will consume.

Why precisely does large-language-model AI require water? Back in April, a group of researchers pondered this question as they

