Episode Transcript
[00:00:05] Speaker B: Welcome to ChatGPT Curious, a podcast for people who are, well, curious about ChatGPT. I'm, um, your host, Dr. Shantae Cofield, also known as the Maestro, and I created this show to explore what ChatGPT actually is. Really though, are the files in the computer, how to use it, and what it might mean for how we think, work, create and move through life. Whether you're skeptical, intrigued, or already experimenting, you're in the right place. All that I ask is that you stay curious. All right, let's get into it.
Hello, hello, hello, my curious people.
Thank you for joining me for episode 2 of ChatGPT Curious. I am your grateful host, the Maestro, and today we are talking about ChatGPT and the environment: energy, water, and carbon emissions. So to start off, I made this episode number two intentionally. The environmental impact of ChatGPT is something that gets a ton of attention, but I couldn't help but wonder, like, is it a distraction more than anything? Is it accurate?
You know, what should I be believing? So my takeaway, after about a week of researching this, and I'm telling you, I was in the computer, in the files, in all the articles, my takeaway is that when it comes to chatbots, aka ChatGPT, and the environment, we should approach this very much like Ramit Sethi approaches personal finance. We gotta ask the $30,000 questions, not the $3 questions.
Your interactions with ChatGPT, I should say, use way less energy than whatever the sensationalized news is saying. If someone is trying to argue that if you are using ChatGPT, you don't care about the environment, your best bet is to take their hand and together go use ChatGPT to learn about the proven dial movers when it comes to the environment, which are going to be clean energy, a lower-impact diet, and efficient transport.
So in doing the research and outlining this episode, I was very aware that I didn't, and I don't, want it to come off as a giant whataboutism piece, right? Where it's like, you know, folks are like, ChatGPT uses energy. And then I'm like, but what about everything else that uses energy?
But honestly, it is a valid response when we consider that impact equals magnitude times frequency, right? So when we're investigating, we have to ask what is the actual magnitude of the action?
And then go from there, right? We gotta go and get numbers. And that is what I did. My eyes are falling out now; my legs have fallen asleep multiple times. But I went and got numbers. So I'm gonna break this episode into three parts. The first part's energy consumption. That's gonna be the bulk of it. Then there's water consumption and carbon emissions. And there's gonna be, like I said, a very heavy bias towards energy consumption, because that's what gets the most press. And as I was doing my research, it was clear that the trends for both water and carbon emissions follow the reporting trend for energy consumption. So understanding that will give you a good, you know, intro-level understanding of the other two. Right. So the last thing I want to say is I will link all the things in the show notes.
Do not forget that I put out a companion newsletter that you can subscribe to, where I share all of this information from the episodes, but in written format. You can sign up for that; it'll be in the show notes, but you can also go to chatgptcurious.com/newsletter. And for this episode I actually made a page on my website for the resources, because they cannot all go in the show notes. There are just too many of them. So I will link that in the show notes as well, but if you go to chatgptcurious.com/environment-resources it will be there. And lastly, I'm going to be fully transparent with everything: I want to give a special shout out, or nod if you will, to Substack. It was very helpful in guiding the direction of my search. So hopefully that doesn't color your view of this. Like, I went to actual articles and things like that, but I'm not gonna lie, Substack was very, very helpful. All right, so let's get into it and talk about energy consumption. How much energy does a single ChatGPT query use? This means both the question and the answer. And the answer is about 3 watt hours per query.
This likely means nothing to you and I want to put it in context for you, but I'm going to show you how that number was arrived at.
Of note, Sam Altman, head of OpenAI, did confirm in a June blog post, his own blog post, that a ChatGPT query uses 0.34 watt hours of energy. But like, can you trust him?
Like I, I don't think so actually.
Uh, so I want to show you how this number was actually arrived at from other sources.
And I'm also going to go into the weeds a bit so you can see why you cannot just believe everything you read or see or hear.
Yes, I personally want to serve as a credible, trustworthy resource for all of you, but I will die on the hill that reading comprehension and critical thinking are, uh, the two most important skills we can possess in these times. And so I want you to question things and absolutely go and look for them yourselves.
[00:05:29] Speaker A: All right?
[00:05:29] Speaker B: So as it relates to coming to this number of 0.34 watt hours of energy per query: if you go through the sources of things, and you're searching for this on the Internet trying to figure it out yourself, the majority of sources that pop up lead back to a single article by Alex de Vries. I don't know how to pronounce that name, my apologies.
And maybe it's a woman, now that I say that, I don't know. It leads back to this person, Alex de Vries, and an article that that person published in the journal Joule in November of 2023.
That article by Alex builds much of its argument on a quote that came out in February of that year. 2023.
In February of 2023, Alphabet's chairman indicated.
[00:06:22] Speaker A: Right.
[00:06:22] Speaker B: Alphabet, the parent company of Google. Alphabet's chairman indicated in February 2023 that interacting with an LLM could, quote unquote, likely cost 10 times more than a standard keyword search.
As a standard Google search reportedly uses 0.3 watt hours of electricity, this suggests an electricity consumption of approximately 3 watt hours per LLM interaction.
Alex put two sources in that quote there. The statement that it likely cost 10 times more than a standard keyword search, that was from the Alphabet chairman; it came out in a Reuters article in 2023. It was just something that he said, right? Something the head of Alphabet said in passing in that article.
And the second part, that a single Google search reportedly uses 0.3 watt hours of electricity?
That metric, that statistic, came from a 2009 Google blog post.
Okay, so I'm not saying it's wrong, but, you know my background: physical therapy. Many of you listening to this are coming from the health and wellness space, and we typically require a little bit more data and testing before we just go along with something, right? But this article is so commonly cited; if you go do any research, you will see this Alex de Vries article.
It's published, it's cited so many times, as just: this is the basis for things. An LLM search uses 10 times more electricity than a Google search, and a Google search uses 0.3 watt hours of electricity, and so therefore an LLM search must use 3 watt hours.
[00:08:14] Speaker A: Right.
[00:08:15] Speaker B: There's not that much to support this though, right? The 0.3 watt hours came from a blog post in 2009. Just one.
[00:08:23] Speaker A: Right.
[00:08:24] Speaker B: But Alex does go on in that article to write that this figure aligns with SemiAnalysis's assessment of ChatGPT's operating costs in early 2023, which estimated that ChatGPT responds to 195 million requests per day, requiring an estimated average electricity consumption of 564 megawatt hours per day, or at most 2.9 watt hours per request.
SemiAnalysis is an independent research and analysis company focused on the semiconductor and AI industries.
[00:08:55] Speaker A: Right.
[00:08:56] Speaker B: So there's some credibility there. Independent, like they feel like they have less skin in the game; they're not in someone's pocket, hopefully. But the post that Alex said agrees with what he came up with, here's what that said: our model is built from the ground up on a per-inference basis, that means per usage, but it lines up with Sam Altman's tweet and an interview he recently did. We assume that OpenAI used a GPT-3 dense model architecture with a size of 175 billion parameters. If you go listen to episode one, you guys know what I'm talking about. Hidden dimension of 16,000, sequence length of 4K. Now this is the important part: average tokens per response of 2,000.
[00:09:39] Speaker A: Okay.
[00:09:40] Speaker B: Average tokens per response of 2K. If you go and listen to episode one, I talk about this: that's a lot of output. 2,000 tokens is like three pages, single spaced. So this is saying that per question asked, ChatGPT, in this case GPT-3, the model before the one we're using right now, would spit out three pages of single-spaced text.
If you've used ChatGPT, you know that this isn't true, right? That's just an overestimation of what an average return is going to be. If you're doing more intense investigations, if you will, then yeah, it could come out with that. That's very long though; 2,000 tokens is a lot, right? The average response is probably more like 200 tokens. So that's like 150 words, right? It's like a solid paragraph single spaced, like half a page double spaced.
So if we take those numbers from that independent research, and it's saying that 3 watt hours per response would yield 2,000 tokens as an output, that means that, if the math is correct, we just need to correct for the actual usage, right? If we're going to take that number and say, okay, independent research, they're showing their math, okay, I believe that 3 watt hours is the amount of energy used per interaction with ChatGPT, but that interaction yields or outputs a 2,000-token response.
That's not how long responses are on average. They're about 10 times shorter than that.
So that would mean we take that 2,000 and divide it by 10, and we take that 3 watt hours and divide it by 10, and we get a much smaller number: 0.3 watt hours per query.
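That correction step can be sketched in a few lines of Python. The 3 watt hours and the 2,000-token figure come from the SemiAnalysis estimate discussed above; treating energy per query as roughly proportional to output length is a simplifying assumption.

```python
# Scale the SemiAnalysis-style estimate down to a realistic response length.
# Simplifying assumption: energy per query scales linearly with output tokens.

estimated_wh_per_query = 3.0   # watt hours, assuming a 2,000-token response
assumed_tokens = 2000          # SemiAnalysis's assumed average output length
realistic_tokens = 200         # closer to a typical ChatGPT response

corrected_wh = estimated_wh_per_query * (realistic_tokens / assumed_tokens)
print(corrected_wh)  # 0.3 watt hours per query, close to Altman's 0.34
```

Ten times fewer tokens out means roughly ten times less energy, which is how the commonly cited 3 watt hours collapses back toward 0.3.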
The flip side of this, though, because we're doing all the numbers, we check everything, is that the 2009 stat of 0.3 watt hours for a Google search is probably outdated and is likely closer to 0.03, right? Because things get better and more efficient over time.
So we see that the energy usage for ChatGPT was wrong by a factor of 10, but the energy usage for Google was also wrong by a factor of 10. So they cancel out, meaning that, yes, it can still be 10 times as energy expensive to do a ChatGPT search. But if we take that new number of 0.3 watt hours per query, suddenly it's a number that we were totally fine with for Google. Before, we were like, oh yeah, 2009. And then we use that same statistic in 2023, and it's like, yeah, a Google search uses 0.3 watt hours, and we didn't bat an eye, and yet suddenly it's a problem.
Additionally, ChatGPT is doing way more, right? It's doing way more with that output than a Google query.
But these numbers are likely continuing to be skewed.
Why? Because, at the very least, the Google number is definitely going to be skewed. Why? Because Google uses AI now.
So it is worth noting that, in addition to the fact that everyone's using the same article from Alex, which is based on a 2009 stat and a 2023 comment by the head of Alphabet, which seems like a little bit of a conflict of interest, comparing the two, and just knowing anything for sure as it relates to the numbers and the actual energy consumption, remains really difficult because no one shares shit, right?
There are different models within ChatGPT which I will do an episode about, and each of those models requires a different amount of energy.
[00:13:37] Speaker A: Right?
[00:13:38] Speaker B: And Google responses, like I said before, also now have AI. So it's like, well, how much energy is actually being used? What do we want? Transparency.
When do we want it? Right now.
So let's take those numbers, though, and do a little math, and we can give it some wiggle room. Sam Altman wrote 0.34 watt hours per interaction, and from that corrected estimate, about 0.3 watt hours per interaction.
So let's see what that means in terms of our daily living, but we'll also consider if it was 10 times as much, and say it is 3 watt hours per query. So I'm going to give you numbers for both: the smaller guesstimate, that one ChatGPT interaction uses 0.3 watt hours of electricity, and also if a ChatGPT query uses 3 watt hours. Because I think that it's all over the board, to be completely honest, especially with how dense some of these outputs can be, and just the different models. So let's give it some wiggle room.
Okay, now to do the math on this, we are going to do easy math in terms of daily usage, and I want to go on the higher end again. Let's say that we're going with what we'd assume would be a power user, doing like 100 queries a day, right? That means our boundaries are going to be: if a single query is 0.3 watt hours and we're doing 100 of those, that's 30 watt hours per day. If a single query is 3 watt hours and we do 100 of those, that's 300 watt hours per day. So our range is going to be 30 to 300 watt hours per day of usage.
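As a sketch, here is that power-user math in Python; the 100-queries-a-day figure is the deliberately high assumption from the episode.

```python
queries_per_day = 100        # deliberately high "power user" assumption

low_wh_per_query = 0.3       # roughly Altman's reported figure
high_wh_per_query = 3.0      # the commonly cited 10x estimate

low_daily_wh = queries_per_day * low_wh_per_query
high_daily_wh = queries_per_day * high_wh_per_query
print(low_daily_wh, high_daily_wh)  # 30.0 to 300.0 watt hours per day
```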
[00:15:31] Speaker A: Okay.
[00:15:33] Speaker B: In general, from what I found, the average user has about 8 to 14 interactions a day. So this is again on the higher side, but I'm like, let's round up, let's round way, way, way up. Okay, so, real life examples of how much energy this is. And this is 30 to 300 watt hours.
30 watt hours: that is watching Netflix for 16 minutes. That is driving an electric vehicle for an eighth of a mile.
That is driving a regular car for a third of a football field. That is driving my Jeep, my Jeep Wrangler that I love so much, that only gets 12 miles per gallon. Guess how far 30 watt hours of energy gets me. Go ahead and guess. Get a number in your head.
[00:16:19] Speaker A: Guess it.
[00:16:20] Speaker B: 50 feet, folks. 50 feet. I asked this to Lex and she was like, 100 miles. And I was like, no.
And this is why I did this, because we don't know this stuff. We don't look at this stuff on a day-to-day basis, and it's important to put it in context. 50 feet, 30 watt hours. So if I was to use 100 queries, that's 30 watt hours, and this is if it's at 0.3 watt hours per query, that would take me 50 feet. Heating your house with forced air, which is what we have? 21 seconds.
100 queries, at the lower end of 0.3 watt hours each: 21 seconds.
Now if we take the upper range of this and multiply by 10, and say that each query is 3 watt hours and we're doing 100 of those a day, that is 300 watt hours per day.
We just take the numbers from before and multiply them by 10.
So 100 queries: that would be the equivalent, in this case, of streaming for two hours and 40 minutes, or driving an electric vehicle 1.2 miles. And this is 100 queries, which most people are not doing. Maybe you're a power user and you're listening to this, but many people, again, have 8 to 14 interactions a day. So if you say 10 interactions a day, that's like 10 days' worth of stuff.
Many people watch a whole lot more YouTube or Netflix than three hours across 10 days.
This would be driving a regular car just over three football fields. Again, this is if each query costs 3 watt hours and we did 100 of those; that's 300 watt hours.
That energy would allow you to drive a regular car just over three football fields.
My Jeep? 500 feet. We're really going places now. That's a little bit over a football field and a half.
Heating a home: 3 minutes and 36 seconds.
100 queries, 3 minutes and 36 seconds.
So let's, let's channel Missy Elliott for a second and put my thing down, flip it and reverse it.
We got 30 watt hours at the low end, 300 watt hours at the high end, depending on how much energy a single query takes. And we're talking about 100 queries.
Okay, 100 queries is going to cost us anywhere from 30 to 300 watt hours. So let's flip this though, and see what one hour of normal tasks would cost us. Just because that's like an easy way to understand things.
One hour of heating your home with forced air costs 5000 watt hours.
One hour of heating your home, we're saying on the upper end, a hundred queries with ChatGPT, 300 watt hours.
One hour of heating your home, 5,000 watt hours.
Here's where we can see actionable change. If you were to switch and get a heat pump installed, that's a thousand watt hours for one hour, right? Heating your home with a heat pump for an hour costs, energy-wise, 1,000 watt hours. So we see a change there. I get it, I'm not saying, oh, just revamp your whole home, but you start to understand what actually moves the needle here.
I could not use ChatGPT for a day, as a power user, and save 300 watt hours.
Or you change the heating in a home, and in one hour you save 4,000 watt hours of energy.
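To make that comparison concrete, here's the same arithmetic in Python, using the per-hour heating figures from above; the "days equivalent" framing is one illustrative way of expressing the gap.

```python
forced_air_wh_per_hour = 5000   # heating with forced air, per the episode
heat_pump_wh_per_hour = 1000    # heating with a heat pump, per the episode
power_user_daily_wh = 300       # 100 ChatGPT queries at the high 3 Wh estimate

savings_per_hour = forced_air_wh_per_hour - heat_pump_wh_per_hour  # 4,000 Wh
days_of_chatgpt = savings_per_hour / power_user_daily_wh
print(savings_per_hour, round(days_of_chatgpt, 1))  # 4000 13.3
```

One hour of the heating switch saves roughly 13 days' worth of even heavy ChatGPT use.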
We understand that most people using ChatGPT are using like 10 queries a day.
10 queries a day. And then think about the output that it gives you and what it can help you with.
This is a $30,000 question, which is something like heating a home versus a $3 question, which is using ChatGPT.
Dishwasher. One hour of running a dishwasher, that's 1800 watt hours. Using a clothes dryer for an hour, that's 3,000 watt hours.
Again, I don't want this to be a whataboutism episode, but the point here is that, yes, we can be mindful of ChatGPT usage. This is by no means an episode that's just like, go do whatever the fuck you want, use it a million times a day. I'm not saying that. But if we really want to move the needle, it is about looking at the things that have a much more significant impact. And again, that's going to be energy sources, that's going to be lower-impact diets, and that's going to be transportation.
I do feel like the Joker here, right? Where it's like, say you're gonna watch an hour of Netflix, nobody panics. Tell someone you're gonna ask ChatGPT a question, and everyone loses their minds. Like, we gotta look at the actual math here.
[00:21:19] Speaker A: Right?
[00:21:20] Speaker B: I know that then people will say, but you gotta remember that these smaller numbers get multiplied by all the users, right? And that number is wildly varied. Like, the stats say there's like 500 to 800 million users, you know, 120 to 180 million daily active users.
What I actually think is important to take from that is that this speaks to the energy impact, the environmental impact, of what's called inference. And that's why I wanted to bring this up: if you read through things, you're going to see the word inference; that just means actually using ChatGPT.
And that's in comparison to the training of ChatGPT, which tends to get a lot of the headlines. In reality, inference is probably more significant than the training portion, just because there are so many people using it, and it does use a lot of energy. But we also want to take this in context with everything else that uses energy, and then think about what we can do to effect the most change.
Speaking of attention, I would also like to note that data centers are not just used for ChatGPT-style chatbots. What you're going to see in the news is: AI is drawing all this power, AI is forcing these things to be built. Yes, and I just want us to understand what's going on.
The majority of energy that goes into AI in general, like 80% of it, is for what are called recommender models. So the things we've already been using: "watch this next," feeds on social media, ads, search ranking. That gets the majority of the energy.
Then we have things like computer vision and speech.
[00:22:57] Speaker A: Right?
[00:22:57] Speaker B: Content moderation, auto captions, real-time translations, autonomous vehicle training. Waymo, I'm here for Waymo. Haven't tried it yet, but if it comes down to the South Bay, I'm here for it. But that data gets stored somewhere. Fraud scoring, search ranking, other things on the proverbial shelf of these data centers: streaming video libraries, app data, something called object storage, which is just videos, podcasts, photos, backups, things like that. All right, so we see ChatGPT catching all the smoke because it did kick off the AI arms race. So when we see that these data centers are being built, it's not just for ChatGPT chatbots; there's a big component of AI, and other things being used in there. But yes, a big component of it is AI, right? And what happened is that ChatGPT showed people that AI could be useful and impressive and usable and marketable. And so then Microsoft, Google, Meta, Amazon, they began reprioritizing AI at every single level. I saw a quote the other day on Threads, and it was like, everything that I see is powered by AI and packed with protein. And I'm like, oh my God. Yes, it's so American. Yes, it's true. It's true.
So I'm not trying to have this episode be a whataboutism piece. I'm not trying to have this episode be like, don't even care about how much ChatGPT you're using. I just want this to be objective.
If we really want to move the needle regarding the environment, we need to look at the bigger dial movers: clean energy, lower-impact diets, and efficient transportation.
As it relates to what's going on with AI and in AI, then yes, we should also look at specifically what type of AI. And this podcast and what we're talking about specifically relates to ChatGPT.
And the actual imprint of that is much smaller than people would have you believe.
Which is problematic to me, because I think it is a very valuable tool. And so, yes, take individual responsibility, use it wisely, use it responsibly, think about what you're doing. But it very much is that impact equals magnitude times frequency, and it feels very akin to the plastic straw thing, where it's like, okay, I got paper straws, but I'm flying on a transatlantic flight, right? We gotta look at what's actually going to move the needle here.
[00:25:12] Speaker A: Okay, so.
[00:25:16] Speaker B: To round out this section, I'm just going to read a quote from one of the articles; all of the articles, again, are going to be on that resource page. Josh You wrote: more transparency from OpenAI and other major AI companies would help produce a better estimate. Ideally, we could use empirical data from the actual data centers that run ChatGPT and other popular AI products. This may be difficult for AI companies to reveal due to trade secrets, but it seems to me that public confusion on this topic, including many exaggerated impressions of the energy cost of using AI today, is also not in AI developers' interests. Taking a broader view, by some estimates AI could reach fairly eye-popping levels of energy usage by 2030, on the order of 10% of US electricity. For this reason, I don't dismiss concerns about AI's overall impact on the environment and energy, especially in the longer run.
However, it's still an important fact that the current marginal cost of using a typical LLM-powered chatbot is fairly low by the standards of other ordinary uses of electricity. My man nailed it. So let's go on to water. This is going to be a longer episode, folks, but it's not going to be like eight hours long. And also, listening to a podcast? Very low energy usage. Streaming a YouTube video, a lot more. There's just a lot to cover in this. So that is our biggest section. We're going to go into water, and then we'll wrap it up with carbon emissions. So: water.
I went into the weeds a bit for energy because it gets all the press. But what I found when digging into water and carbon emissions largely mirrored that same trend.
[00:26:59] Speaker A: Right?
[00:27:00] Speaker B: The numbers that are presented seem huge and problematic for three reasons. Number one, we have been blissfully unaware and ignorant of how much energy and water we actually use on a daily basis,
in regards to day-to-day tasks and what our actual water and carbon footprints are, which I'm going to get to.
Number two, we need more transparency, right? These numbers seem problematic because we actually don't know: is it this much, is it not this much? We need more transparency. And number three, at scale, they do add up, right? So again, this is not a whataboutism piece. This is not, on the other end of it, "it doesn't matter, I don't think about it at all." I want to be in the middle as much as I can. But more than that, I just want to be objective.
[00:27:45] Speaker A: Right?
[00:27:45] Speaker B: So something to understand, I just want to give you some background about water footprints, because if you go and do any research, these kinds of things will come up, and it's good to have some familiarity with them. Something to understand about a water footprint is what are called the three scopes, right? There are different things to consider, different factors that get factored in when doing the math and calculating a water footprint. Scope 1 refers to server and facility cooling. Scope 2 refers to thermoelectric power generation.
[00:28:21] Speaker A: Right.
[00:28:21] Speaker B: So for those who don't know, thermoelectric power works by burning fuel, like coal or natural gas, or using nuclear reactions, ideally something else, but this is what it's using, to heat water into steam, which then spins a turbine to generate electricity. Right? So the way that we generate electricity is by spinning something really fast: we heat up water, make steam, the steam spins something really fast, and that generates electricity.
With that in mind, and this is a little down the road, but if we're thinking about renewable energy, it's just different ways to heat that steam, different ways to get that thing spinning. After the steam is cooled, which is often done using water from rivers or cooling towers, it condenses back into liquid, and then we repeat the cycle.
Scope 1 is the amount of water used for the server and cooling at the facility. Scope 2 is thermoelectric power generation. This is usually done off site, and this is actually typically where the majority of the water usage comes into play.
There's a thing called water withdrawal, and then water consumption. The numbers are typically reported as water consumption, and that's water that's not returned to the system. Water withdrawal is how much water is taken out, but much of that oftentimes gets returned. Whereas with consumption, that means it's just gone; it evaporates.
[00:29:46] Speaker A: Right.
[00:29:47] Speaker B: Or it doesn't get put back into the system. The majority of that is actually happening in Scope 2, which is important to understand, because a lot of the data that's out there talks about Scope 1, which is what's going on at the server side, the cooling at these data centers.
That's only like 15% of the total water that's actually needed for this. The majority is going to be Scope 2, which is happening wherever the power for this thing is being generated. And then Scope 3 is the actual hardware manufacturing and transport, which does require water. So yes, I would love to see more transparency and have all three of these things reported, and actually a bigger focus on number two. So how much water does a ChatGPT query actually use? Well, per Sam Altman's blog, that same blog from before, it uses 0.000085 gallons of water. That's about 0.32 milliliters. Neither of these numbers really means anything to me; I'm like, what does that even mean? But for those of you that know that stuff, great, that's how much it uses.
But honestly, I do think that he's using misleading numbers here. And I know I just said I don't really understand what these numbers mean, but I can understand what they mean in the context of other numbers that have been presented, and it just seems very, very low.
I'm not sure that it takes into account the training cost, which other studies do; they call it amortization, and basically they take the training cost and divide it over the, you know, uses.
So that can actually end up being marginal, because it now gets used so much; that's that inference that I spoke about earlier. But I also wonder if he's wrapping in the other scope metrics, or if he's just talking about the data center.
[00:31:23] Speaker A: Right?
[00:31:24] Speaker B: This could possibly be just Scope 1 data, you know, Scope 1 information. And remember, like I said, only about 15% of the total water usage occurs at the data center; 85% is off site, wherever it's getting power from.
So maybe you're listening to this and being like, well, Maestro, how come you were okay with the energy number, but now you're saying the water number is really low?
Well, that energy number could very well be wrong too. But that's also why I gave a range for my examples and said anywhere from 0.3, which is what he said, to 10 times that, which would be 3 watt hours. So that's part of it; I tried to factor that in. But on the energy side, there are other outside studies that corroborate it, and it is easier to do back-of-the-napkin calculations for energy, because energy would appear on the quote unquote utility bill, so to speak, because it's happening at the site. Whereas water, especially the amount of water that often matters the most, Scope 2, for that thermoelectric power, happens off site. So you're not going to see it.
Um, but one of the articles that comes up a lot, and that's referenced a lot, is a 2025 article titled Making AI Less Thirsty.
And in that, it showed data that it took anywhere from 10 to 50 requests for GPT-3 to consume a 500 milliliter bottle of water. That's the regular 16-ish ounce water bottle that we've all bought at one point in our lives. The average of that works out to about 33 requests, and that's what I'm going to use to put the numbers into context. Right, so 33 queries, 33 requests. And again, that's the back and forth.
[00:33:16] Speaker A: Right?
[00:33:17] Speaker B: So it's the question and the answer. 33 of those.
[00:33:21] Speaker A: Right.
[00:33:21] Speaker B: I just want to be clear on that one request is the request and the response. Okay, so 33 of those uses one bottle of water.
I'm not saying that that's nothing. One bottle per 33 requests kind of seems massive when I say it, because we think about that one bottle of water that we bought that one time. Honestly, I don't drink a lot of water. I'm not even gonna lie, and I don't feel bad about it either. I'm just not a big water drinker. But we're like, oh yeah, I bought that one bottle. It's not that much.
And again, hat tip back to Substack here. There's a Substack by a guy named Andy Masley, and he makes this point very well. And again, I link all of these things on that resource page.
Um, so on one hand, one bottle of water per 33 requests seems like a lot, because we can conceptualize it: I bought water that one time, and a bottle feels like a lot.
But we never think of anything else in terms of water bottles consumed, especially not in our day to day.
So let's do some math here. 33 requests, one bottle. Most people are doing about 8 to 14 requests per day.
[00:34:33] Speaker A: Day.
[00:34:34] Speaker B: So it's not a bottle per day.
But let's look at the power user. I'm definitely a power user; I use this thing a lot more. If we're saying 100 requests a day, then that's about three bottles.
I saw a thing that said 20 requests. It wasn't a paper, it was an article, and when I say article, I mean a publication, like the Washington Post online. Actually, I saw a really, really bad Washington Post article. Don't go find that one. That thing is trash. They seem to confuse watt hours and kilowatt hours, and that's very bad. That's a factor of a thousand.
Um, but I saw one that said 20 requests equals one bottle, which is interesting because it came from this same study. So whether it's 20 requests per bottle or 33 requests per bottle, that's like three to five bottles per day for a power user.
It's not insignificant. I'm not here to be like, it doesn't matter.
But let's compare apples to apples and look at electricity use per person per day in the U.S. according to the Energy Information Administration, the EIA. I was all up in the websites, folks. I was everywhere.
The average person in the United States uses about 11.4 kilowatt hours of energy per day. So that's 11,400 watt hours.
Before, we were talking about watt hours, and I want to keep the units the same. We said anywhere from 0.3 to 3 watt hours per query.
We're saying that the average person uses 11,400 watt hours per day in electricity. Now, we're in the water section, so here's the link: it takes about two gallons of water to produce one kilowatt hour of electricity.
[00:36:24] Speaker A: Okay.
[00:36:25] Speaker B: Um, according to the US EPA. So if we do the multiplication, that's the two gallons of water times the 11.4 kilowatt hours.
That is 22.8 gallons of water per day that you use, that I use, just for our electricity needs. And I probably use more than that. I say this not to absolve myself by any means, but because I'm not here on a pedestal or in a glass house or anything like that. I'm right there with you in the trenches, thinking, I should probably look at my life.
It's been very eye opening for me.
So 22.8 gallons of water per day just for our electricity needs.
22.8 gallons of water is 173 bottles. This is just for electricity, folks. I didn't get anything else. This is just for electricity.
We're going to add to that another 80 gallons per day that comes out of the faucet, right? And there are resources for this on that page I told you I'm linking. You can go and check this.
[00:37:33] Speaker A: All right?
[00:37:34] Speaker B: 80 gallons per person per day for things that come through the faucet: drinking, showers, toilets, brushing teeth, things like that.
That's 606 bottles of water.
So 606 plus 173. That's 779 bottles of water.
Just from electricity usage and water usage per person per day.
779 per day, right. And we said a ChatGPT power user is at three to five bottles a day.
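To keep all of that arithmetic straight, here's the whole back-of-the-napkin calculation from this section in one place, as a rough Python sketch. The 33-requests-per-bottle, 11.4 kWh, 2 gallons-per-kWh, and 80-gallon figures are the ones quoted above, not independent estimates:

```python
# Back-of-the-napkin water math from this section, all in one place.
ML_PER_GALLON = 3785.41
BOTTLE_ML = 500  # the standard 500 mL water bottle

def gallons_to_bottles(gallons):
    """Convert gallons of water into 500 mL bottles."""
    return gallons * ML_PER_GALLON / BOTTLE_ML

# ChatGPT power user: ~100 requests/day at ~33 requests per bottle.
chatgpt_bottles = 100 / 33

# Electricity: 11.4 kWh/day times ~2 gallons of water per kWh.
electricity_bottles = gallons_to_bottles(11.4 * 2)

# Faucet: ~80 gallons/day for drinking, showers, toilets, etc.
faucet_bottles = gallons_to_bottles(80)

print(round(chatgpt_bottles))       # → 3
print(round(electricity_bottles))   # → 173
print(round(faucet_bottles))        # → 606
# Rounding each figure first, as in the episode:
print(round(electricity_bottles) + round(faucet_bottles))  # → 779
```

Three bottles against 779: that's the comparison the rest of this section is built on.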
Now, folks, where it really starts to get crazy is if we factor in food and the water that's required to produce that food, especially beef. And dude, I have a burger for breakfast. I'm putting my shit out there, right? Like, I have a burger for breakfast.
We do a bison for dinner, not every night, right? But when we factor in the amount of water that's required to produce food, especially beef, the amount of water per person per day can go as high as 1,800 to 2,200 gallons.
That's insane. 2,000 gallons of water.
That's about 15,000 bottles of water.
15,000.
And we're worried about three to five as a power user.
Again, I'm not saying it doesn't matter. I'm saying, hey, if we want to do something about it, let's look at the things that move the needle the most.
Because I do not want this to be a, uh, whataboutism episode.
[00:39:28] Speaker A: Right?
[00:39:28] Speaker B: But there are articles out there saying that by 2028, AI in the US could require as much as 720 billion gallons of water annually just to cool the AI servers.
That's a lot.
But irrigation in this country uses 27 trillion gallons a year. Irrigation in this country uses about 75.7 billion gallons per day.
So we have the numbers side by side. By 2028, AI in the US could require as much as 720 billion gallons of water annually.
Irrigation in the US uses about 75.7 billion gallons per day.
That's 27 trillion gallons a year.
And thermoelectric power uses a trillion gallons per year. Suddenly it's like, hey, wait a minute, where's all the water actually going?
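To put those figures on the same footing, here's a quick sketch using the annual numbers quoted above. The 720 billion gallon figure is the projected 2028 AI number, and the others are the per-year totals as stated in this episode:

```python
# Compare projected AI water use against other big US water uses (gallons/year).
ai_2028 = 720e9        # projected AI server cooling by 2028
irrigation = 27e12     # US irrigation per year, as quoted
thermoelectric = 1e12  # US thermoelectric power per year, as quoted

# AI's projected share relative to irrigation alone:
share_pct = ai_2028 / irrigation * 100
print(round(share_pct, 1))  # → 2.7, i.e. under 3% of what irrigation uses
```

Even at the projected 2028 peak, AI cooling water is a single-digit percentage of irrigation water. That's the magnitude point.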
The big takeaway for this water section, in my opinion. And I'm going to really push Liz the Developer here. That's her Instagram handle, and I will link it on that resource page. She made a site just for environmental concerns, because she was getting so many of them, and her point is: guys, this is actually a small thing to worry about. We have way bigger concerns regarding AI, like the fact that some of the worst people ever are using it, and the way that it's trained, than the environmental impact. And no, that's not saying don't care about it at all. Just understand the magnitude of things.
Um, but definitely check out her work on that. My big takeaways from all the research that I did on water: number one, follow someone that's way smarter than me regarding agricultural changes, what can be done, and what we can vote for, things like that.
Second takeaway: AI's environmental footprint is largely determined by the source of the electricity that powers it.
[00:41:56] Speaker A: Right.
[00:42:14] Speaker B: Renewable energy would significantly decrease the impact of AI. Understand here that the AI itself is not problematic. The technology itself is not problematic. It's how it's being powered. Yes, it needs a lot of power, but it could be powered by cleaner energy, by renewable sources.
[00:42:13] Speaker A: Right.
[00:42:14] Speaker B: So understand that AI's environmental impact is primarily a function of energy sources, not the technology itself.
It is getting more efficient.
[00:42:22] Speaker A: Right?
[00:42:22] Speaker B: The technology is also getting more efficient, though I will say that could be offset by the increased usage.
[00:42:26] Speaker A: Right.
[00:42:27] Speaker B: The big dial mover at the end of the day is how it's powered, and that means renewable energy. Which is why it's sad to see what we do see. If you actually watch the governmental proceedings and things like that, Trump keeps saying clean coal. That's not even a real fucking thing. That is a marketing name.
It's problematic. We don't want things powered by coal. We want solar, we want wind.
[00:42:55] Speaker A: Right.
[00:42:55] Speaker B: We want a hybrid of things.
But coal is not going to be the ideal way to power this.
[00:43:02] Speaker A: All right.
[00:43:02] Speaker B: But again, follow someone else for whom this is what they study and what they do. What I want you to take away from what I'm saying is that that's the biggest dial mover. It's not "stop using ChatGPT." That's not actually going to move the dial, and I think it's actually detrimental, because ChatGPT can be so helpful. It can help you learn so much. And I'm excited to get into that in other episodes. I'm excited to hear how you folks are using it and what you're learning.
[00:43:26] Speaker A: Right.
[00:43:26] Speaker B: You know, AI is being used to try and do the math behind this, and solve these environmental problems, and figure out how to create more sustainable and renewable energy.
But that is what's going to move the dial. It's going to be the energy sources that are powering it.
[00:43:42] Speaker A: Right.
[00:43:43] Speaker B: So the action item here is pushing for transparency in general, so you can know what the fuck's happening. We definitely, 100%, do need more transparency. And we also need to push for renewable energy, not just for AI, but for the whole infrastructure.
[00:43:59] Speaker A: Right.
[00:43:59] Speaker B: Uh, and, and what's powering it.
So, last part of this episode: carbon. This is the shortest part, largely because folks seem to talk about it the least, at least in what comes in the news and what comes across my desk, like my cat Rupert's bringing it to me. You understand what I'm saying. But I'm keeping it the shortest because the trend and the takeaway are largely the same as for energy and water.
One needs to know two things in order to calculate the carbon footprint of any model: one, the amount of electricity it consumes, and two, the carbon intensity of that electricity, AKA how the electricity is produced.
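That two-factor relationship is simple enough to write down. Here's a hedged sketch: the per-query energy is the 0.3 watt hour figure from earlier in the episode, and the two grid intensities (coal on the order of 800+ gCO2 per kWh, wind on the order of 10) are rough lifecycle ballparks I'm supplying purely for illustration, not numbers from this episode:

```python
# Carbon footprint = electricity consumed * carbon intensity of that electricity.
def query_emissions_g(watt_hours: float, g_co2_per_kwh: float) -> float:
    """Grams of CO2 for one query, given its energy use and the grid's intensity."""
    return watt_hours / 1000 * g_co2_per_kwh

# The same 0.3 Wh query on two very different grids (illustrative intensities):
coal_g = query_emissions_g(0.3, 820)  # coal-heavy grid
wind_g = query_emissions_g(0.3, 11)   # wind-dominated grid

print(round(coal_g, 3))  # → 0.246 grams of CO2
print(round(wind_g, 4))  # → 0.0033 grams, dozens of times less for the same query
```

The point of the sketch: the query is identical in both cases. The entire difference in carbon comes from the second factor, which is exactly why the energy source is the dial mover.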
So we don't have that, right? We kind of do, but we don't, and this is where you see how much transparency would help. Yes, we can get some theoretical numbers, but it's like it was with the energy side of things: we need more transparency, and right now we're just guessing. So, without completely losing you with numbers.
[00:45:00] Speaker A: All right.
[00:45:00] Speaker B: And without giving you more numbers, what I want you to take away is that, just like with the other two sections, energy and water, the numbers in mainstream media may be inflated, or at least misleading. So I suggest that if you see them, go compare them to the carbon footprint of the biggest contributors in other areas, so you can get a reference point. And yes, this requires some effort on your part.
Again, we need more transparency around how much energy this thing actually uses and what energy sources it's actually coming from.
And if we look at what the solution ultimately is, once again, it is renewable energy. It's not "use it less," because even if you stop using it, it's being used for other things. Even if you say, I'm not using ChatGPT, okay, your usage is a fraction of this footprint. A fraction. It's not gonna make a difference. And I hate to say this because someone's gonna pull that out as a sound bite, like, don't do that. I understand that sound bite by itself could be problematic, but individual use really is a fraction of the total energy consumption. So not using an LLM is not the solution. Pushing for renewable energy sources, that is the solution.
So let's summarize this bad boy, this episode, right?
It is the wild west out there when it comes to searching for this stuff, and a lot of the searching leads in circles, with the same few articles being cited over and over and over again, both academically and journalistically. When I say journalistically, I don't know if that's a real word, but what I mean is mainstream media and Substack, both ends of the spectrum, kind of citing the same people. Clickbait headlines.
Estimates tend to be overstated, and more transparency would be helpful, right? Because, one, the numbers then couldn't be as easily overstated and manipulated. But the transparency is also important because the models are changing and the hardware is changing, so we don't know how much energy a query uses. To just say, okay, it's 0.34 watt hours: for how long of a query? Is that using o3 with deep research? Is that using 4o? Is that 4o mini?
[00:47:37] Speaker A: Like what.
[00:47:37] Speaker B: What model are you using?
[00:47:41] Speaker A: Transparency.
[00:47:41] Speaker B: We all want transparency.
Maybe I'll do another episode on why I think it's not transparent, but I think it's about money. I think it's about continuing to get funding. I think it's this arms race. Again, it's money. If you say that yours is bigger, it's better, it does the best thing, it's the fastest thing.
It's the most efficient, it doesn't use water, it's clean, it's great, then you get more funding, because no one does any research, and what people want is more money. So, problematic. Give me more transparency.
Um, but without this becoming a whataboutism piece, I just want you to understand that this energy and water consumption pales in comparison to what is already being used elsewhere. And if we really want to make a change, we should go after the big dial movers: clean energy, a lower impact diet, and efficient transport.
Additionally, this can be and is incredibly helpful, right? It's not like this energy is being spent on shenanigans. LLMs can be incredibly, incredibly helpful. So yes, I'm okay that energy is being used for this, but let's be mindful of it. And then let's also see what else energy is being used for and fix that, especially because that consumption is way, way more.
[00:49:01] Speaker A: Right?
[00:49:01] Speaker B: So like, you know the initial stat: an LLM search uses 10 times as much energy as a Google search. Well, yeah, because it's doing a hundred times better a job.
[00:49:12] Speaker A: All right.
[00:49:12] Speaker B: And we can't even say that number is still true, because Google search uses AI now. We could compare this to streaming, which ChatGPT uses significantly less energy than, and that comparison isn't flattering for us, right? We'd be sitting there, Netflix and rot, Netflix and melt your brain. And I do this. I love watching Netflix. I love having it on in the background. I'll put Selling Sunset on while I do work.
And it's like, okay, let's think about how we're using things and let's compare to what's actually being used and look at it in that context and then say what can we do?
So the flip side: yes, I am also concerned about the growing energy usage, especially as it relates to LLMs, because of what are called agentic models. I'll do an episode about this, but it's basically AI using AI to complete tasks, LLMs using LLMs to complete tasks.
Suddenly that's a lot more compute. Uh, that's a lot more requests going on. So this episode is by no means a permission slip to not care.
By no means. If you are concerned about the environment and that's why you don't want to use ChatGPT, okay. I'm never here to force you to use it. That's fine.
But also, you should consider doing things that more significantly contribute to that equation I gave earlier: impact equals magnitude times frequency.
My guess is that you probably want to use ChatGPT; that's why you're listening to this podcast. Either way, my main thing is that I'm never here to force you to do anything that you don't want to do.
If you are concerned about the environment and you want to use ChatGPT, go ahead.
It's a both-and, right? We can be mindful of how we use ChatGPT, which means being more efficient with prompts, being aware of what model you're using, using the leanest model that does the job. And I will go into that in another episode.
But also doing the things that more significantly contribute to that equation of impact equals magnitude times frequency.
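As a rough illustration of that magnitude-times-frequency idea, here's a sketch comparing a power user's daily query water against a single beef burger. The per-burger figure of roughly 450 gallons for a quarter-pounder is a commonly cited rough estimate, not a number from this episode, so treat it as illustrative only:

```python
# Impact = magnitude x frequency, applied to daily water use.
GALLONS_PER_BOTTLE = 500 / 3785.41  # a 500 mL bottle, expressed in gallons

# ChatGPT power user: ~100 queries/day at ~33 queries per bottle.
query_water_gal = (100 / 33) * GALLONS_PER_BOTTLE

# One quarter-pound beef burger: ~450 gallons (rough, commonly cited figure).
burger_water_gal = 450

print(round(query_water_gal, 2))                  # → 0.4 gallons/day
print(round(burger_water_gal / query_water_gal))  # → 1124 days of heavy use per burger
```

Under those assumptions, one burger's water footprint covers roughly three years of power-user querying, which is why diet shows up on the dial-mover list and querying doesn't.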
[00:51:05] Speaker A: All right?
[00:51:06] Speaker B: And the magnitude of what we're doing with ChatGPT is very marginal compared to the other things that already exist. Again, I'm not saying don't worry about it, don't think about it, don't be concerned about it. I'm saying both-and: go ahead and be mindful of it, and then make sure that you're doing the things that are actually going to move the needle.
So, specifically, what could you do? Because I'm over here saying we need renewable energy, and a lot of this is about policy and infrastructure and where these data centers are being put. Honestly, part of that is what the local government is doing, and whether they're fucking reading anything. My guess is that a lot of them aren't; they're just looking at the incentives. So they put in a data center and suddenly there's no water, right? You gotta read the fine print and understand what this actually means, and we gotta push for renewables. In certain places, we're looking at desalination plants. It is getting in the weeds. But specifically, what can you do?
You could walk, right? That's a start. Walk. I try to walk more now, for sure. I coach once a week at a gym that's literally on the block for me, and I'm walking instead of driving my Jeep. I love that Jeep, but instead, I'm gonna walk. Get an e-bike. Yes, there's still a cost to that, but it's better than driving. Use LEDs in your house. Turn off the lights. Take shorter showers. Turn off the AC. We saw the difference one hour of forced air makes, what did I say, 5,000 watt hours just for an hour of use. Turn off the AC, turn off the heat. Less Netflix and chill. Go advocate for things. Vote. Look at your energy bill. I'm a little scared to, too, but look at your energy bill. Change your diet. Go follow and learn from an environmentalist who you like. Just pick one of those things and get started.
I realize that, you know, this is a long episode and everything that I've been saying, all the numbers, it just sounds like a lot and maybe it's a giant eye roll to you.
This episode took me a little under a week to put together, and I'm not gonna lie, I'm really proud of it. It was lots of research and going in circles and lots of numbers. And I don't say this to brag. I say it because the first thing that comes to mind when thinking about that lift is this idea of participatory democracy. If we want the freedoms and benefits of something as awesome as this, we have to participate in shaping it. With great power comes great responsibility, and that starts, in my opinion, with awareness and understanding. My goal with this episode is to make you aware of it and hopefully help you understand it a little bit more. I know I speak quickly, and I know that I threw a lot of numbers out there. I tried to keep them not too difficult.
You can go watch it again. I don't know where you'd be watching it. You can go listen to it again.
Don't forget I got the newsletter if you want to read. And don't forget I got that list of resources on my website, chatgptcurious.com/environment-resources. I'll link it in the show notes. The list is just too long to put in the actual show notes.
Um, but go there and look for yourself and do some reading. It's a start, right? So I'm gonna wrap the episode up there. Like I said in the last episode, I want to also have a segment, I guess I'll call it a segment, where I say how I used ChatGPT recently or today. And clearly, I used it quite a bit to help me find sources and to do math for this episode. Of note, yes, I did use the quote unquote more power hungry o3 model quite a bit, the deep research model, as it was a better fit for what I needed and some of the more complex reasoning when I was trying to work through ideas. I literally felt like I had to become a physicist and an environmentalist and a, what are they called? Oh my God, I'm totally missing the word right now. Wow, I got through so much of the episode without losing my thought, and I can't remember it. Either way, I literally felt like I was becoming a physicist and an environmentalist, working on all this stuff to try and record this episode and understand the numbers and everything. So yeah, that's how I used it most recently. It's my hope that you can use this episode as a resource, and if you found it helpful, do me a solid and share it with somebody who is curious about ChatGPT.
Don't forget I got that companion newsletter that drops every Thursday, and it's basically the podcast episode for that week in text format. So if you prefer to read, or want a written record that you're probably never gonna look at again, join the newsletter fam. You can head to chatgptcurious.com/newsletter or check out the link in the show notes. The word I was thinking of was engineer. It came back to me. Engineer. I'd become an engineer. Literally though, folks, trying to understand power grids and desalination and water consumption, and I've got literally just the tip of the iceberg. But again, these are things that ChatGPT can help you with, and help me with. So any questions, any comments, any concerns, additions, subtractions, requests, anything like that, head to the website and use that contact form, because I would love to hear from you. I know this was a longer episode, so thank you for sticking it out. This topic is nuanced and it requires that much more real estate. In the end, I really, really do think that our curiosity will benefit us. So, one last summary and then I'll wrap it up. The numbers in the clickbait titles are misleading and at times incorrect. They typically overshoot the individual impact of using ChatGPT. An individual uses 11,400 watt hours of electricity per day, and yet we're worrying about 30 to 300 watt hours from using ChatGPT.
We're using one to three bottles of water for these same queries, or I'll even round it up, one to five bottles of water, when we use about 15,000 bottles of water a day.
[00:57:05] Speaker A: Right.
[00:57:06] Speaker B: Electricity, water that's used at the faucet, and then food.
[00:57:14] Speaker A: Right.
[00:57:14] Speaker B: We're worried about one to three, one to five bottles of water coming from ChatGPT usage, whereas we use 15,000 bottles a day.
And if we go to the bigger picture again, US irrigation uses about 75.7 billion gallons per day. There are bigger things that will move the needle for us.
The biggest dial movers for OpenAI, which is the parent company of ChatGPT, are where the data centers are built and what energy they use. So as an individual, yes, be mindful of your use, but your biggest dial movers for helping the environment are clean energy, a lower impact diet, and efficient transport.
That's all I got. All the summaries I got, that's all the announcements I got. As always, endlessly appreciative for every single one of you. Thank you for tuning into this one. Thank you for sticking with me. Truly, truly, truly grateful.
Until next time, friends, stay curious.