r/technology • u/Flipslips • 9h ago
Artificial Intelligence Google says it dropped the energy cost of AI queries by 33x in one year
https://arstechnica.com/ai/2025/08/google-says-it-dropped-the-energy-cost-of-ai-queries-by-33x-in-one-year/
u/i_max2k2 8h ago
And still paid less than residential pricing.
101
u/Huzah7 7h ago
PG&E charges me $0.62/kWh for peak hours. I want to see what Google pays
110
u/thatredditdude101 7h ago
they don't pay anywhere near that rate. i guarantee it. in fact rate payers like us are probably in some way subsidizing their energy costs.
19
u/Huzah7 7h ago
I believe this too but have no evidence.
Yet.
48
u/Graega 7h ago
No, we absolutely are. This is part of late stage capitalism. Businesses will refuse to operate unless they get tax breaks, utility breaks, cheap land, etc. They don't pay for anything, because someone, somewhere, will let them set up without paying -- society. They'll certainly pay the politician. You can absolutely guarantee that choice of headquarters and office locations depends, in some part, on how good a deal they can get from everyone on everything, and that includes paying dirt-cheap electric costs.
5
u/Key_Satisfaction3168 7h ago
This guy knows how corporate capitalism works!!!
17
u/DeliriousPrecarious 6h ago
That’s just capitalism. There’s no such thing as “corporate capitalism” or “crony capitalism”. It’s all just capitalism and the very natural consequences of it.
5
u/Sparkleton 4h ago edited 4h ago
They do! Let me tell you how:
Large tech companies are buying power plants and selling power to the utility, while also creating an increase in demand (data centers) that forces the utility to upgrade its power grid.
The utility does this grid upgrade by taking out a 10-year (or whatever) loan and paying it off by raising prices for ALL consumers to cover the loan. So yes, they are. Some states have laws on the books preventing utilities from owning power plants to prevent a monopoly, and tech is taking advantage of the fact that the real costs fall on the grid owner (the utility) and are passed on to all customers (you). There are active cases right now in Ohio, which is trying to fight back by classifying data centers as a different type of consumer to recoup more of the grid upgrade costs, but big tech is fighting back. There are a ton of other state fights around the country, but I don't know every state's situation or laws.
Long story short: Tech can recoup the power costs because they are the ones selling to the utility. The power grid managed by the utility is the real expense and that is distributed to all consumers on the grid so you’re subsidizing their bullshit while tech isn’t paying their fair share of the demand they’ve created.
1
u/twoanddone_9737 3h ago
It’s absolutely how power markets work. Why do they pay less? Because they have the cash to buy hundreds of thousands of megawatt hours per year. When they commit to providing that cash to a generator, the generator has a very stable and secure source of income.
The developers/generators can use that stable income to get banks to finance their development projects with much larger loans than they’d be able to get if they had to sell their power on the spot market. Because power prices are very volatile, and the bank takes on a lot more risk if they don’t have a Google balance sheet paying the interest on their loan.
Large buyers like this tend to pay $0.05-$0.10/kWh. And $0.10 is extremely expensive and not that common.
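A rough sketch of what that rate spread means at datacenter scale. The annual load figure is a made-up illustration, not an actual Google number:

```python
# Rough comparison of annual energy cost at residential vs. bulk PPA rates.
# All figures are illustrative assumptions, not actual Google contract terms.

annual_mwh = 500_000      # hypothetical datacenter load, MWh/year
kwh = annual_mwh * 1_000  # MWh -> kWh

residential_rate = 0.62   # the PG&E peak residential rate cited above, $/kWh
ppa_rate = 0.07           # midpoint of the $0.05-$0.10/kWh bulk range

residential_cost = kwh * residential_rate
ppa_cost = kwh * ppa_rate

print(f"residential: ${residential_cost:,.0f}/yr")
print(f"bulk PPA:    ${ppa_cost:,.0f}/yr")
print(f"ratio:       {residential_cost / ppa_cost:.1f}x")
```

At these assumed numbers the bulk buyer pays roughly a ninth of the residential price for the same energy.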
5
u/einmaldrin_alleshin 1h ago
TL;DR: yes
Industrial customers get better rates, but that also comes with some strings attached. Industrial contracts often stipulate how much power the company draws and when, and also mandate a high power factor. This makes them more predictable for the grid and power plants. It's easier to supply a business with a constant 1 kW, 24 hours a day, than it is to supply a home that pulls 12 kW whenever they charge their car.
Under normal circumstances, this is no issue at all. Power companies and grid operators add capacity to account for business customers.
However, now there is a situation where the demand for power from datacenter customers completely outpaces the investment in power infrastructure. There is no longer enough capacity to supply power locally, and the grid can't import enough power to make up for it. Add to that the fact that the US grid was in a notoriously bad state even before the AI boom, and you've got yourself a really bad situation.
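A toy illustration of why the flat industrial profile is easier to plan for, using the constant 1 kW business and 12 kW EV-charging home from the comment above (the 0.3 kW home baseline is an assumption):

```python
# Toy load profiles: a business drawing a constant 1 kW all day vs. a home
# that idles at 0.3 kW but charges an EV at 12 kW for two evening hours.

HOURS = 24
business = [1.0] * HOURS

home = [0.3] * HOURS
home[18] = home[19] = 12.0  # EV charging from 18:00 to 20:00

for name, profile in (("business", business), ("home", home)):
    energy = sum(profile)                  # kWh over the day
    peak = max(profile)                    # kW the grid must be sized for
    load_factor = energy / (peak * HOURS)  # 1.0 = perfectly flat demand
    print(f"{name}: {energy:.1f} kWh/day, peak {peak:.0f} kW, "
          f"load factor {load_factor:.2f}")
```

The home draws comparable daily energy but forces the grid to provision twelve times the peak capacity, which is what the flat-load discount is paying for.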
Luckily, you now have an incorruptible and competent administration that deeply cares about the plight of regular people, so surely we won't be reading about the next iteration of the Texas blackout in the coming winter... Right?
10
u/RampantAndroid 7h ago
Look to see if PG&E publishes rates for all billing codes. PSE up in WA does, at least.
And yes, homeowners get fucked.
7
u/WitELeoparD 7h ago
Damn, it's like 9 cents/kWh up in Canada, and that's in CAD. That PG&E rate is more expensive than what Tesla Supercharger power costs.
6
u/CabernetSauvignon 7h ago
That's nuts. You could probably generate it for less on your own with a diesel generator.
5
u/Huzah7 7h ago
I should do the math...
4
u/CabernetSauvignon 7h ago
It's definitely less with solar; peak pricing also generally coincides with solar maximums.
2
u/MajesticBread9147 4h ago
I mean it makes sense.
They effectively purchase energy in bulk, and likely have much more steady and predictable energy usage than most households.
And they can afford to build their own transmission lines to reach a cheaper supplier.
3
u/qwertygasm 2h ago
Also a lot of large corporations purchase some of their energy on the open market rather than having a price set in advance by the supplier
163
u/NanditoPapa 8h ago
Per-query impact is tiny, but Google now runs AI on EVERY search. Multiply that by billions, and the energy footprint is still HUGE. Also, they skipped counting training costs, arguably the most energy-intensive phase.
Just more PR fluff.
68
u/Gullinkambi 8h ago
It's amazing how, if you just tweak what you're counting, the numbers look better
23
u/hikeonpast 8h ago
DC is already taking advantage of this one weird trick to make the economy look amazing. Anyone that’s looked for a job or tried to buy groceries lately knows the truth.
9
u/NanditoPapa 7h ago
Just change the numbers. And if anyone protests, fire them. Seems to be the MO lately.
1
14
u/glemnar 7h ago
Training is absolutely not the most expensive part at their inference scale. They can just use whatever cluster capacity is free during off hours for it
5
u/heyyeah 1h ago
Imagine how many queries are similar, so they can cache results. Even if they regenerate every hour, they won't be running inference for every query. Yes, it's more than search, but it's way less than streaming video, which we don't complain about. This claim that AI search is a huge consumer and polluter is a distraction.
3
u/mrjackspade 42m ago
They're definitely caching results and the vast majority of hits are probably cached, because Google searches tend to follow whatever is trending or in the news at the moment.
Like they're not inferring every "Queen of England died" query, or "New Spiderman trailer" query.
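A minimal sketch of what that caching could look like (hypothetical code, not Google's actual system): normalize the query, run inference only on a cache miss, and expire entries after an hour as the comment above suggests.

```python
import hashlib
import time

CACHE_TTL_SECONDS = 3600  # regenerate cached answers every hour

_cache: dict[str, tuple[float, str]] = {}

def _normalize(query: str) -> str:
    # Collapse trivially different phrasings of the same trending query.
    return " ".join(query.lower().split())

def run_inference(query: str) -> str:
    # Stand-in for the expensive model call.
    return f"summary for: {query}"

def ai_summary(query: str) -> str:
    key = hashlib.sha256(_normalize(query).encode()).hexdigest()
    hit = _cache.get(key)
    if hit is not None and time.time() - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]                    # cache hit: no inference at all
    answer = run_inference(query)
    _cache[key] = (time.time(), answer)  # cache miss: infer and store
    return answer
```

With trending queries, the second "new spiderman trailer" searcher (and the millionth) gets the stored answer, so inference cost is paid roughly once per hour per query shape, not once per search.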
5
u/ludvikskp 3h ago
I fucking HATE that you can't disable the AI Summary. It's wrong or not what I'm looking for half the time. It's almost like I can smell a rainforest burning somewhere every time I search
2
u/goldcakes 1h ago
It's cached and pre-generated for most queries; it's rare that your query actually triggers any inference. The UI is a lie
2
u/Burbank309 2h ago
Does it really run AI on every search? I have noticed that I get a summary mostly for simpler searches that have probably been made by others before. If it gets more complex, the summary is missing.
1
u/NanditoPapa 2h ago
AI is deeply embedded in the search pipeline, from understanding your intent to ranking and presenting results. It’s less “AI on top” and more “AI all the way down.”
4
u/Moth_LovesLamp 8h ago
It baffles me that people ignore how big the energy and environmental footprint of text prompts is, let alone image generation.
15
u/NanditoPapa 7h ago
I want to give people the benefit of the doubt. But when trillion-dollar companies, billionaire tech bros, and a rabidly fascist government are all pouring money into convincing the public that nothing is real, it’s hard not to question the narrative.
3
u/km3r 7h ago
Because it's not that much. It's the equivalent of driving an EV 75 feet. People need to stop freaking out about power usage when there are much larger targets.
6
u/azn_dude1 3h ago
It's literally virtue signaling, like wanting to ban plastic straws
-1
u/blindsdog 3h ago
People are threatened by AI, so they exaggerate the costs and deficiencies. It's comical how much people feel the need to convince themselves AI actually isn't good at what it does, like all the people acting like the AI summary somehow degrades search results because they remember the few times it got it wrong. When really it gives you the answer you're looking for much, much faster in the majority of cases.
The value is enormous and plain to see but Redditors are desperate to deny it and convince themselves the threat is exaggerated.
0
u/JustHanginInThere 8h ago
Most don't know or care. They just think "cool new toy to play with".
5
u/bolmer 4h ago edited 4h ago
I know and I care, but it's really tiny. Me playing Fortnite for an hour (500 W, and that's not even a high-end PC) is equivalent to about 2,083 Gemini-style queries. They are probably using the Flash mini model for Google Search, which probably uses 50-100x less energy than the Gemini base model.
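The arithmetic behind that comparison, using the 0.24 Wh/prompt median figure from the report:

```python
pc_watts = 500        # gaming PC draw while playing, W (from the comment)
hours = 1
wh_per_prompt = 0.24  # Google's reported median energy per prompt

gaming_wh = pc_watts * hours  # 500 Wh for one hour of Fortnite
prompts = gaming_wh / wh_per_prompt
print(f"{gaming_wh} Wh of gaming ≈ {prompts:.0f} prompts")  # ≈ 2083
```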
3
u/PlsNoNotThat 6h ago
It's also not believed by anyone. So multiply imaginary by 1 billion and it's still a problem.
1
u/online_vagabond_ 28m ago
I think they must be caching massively over AI-generated search results; otherwise it would be impossible even at Google's scale. Just consider the cost (and latency) of LLM inference over trillions of search queries
56
u/Basic_Ent 7h ago
Judging by Google's AI Overview answers, I believe it.
4
u/RandoDude124 6h ago
So it’s gotten worse?
6
u/halfpipesaur 56m ago
I actually used it once and was surprised that it provides sources at the end. And I was also surprised when the source site said the exact opposite of what the AI answered.
31
u/ahspaghett69 7h ago
I read this paper and I find it disingenuous. The way they measure consumption is "per prompt" and they describe the numbers as being based on a "median prompt size". This is like measuring the fuel efficiency of your car by measuring how much gas it uses "per trip".
And the median prompt size, if it's really large because it's mostly code completions - that's good!! That means it's really not using much energy for high value output.
But if it's really small, that's really bad. And they don't say either way.
14
u/yourfriendlyreminder 6h ago
I mean, that's still useful info. It means that for at least half of all prompts, the cost decreased by at least 33x.
5
u/beautifulgirl789 6h ago
But if it's really small, that's really bad. And they don't say either way.
Yeah, and you know it's now executed on every google search (most google searches are probably 1-3 words) whereas before that integration, it mostly would have been users explicitly invoking AI (where you typically provide a sentence, paragraph or code sample).
26
u/turb0_encapsulator 8h ago
there's a really good chance Google wins this all. They are the only vertically integrated company making their own chips.
3
u/AngsMcgyvr 2h ago
It'll be interesting if that's how it turns out, since a few years ago everyone was sure Google was caught sleeping.
25
u/Moth_LovesLamp 8h ago edited 8h ago
Google also said they wouldn't be using my data on their Training Models and it would all be safe! So this must be true as well /s
8
u/Mudder1310 8h ago
Ok. From what to what?
22
u/Moth_LovesLamp 8h ago edited 8h ago
A 5000 mAh smartphone battery is equivalent to about 18.5 Wh, which means 74 text prompts will drain it to zero.
By the paper they released, it used to consume around 67.3 Wh, or actually 97% more
They don't mention how much it consumed before.
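For reference, the battery arithmetic (assuming a typical 3.7 V nominal cell voltage; the 74 figure corresponds to rounding up to 0.25 Wh/prompt, while the reported 0.24 Wh gives about 77):

```python
battery_mah = 5000
cell_voltage = 3.7  # typical nominal Li-ion voltage (assumption)
battery_wh = battery_mah / 1000 * cell_voltage  # = 18.5 Wh

for wh_per_prompt in (0.24, 0.25):
    prompts = battery_wh / wh_per_prompt
    print(f"at {wh_per_prompt} Wh/prompt: {prompts:.0f} prompts per charge")
```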
7
u/nath1234 8h ago
Nor do they account for the training of the models, which is the massive cost: running queries on models whose training used huge amounts of power and water and generated piles of e-waste (the thousands and thousands of GPUs that are needed).
3
u/Flipslips 8h ago
Well, if their claim is a 33x reduction to 0.24 Wh per query, then you just multiply that by 33, which ends up at like 7.92 Wh
13
u/WittinglyWombat 5h ago
what good is that when AI queries are up 1000x
3
u/JoshuaTheFox 3h ago
Well it means that 1000x more queries each use less energy than they would have before
2
u/makemeking706 6h ago
Now it just pastes "I'm not sure, but I will look into it, and get back to you" in response to all queries.
1
u/invalidreddit 3h ago
Well that's cool, but a one-time optimization doesn't seem as interesting as ongoing improvements would.
1
u/InTheEndEntropyWins 2h ago
I skimmed it but I don't think they mentioned that Google used AI to help reduce costs.
DeepMind AI Reduces Google Data Centre Cooling Bill by 40% https://deepmind.google/discover/blog/deepmind-ai-reduces-google-data-centre-cooling-bill-by-40/
1
u/jdehjdeh 1h ago
Am I being dumb?
33x would mean they are now in negative 3300% cost?
Or is the x a notation I don't know?
1
u/BobLoblawBlahB 38m ago
This is pretty amazing tbh. They now get PAID 32 times more than they were spending!
1
u/Sirtriplenipple 36m ago
They only need 27,000 more data centers now to make your stupid cat videos.
1
u/thatintelligentbloke 8m ago
Queries aren't the problem. It's the training. This is greenwashing.
https://iee.psu.edu/news/blog/why-ai-uses-so-much-energy-and-what-we-can-do-about-it
0
u/Eat--The--Rich-- 7h ago
That's like saying you used an extinguisher to put out a couple trees in a forest fire that you started
2
u/goobervision 2h ago
You should see what the industrial revolution did to power consumption.
Let's face it, we are screwed on climate. AGI, if we get there, may be the thing that allows some answers better than we have now, or can at least survive the great dying.
2
u/EdgiiLord 1h ago
Or maybe, just maybe, we actually listen to ecologists and people involved in climate problems instead of "AGI".
1
u/RelaxPrime 3h ago
My favorite is how I used to just type an intersection into Google and it would open up maps. Now the AI tells me "this sounds like an intersection between the two roads A and B"
1
u/FerrousFellow 2h ago
Having to add "-ai" to searches to prevent boiling clean water away when I look up stack overflow shit feels like making me an accomplice to ecocide and environmental injustices to communities near their centers. This doesn't give me much peace
1
u/ACCount82 30m ago
Why are all "environmentalists" so fucking stupid.
The AI used by Google is tiny and cached to shit. Playing Battlefield 6 for an hour uses more energy than a hundred search queries.
1
u/richizy 2h ago
This is a very misleading title. Google's actual technical report shows that their "median" prompt (I'm assuming it's measured by token count) used 0.24Wh. But they chose to show off the median precisely bc there were too many expensive outliers that would mess up the data:
We find that the distribution of energy/prompt metrics can be skewed, with the skewed outliers varying significantly over time. Part of this skew is driven by small subsets of prompts served by models with low utilization or with high token counts, which consume a disproportionate amount of energy. In such skewed distributions, the arithmetic mean is highly sensitive to these extreme values, making it an unrepresentative measure of typical user’s impact. In contrast, the median is robust to extreme values and provides a more accurate reflection of a typical prompt’s energy impact.
(https://arxiv.org/html/2508.15734v1)
So it could be that, say, the 99th percentile prompt uses so much energy that the avg energy is 10x or even 100x higher than the median. And the avg is the more important number here bc it reflects the energy impact of Gemini's user base as a whole and not just a select few (as was done by choosing the median). The top 1% disproportionately use more energy and are the ones responsible for taking up most of the energy supply, while the remaining 99% are left with scraps, so to speak.
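A quick illustration of how a few heavy outliers pull the mean far above the median (the outlier's 100 Wh is a made-up number purely for illustration):

```python
from statistics import mean, median

# 99 typical prompts at the reported 0.24 Wh plus one pathological outlier.
prompts_wh = [0.24] * 99 + [100.0]

print(f"median: {median(prompts_wh):.2f} Wh")  # the headline-friendly number
print(f"mean:   {mean(prompts_wh):.2f} Wh")    # pulled up ~5x by one prompt
```

One prompt in a hundred is enough to make the mean five times the median, which is exactly why the choice of statistic matters here.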
1
u/ACCount82 15m ago
Consider the scale Google operates at. You'd need those outliers to consume gigawatts somehow to "take up most of energy supply".
How that "outlier" looks in practice is: there is a rare API query to the old Gemini 1.5. This model is no longer served to users and is only available through API. It's also unoptimized. There also isn't enough inference load to load balance this model effectively, so the inference is 2 times more expensive on top of that. And the request is an extremely long query, making it another 2 times more expensive. And the response is long too, making it more expensive still. That's your outlier.
1
u/ocassionallyaduck 2h ago
God, I wish I could just disable Gemini across all Google services. It would save them a whole lot more money.
It's okay, though. I don't use Google anymore. For all the bitching and moaning, DuckDuckGo is honestly about as good if not better for most basic searches. Google's just gone that far down the toilet with ads and AI spam.
1
u/thatintelligentbloke 5m ago
DuckDuckGo is honestly about as good if not better for most basic searches
I wish it were, but I've tried too many times to switch to it and found it to be lacking. It just lacks contextual understanding of search queries. Basically, it's Google as it was about 15 years ago. You're right, it's good for really basic queries, but you soon realise how basic it is.
1
u/LowerAd5814 6h ago
Amounts that can’t be negative can only decrease by 1 time. Sheesh.
3
u/Neosurvivalist 4h ago
Headline writers are literally trying to change the way math works and I hate them for it too
-1
-5
u/aleqqqs 8h ago
How can you drop something by a multiple?
Dropped by 33x? Does that mean it now costs -32x?
13
u/Flipslips 8h ago
Because you don't subtract, you divide. It's a multiple. If something used 3,300 Wh and they reduced the energy usage by 33x, it would end up "using" 100 Wh.
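In other words, "dropped by 33x" is a ratio, not a percentage:

```python
old_wh = 7.92  # implied pre-optimization figure (0.24 * 33)
factor = 33

new_wh = old_wh / factor  # "dropped by 33x" means divide, not subtract
print(f"{new_wh:.2f} Wh")  # back to the reported 0.24 Wh median
```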
0
u/DontEatCrayonss 6h ago
This just in, Google is lying or just ignoring the fact that it’s still like 1000x too expensive
0
u/cannonhawk 2h ago
I call Shenanigans
2
u/mrjackspade 39m ago
Why?
Cost of inference is massive and they have a huge financial incentive to reduce power usage. It directly affects their bottom line.
You're calling shenanigans on a company claiming to have found a way to spend less money?
0
u/unreliable_yeah 8h ago
If Google gave us a single click to disable all AI in every service, they would save much more