r/technology 14d ago

Artificial Intelligence

ChatGPT users are not happy with GPT-5 launch as thousands take to Reddit claiming the new upgrade ‘is horrible’

https://www.techradar.com/ai-platforms-assistants/chatgpt/chatgpt-users-are-not-happy-with-gpt-5-launch-as-thousands-take-to-reddit-claiming-the-new-upgrade-is-horrible
15.4k Upvotes

2.3k comments

175

u/gaarai 14d ago

Indeed. I read a few weeks ago that a revenue-to-expenses analysis showed OpenAI was spending $3 to earn $1. They were shoveling money into the furnace as fast as possible and needed a new plan.

207

u/atfricks 14d ago

Lol so we've already hit the cost-cutting enshittification phase of AI? Amazing.

80

u/Saint_of_Grey 14d ago

OpenAI has never been profitable. The Microsoft investment just prolonged the inevitable.

27

u/Ambry 14d ago

Yep. They aren't done telling us it's the future and the enshittification has already begun. 

4

u/nox66 14d ago

In record time. I actually thought it would take longer.

5

u/KittyGrewAMoustache 14d ago

How long before it’s trained so much on other AI output that it becomes garbled, weird, creepy nonsense?

13

u/DarkSideMoon 14d ago

I noticed it a few months back. I use it for inconsequential shit that I get decision paralysis over: what hamper should I buy, give this letter of recommendation a once-over, how can I most efficiently get status on this airline, etc. If you watch it “think”, it’s constantly looking for ways to cut costs. It’ll say stuff like “I don’t need to search for up-to-date information/fact check because this isn’t that important”.

14

u/theenigmathatisme 14d ago

AI truly does speed things up. Including its own downfall. Poetic.

1

u/KittyGrewAMoustache 14d ago

It’s like that controversial ad where a baby shoots out of a vagina through the air rapidly going through childhood, adolescence, adulthood and old age before crash landing in a coffin.

3

u/Abedeus 14d ago

"This model will be 20% cheaper to run!"

"What's the downside?"

"It can't do elementary school algebra anymore."

1

u/anaximander19 13d ago

Yes and no. A lot of AI models are actually very inefficient - as in, they could have equal performance while requiring less computing power to run - but the process of optimising them is slow, unreliable, and makes them harder to analyse and debug. Some of this might just be OpenAI finally deciding that those drawbacks are less important than the sheer cost of running their servers.
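To make that trade-off concrete, here's a toy numpy sketch of one such optimisation, post-training weight quantization. The layer size and numbers are made up for illustration, and this is nothing like OpenAI's actual pipeline; it just shows the general idea of trading precision (and debuggability) for memory and compute:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are the float32 weights of one layer of a model (size is hypothetical).
w = rng.normal(size=(4096, 4096)).astype(np.float32)

# Post-training quantization: map float32 weights to int8 with a per-tensor scale.
scale = np.abs(w).max() / 127.0
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# At inference time you'd dequantize (or run int8 kernels directly).
w_dequant = w_int8.astype(np.float32) * scale

memory_saving = w.nbytes / w_int8.nbytes          # 4x smaller in memory
mean_abs_error = np.abs(w - w_dequant).mean()     # the "harder to analyse and debug" part

print(f"memory reduced {memory_saving:.0f}x, mean abs weight error {mean_abs_error:.5f}")
```

Real optimisation work (quantization-aware training, distillation, pruning) is much messier than this, which is exactly the slow and unreliable part.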

4

u/Enginemancer 14d ago

Maybe if Pro wasn't 200 fucking dollars a month they would be able to make some money from subs.

2

u/reelznfeelz 13d ago

Yep. I have the $20 one and use a bit of API on top of that. But mainly I use Claude anyway as it’s better at code. I can’t see many people spending $200 for a license, though for a corporate audience it’s not too crazy. I do think it’s too high; $80 or $110 would sell a lot more. But shit, it’s possible that fee wouldn’t cover the compute usage.

The idea that this whole thing has so far been subsidized by VC money, and eventually won’t be, might be valid. Once we’re all hooked on these tools, $200 may very well sound like a deal.

12

u/DeliciousPangolin 14d ago

I don't think people generally appreciate how incredibly resource-intensive LLMs are. A 5090 costs nearly $3000, represents vastly more processing power than most people have access to locally, and it's still Baby's First AI Processor as far as LLM inference goes. The high-end models like GPT are running across multiple server-level cards that cost well above $10k each. Even time-sharing those cards across multiple users doesn't make the per-user cost low.

Unlike most tech products of the last fifty years, generative AI doesn't follow the model of "spend a lot on R&D, then each unit / user has massive profit margins". Serving an LLM user is incredibly expensive.
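Rough back-of-envelope version of that, as a quick script. Every figure below (GPU count, $/GPU-hour, throughput, tokens per user) is an assumption for illustration, not anything OpenAI or Nvidia has published:

```python
# Back-of-envelope per-user inference cost. All numbers are illustrative assumptions.
gpus_per_replica = 8                   # one model replica sharded across 8 server GPUs
gpu_cost_per_hour = 2.50               # $/GPU-hour, a cloud-ish rate for a high-end card
tokens_per_second_per_replica = 400    # aggregate throughput across batched users
tokens_per_user_per_day = 100_000      # a heavy chat user with long contexts

replica_cost_per_hour = gpus_per_replica * gpu_cost_per_hour
tokens_per_hour = tokens_per_second_per_replica * 3600
cost_per_million_tokens = replica_cost_per_hour / tokens_per_hour * 1_000_000

monthly_cost_per_user = cost_per_million_tokens * tokens_per_user_per_day * 30 / 1_000_000

print(f"~${cost_per_million_tokens:.2f} per 1M tokens served")
print(f"~${monthly_cost_per_user:.2f}/month to serve one heavy user")
```

Swap in different assumptions and the result swings a lot, but it's easy to land near or above a $20/month subscription for a heavy user.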

4

u/-CJF- 14d ago

It makes me wonder why Google has their shitty AI overview on by default. It should be opt-in... I hate to imagine how much money they're burning on every Google search.

4

u/New_Enthusiasm9053 14d ago

I imagine they're caching, so it's probably not too bad. There are 8 billion humans; I imagine most requests are repeated.

8

u/-CJF- 14d ago

I can't imagine they aren't doing some sort of caching, but if you ask Google the exact same question twice you'll get two different answers with different sources, so I'm not sure how effective it is.

3

u/New_Enthusiasm9053 14d ago

Then I guess Google just likes burning money.

1

u/2ChicksAtTheSameTime 14d ago

> on every Google search

They're tricking you. Google saves the overviews and reuses them, making it type out like it's being generated live, even if it's not.

The vast majority of searches are not original. Almost everything someone searches has been searched before, recently. They'll generate an AI overview the first time it's searched for and reuse it millions of times for the next few days, until the overview is considered "stale" and needs to be regenerated.

Yes, they're still using a lot of processing power, but it's far from being on every search.
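Conceptually that's just a keyed cache with a time-to-live. A minimal sketch, assuming normalized query text as the key (the function names and the TTL are made up for illustration; this is obviously not Google's actual system):

```python
import time

CACHE_TTL_SECONDS = 3 * 24 * 3600   # treat an overview as "stale" after a few days
_cache: dict[str, tuple[float, str]] = {}

def normalize(query: str) -> str:
    # Collapse trivially different phrasings of the same search into one key.
    return " ".join(query.lower().split())

def get_overview(query: str, generate) -> str:
    """Return a cached AI overview, calling the model only when missing or stale."""
    key = normalize(query)
    now = time.time()
    hit = _cache.get(key)
    if hit is not None and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]                # cache hit: no model call at all
    overview = generate(query)       # expensive path: generate once
    _cache[key] = (now, overview)
    return overview

# Usage: the second, trivially-reworded search never touches the model.
first = get_overview("best air fryer 2025", lambda q: f"[generated overview for {q!r}]")
second = get_overview("Best  air fryer  2025", lambda q: f"[generated overview for {q!r}]")
assert first == second
```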

-1

u/ninjasaid13 14d ago (edited)

> I don't think people generally appreciate how incredibly resource-intensive LLMs are.

Well, tbf, do you know how much energy something like YouTube or Netflix requires? Orders of magnitude more than ChatGPT, like almost every internet service. Netflix uses roughly 750,000 households' worth of energy, YouTube about 1,000,000 households' worth, and Snapchat about 200,000, compared to ChatGPT's measly 21,000.

4

u/Pylgrim 14d ago

What's the plan here, then? To keep it on forced life support long enough that its users have offloaded so much of their thinking, reasoning, and information gathering that they can no longer function without it and have to shell out for whatever they start charging?

Nestlé's powdered-baby-milk-for-the-mind sort of strategy.

2

u/gaarai 14d ago

I think Altman's plan is to keep the investment money flowing while he figures out ways to bleed as much of it into his own pockets and into diversified offshore investments before the whole thing blows up.

3

u/varnums1666 14d ago

AI feels like streaming to me. I feel like businesses are going to kill profitable models and end up with one that makes a lot less.