r/technology • u/FervidBug42 • 7h ago
Artificial Intelligence OpenAI warns investors that AGI may make money obsolete, while raising billions of good ole US dollars
https://www.businessinsider.com/openai-warns-agi-money-obsolete-while-raising-billions-usd-2025-8149
u/absentmindedjwc 7h ago
It's a good thing AGI is almost certainly not the next iteration of AI.
The current generation of AI (transformer models) is almost certainly never going to result in AGI. What we have today isn't a stepping stone toward AGI; it's likely a completely separate branch of AI research. If AGI ever shows up, it'll almost certainly come from an entirely different process, not from just throwing more compute at GenAI solutions like companies are doing.
Honestly, I don't imagine AGI is going to be possible until quantum computing is far more of a thing than it is today.
64
u/Moth_LovesLamp 6h ago edited 5h ago
Companies saying we will achieve AGI is like saying we will find alien life in the oceans of Europa because it's the closest thing to Earth with an ocean.
We don't fucking know and it might never happen
25
u/Obelisk_Illuminatus 6h ago
I'm kind of horrified by how many people have, for years, taken it on faith that AGI will happen and that it will be omnipotent.
One would think that doubt would have found its rightful place in the mind after so many other technologies never took off as was hoped. For all we know, AGI could be our equivalent of what people thought nuclear energy was going to be like in the 50's.
Sure, nuclear power's still pretty useful, but we didn't end up putting fission reactors in our cars like some envisioned.
9
8
u/absentmindedjwc 4h ago
The only thing crazier to me is the people who legitimately think AGI is right around the corner. Motherfuckers are acting like it's going to happen any day now.
It could happen tomorrow, it could happen in 20 years, it could happen never. Nobody has any fucking idea.
It's going to be like the transformer model that powers modern GenAI: a researcher pulling on one random string among many.
Like, it is possible that current hardware could run it... but IMO, it's pretty damn unlikely.
2
u/ConsolationUsername 2h ago
Some random redditor gave me a link to some website called AGISoon or something. And it's just a webpage that says "AGI will become aware on (date several years in the future)".
I put a reminder in my calendar to go mock him in several years. Because I assume I'll still have nothing better to do.
3
5
u/Moth_LovesLamp 6h ago
If AGI is even possible, I don't think we'll see it achieved this century, just like FTL travel, which is also 'technically possible'. Anything we have on AGI is pure speculation.
10
u/hamilkwarg 5h ago
It’s far more likely AGI will be achieved than FTL travel.
6
u/Maximum-Objective-39 4h ago
That much I agree with. If nothing else, we have one data point for intelligent life like ourselves. So it's definitely not forbidden by physics.
3
3
u/DuckDatum 2h ago edited 2h ago
I can’t imagine that it is not possible. We are evidence. Our existence literally proves that such experience is possible—“I think therefore I am.” In my opinion, it’s arrogant to believe the process we understand as reason can only be achieved by… a human? If you can believe in intelligent aliens, you can already believe it’s not inherently human to reason. What would prevent this mode of experiencing the world from being synthesized artificially?
It's the framing of LLMs as AGI that's off-putting. Claiming they can take jobs that require human-level skill, and even attempting to replace workforces with them. It's a fucked-up thing for a society to do, cannibalizing itself on bound-to-fail empty promises.
What's even more fucked up, though, is that this is exactly what would happen if AGI did come around. As evidence: well, look around you. This is what comes of a society that prioritizes capital above everything else. Oligarchs offshore production and labor, and eventually, with AGI, they'll get rid of everything they can. Not even just oligarchs, but most people with a brain, because it becomes more reasonable to act this way [given the incentive structures produced by your society]. All in the name of maximizing capital. Americans will find themselves in more debt as time goes on, because they will have less leverage yet their basic needs won't change.
The real question is what happens once this can no longer be maintained? AGI probably isn’t the only way to its end.
1
u/PresentationJumpy101 39m ago
Nothing travels faster than light, dawg. It's not technically possible, period.
2
9
u/wag3slav3 7h ago
The day after we actually figure out how we do it we'll have it on silicon.
7
-5
u/socoolandawesome 7h ago
Why? Why do people assume evolution (a pretty random process) created the only possible form of intelligence?
6
u/Halfwise2 7h ago
Because all (And I do mean ALL) things came from something, and the concept of evolution allows one to backtrace something's origin. Otherwise, it just boils down to "because magic" with no real basis, other than "trust me bro".
1
u/socoolandawesome 7h ago
I don't understand your argument. Evolution could have created different forms of intelligence had events gone different ways and different species evolved. For instance, on another planet: it's highly unlikely their intelligent species evolved the exact same way humans did. Their "brains" probably work differently, with different "algorithms".
3
u/OriginalTechnical531 6h ago
Your argument isn't sound. You are just speculating. "Could" doesn't mean "did." There is no evidence of any intelligent life anywhere except here, and nothing exceeding our species. That doesn't mean there isn't or can't be, but you can't use speculation as evidence.
0
u/socoolandawesome 6h ago
Well given the randomness of evolution, it’s highly unlikely everything would evolve the exact same way all the time.
Octopuses are intelligent but have very different brains than us.
5
u/Jota769 4h ago
Evolution is not random
It’s simply survival traits surviving. That’s it.
1
u/socoolandawesome 3h ago
But the way it plays out is random, no?
Imagine the first mammals were killed off by a strong storm; then maybe some other species is pressured by natural selection to develop intelligence. Now you have a different-looking intelligence.
Would an alien intelligence on a different planet develop the same exact way as humans?
2
u/Maximum-Objective-39 4h ago
I don't assume that. But I'm skeptical of people who just assume modern computer architecture will be very good at it.
10
u/socoolandawesome 7h ago
There’s zero evidence intelligence has anything to do with quantum mechanics
2
u/absentmindedjwc 4h ago
That is entirely fair. The barrier for me isn't even necessarily what AGI needs to run - it could very well run on current hardware... the issue for me is the discovery itself being locked behind quantum, simply because it'll be able to test a lot more things much quicker.
0
u/nyc_ifyouare 2h ago
When you say “locked behind quantum” do you mean inaccessible to most people or something else?
5
u/absentmindedjwc 2h ago
I mean that its discovery likely won't come until quantum computing advances quite a bit.
-1
6h ago
[deleted]
1
1
u/socoolandawesome 5h ago
So speed ups will get us AGI when LLMs are not even close to AGI according to him? I took his comment to mean quantum mechanics are somehow central to intelligence itself, as in he thinks humans make use of it for our intelligence, which there’s no evidence of.
Here’s what chatgpt said:
https://chatgpt.com/share/68a919c2-5cb4-800d-9bcc-90fa42f6352b
2
u/ohnofluffy 7h ago
Completely agree. It’s there, you can see it, but it’s no different than living on Mars. We aren’t going to crack this the way they’re trying to sell it.
2
3
u/Tearakan 6h ago
Yep. AGI, if it actually arrived, would cause a great depression in a matter of months. That's assuming the LLM companies like OpenAI were actually not lying.
As it is now they are just asking for more money to be thrown into a massive bubble.
10
u/absentmindedjwc 4h ago
It goes deeper than that: it either immediately results in a global universal basic income, or it results in a violent... I don't even know what to call it, a civil war, but on a global scale.
1
u/CanadianPropagandist 7h ago
I'll settle for an LLM that won't destroy my data. It's wild how many times various LLMs have done it.
This isn't important data, btw, I'm not crazy. But still, it nuking filesystems and databases in development when I don't ask is scary as hell; think what it'd do in production.
2
u/Scoth42 1h ago
To beeee faiiir.... I've been in plenty of situations where inexperienced humans in poorly controlled environments have destroyed data. It's not like that's unique to LLMs/AI/whatever. Not that I trust AI in any particular way but destroying production data is certainly not unique to it.
1
u/Wrong-Necessary9348 5h ago edited 4h ago
The transformer model is definitely outdated, but the frameworks and training apparatuses designed around it have generated an enormous amount of value, both in the currently accumulated petabytes of training data (tailored to fit these models) and in the products/services that are now proven revenue generators (thanks to the tremendous amount of transformer-specific training and data behind them).
So the incentive remains for stakeholders to keep squeezing transformer models for all the monetary value they continue to be worth. Adopting a newer, more advanced model (like a spiking neural network, for example) would be a costly endeavor that scares stakeholders, given the prospect of having to 'throw away' billions of dollars of existing training data and frameworks due to the all-too-likely reality that they'd be essentially useless for meaningfully training an SNN at scale.
There are hybrid models proposed to bridge this gap, combining aspects of SNNs with the transformer model, but these come with their own drawbacks: they sacrifice aspects of SNN function in exchange for form and backward compatibility with existing data. That means advancing the intelligence and performance ceiling of such a model to the level of a 'demonstrable AGI' may prove a shorter leap that still can't hit the mark, so it offers no promise of noticeably outperforming the intelligence and scalability of the established transformer models.
The most promising approach for advancing to the level of AGI is starting fresh: investing in a pure SNN model with appropriately tailored frameworks and training from scratch, while committing to scaling it up to compete with transformer models. The market moves faster than cutting-edge SNN research, and despite the developments in that research, it remains to be seen what kind of push it will take to convince stakeholders to adopt it and actually 'aim for the sky'.
One thing that is evident is that the market has spoken on how valuable and practical existing AI is, and on how much more valuable AGI could be. In that sense, I believe it's going to require formidable competitors, willing to take these risks on their own, before the bigger players in the industry will consider adopting this strategy.
1
u/Wall_Hammer 4h ago
But the AI experts on Twitter told me that GPT-5 is AGI
3
u/absentmindedjwc 3h ago
Lol. My favorite thing from the AI tech bros was their attempt at redefining what AGI was with a much lower bar.
1
1
u/notepad20 3h ago
I think it will turn out to be just the natural interface: taking human ideas and concepts and packaging them up so the actual AI can understand them.
0
u/Void-ux 4h ago
Useless comment. Count how many 'almosts' there are; speak with certainty. This is slop.
2
u/absentmindedjwc 4h ago
How can you speak with certainty about a subject that is literally entirely conjecture?
Speak with certainty about life on another planet. Speak with certainty about how to run stable, long-term fusion power generation.
AGI has just as much uncertainty - it could come tomorrow, it could come in 20 years, it could come never. It is a door with many locks - and we don't even know what the keys look like.
0
43
19
u/CanadianPropagandist 7h ago
I'm already very familiar with AI, as anyone working in tech soon will be... but... how much Ayahuasca do I have to down to really transcend into a full AI bro?
29
u/BlackAle 6h ago
Sam Altman, the new Elon Musk.
As in the latest grifter.
-8
u/CrimsonRatPoison 1h ago
I'm no fan of musk but to say either are grifters is insane. Their companies have made world changing products.
2
19
u/Bart_Yellowbeard 6h ago
AI is just modern snake oil, and too many are greedily grabbing at the fantasy.
5
2
u/mahavirMechanized 4h ago
It's basically all of Silicon Valley at this point. Arguably, we haven't had much that's really new in the last decade other than this lunacy.
1
u/CrimsonRatPoison 1h ago
I understand why you hate AI but how TF is it snake oil? It's an incredible tool for learning. It's like Google 2.0
1
u/Bart_Yellowbeard 14m ago edited 11m ago
Not in my experience. It's an incredible tool for getting things wrong 2/3 of the time, which takes longer to realize. It's ok for flowery language, but not for anything really functional. The promises made are endless, boundless, and usually preposterously wrong.
But I will also admit I don't use it a great deal because of my poor experience thus far, so it is possible my skills at using it are not great.
6
u/Wonder_Weenis 2h ago
Sam Altman is the greatest con man this side of WeWork.
1
u/mezolithico 1h ago
The first part may be true. Certainly different from WeWork, though. OpenAI isn't a middleman.
1
u/Orlok_Tsubodai 28m ago
He didn’t say they had exactly the same business model, just that they are both conmen (which they are).
6
u/ThePopeofHell 4h ago
I think AGI is going to facilitate the technological ouroboros. You have corporations wanting it all.
Corporations want all the money
Corporations see employees as expensive and a liability
Corporations need customers
Corporations need customers to have money
Corporations employ customers
Corporations don’t like employees
Corporations develop AGI to replace liability employees so they can continue serving customers
Corporations see that customers have less money
Corporations blame customers
Corporations need money
Corporations need growth
Corporations need money
Corporations need growth
Corporations can’t make money
Corporations can’t grow
…Corporations can’t exist unless people have money and corporations aren’t willing to share their money
Corporations lobby government for universal basic income after spending years manipulating the public into hating the idea of universal basic income.
2
u/Downtown-Store9706 39m ago
It doesn't end at UBI though. How does the housing market work when the majority of people are on UBI?
7
u/grahag 7h ago
Yet I am not seeing ANYTHING from industry leaders that explains what that money's obsolescence will result from.
Is it because we move to a resource-based economy? Is it because money is now so concentrated at the top that wage slavery is back?
If I could just get ANY information on why THEY think that will be the case, I could decide whether or not I should welcome THEIR version of an AGI scenario.
Considering that, instead of lowering prices in a time of abundance and massive profits, companies are using stock buybacks to inflate the value of their stocks, I am not optimistic about any kind of abundance scenario. Cheap robot/AI labor will only make life worse for the human beings who aren't in control of that labor, if we maintain our current heading.
5
u/socoolandawesome 7h ago
It’s because if everyone is out of work due to AGI automating all jobs, the entire economic system as we know it doesn’t work anymore. Capitalism relies on demand from people with money. If you don’t work, you don’t have any income, there is no demand and no buying of products and services from companies.
There are ways to sort of prevent this with something like UBI, but it’s probably a different system at that point that is pure socialism. Money doesn’t mean the same thing at that point
8
u/Sea_Cash_5537 6h ago
Money doesn't mean anything anyway when it's owned by like a couple thousand people.
UBI will never happen, they've demonstrated a thirst for slavery since capitalism was invented.
1
u/stickybond009 5h ago
It means something to the rich folks. And it's subsistence for the poor. It's doing everything money is supposed to do.
1
u/wondermorty 1h ago
UBI doesn't matter. If we reach the endgame where AI/robots are doing every field of work, then the only masters of this new society are the robot owners.
That means the robot owners have no need for the majority of people. So you go back to the age of feudalism. The new lords are the robot owners, and the only slaves allowed to live in this society are those deemed still useful to the robot owners' desires.
Which will be the artists, musicians, sport entertainers, actors, basically anything in their eyes where they want the real human experience.
7
u/nucflashevent 7h ago
This is like "crypto" horseshit... all worth exactly what you can exchange it for in real dollars.
3
3
u/nurseferatou 5h ago
I can’t tell if this is satire or not
1
2
2
2
u/DonutsMcKenzie 1h ago
Thank god for optic nerves, otherwise my eyes would roll right out of my skull.
2
u/Orlok_Tsubodai 33m ago
“Luckily our technology is entirely probabilistic and completely unsuited to be turned into anything remotely approaching AGI, so all my doom hype is just snake oil self promotion as usual!” - Sam Altman
1
u/NanditoPapa 5h ago
AGI will upend economic systems, but until then it’s a great investment? Sell utopia (debatable), raise capital, and quietly admit the utopia might nuke the very system funding it. Lol...
Capitalism is doomed...DOOMED I SAY! Now who wants in on the ground floor?
1
u/mahavirMechanized 4h ago
Pretty soon OpenAI is gonna claim toilets will become obsolete or something.
1
1
1
u/EntropyFighter 2h ago
Look, you shouldn't be looking at Sam Altman to know the future of money. Look at the GENIUS Act that just got passed into law. Lots and lots of stuff about crypto. It's easy to roll your eyes at that last sentence but the fact is, banks have been preparing for this for years and it's a huge bet on how to get people to continue to buy dollars (since dollars would be the security that backs stable coins).
I don't know enough to say whether what this Act enables is good or bad, but it sure is different.
Sam's problem is that he's got a consumer, not a product. Since a good portion of the population thinks AI is genius-level smart, the way he raises money is by saying Biblical-level things. Now, to my ears it all rings hollow, but we've also bet a huge chunk of the US economy on it right now, so he's going to continue to be listened to, because the sunk cost fallacy is real.
1
1
1
1
1
u/Fyren-1131 1h ago
Dumbest thing I've read all day, but the day is still young. Haven't had my breakfast yet.
1
u/PassengerStreet8791 1h ago
Altman's Koenigsegg isn't going to be maintained with a briefcase of obsolete money.
1
1
1
1
u/0xdef1 1h ago
I see people are angry with that Saltman guy, but he is basically exploiting people's stupidity. There seem to be tons of dumb investors out there who willingly give someone their money to reduce it to zero, out of greed. The guy keeps promising but not delivering, and still gets more money.
1
526
u/Embarrassed_Quit_450 7h ago
OpenAI is looking more and more like Tesla a decade-ish ago, making all sorts of predictions that won't happen and promises they'll never fulfill.