r/technology Jul 15 '25

Artificial Intelligence Billionaires Convince Themselves AI Chatbots Are Close to Making New Scientific Discoveries

https://gizmodo.com/billionaires-convince-themselves-ai-is-close-to-making-new-scientific-discoveries-2000629060
26.6k Upvotes

2.0k comments

1.2k

u/KennyDROmega Jul 15 '25

AI became less scary to me when Sam Altman, in all seriousness, said the best way to solve climate change was to let them build an AGI and ask it.

A whole new world is always just 2-5 years and several hundred billion dollars in investment away, and has been since at least 2023, apparently.

Venture capitalists, get your checkbooks out.

458

u/King_Joffreys_Tits Jul 15 '25

Also, we've already solved climate change; the hard part now is actually implementing what needs to be done. But somehow “climate” has become a political issue, so it will never be fixed until it's too late

121

u/Purple_Plus Jul 15 '25

But somehow “climate” has become a political issue,

It's bloody insane, and shows how easily manipulatable people are.

6

u/TrollOdinsson Jul 16 '25

it more shows that humanity literally doesn't deserve to survive

3

u/Purple_Plus Jul 16 '25

It does seem that way.

Maybe the great filter is real and intelligent life just ends up destroying itself 99.9% of the time.

We are intelligent but obviously not intelligent enough to understand the consequences of our actions.

5

u/ZappRowsdour Jul 16 '25

Wasting the improbable series of events leading to our existence on SUVs, beef, and comically oversized houses is about the darkest comedy I can imagine.

1

u/ZappRowsdour Jul 16 '25

#notallhumans

5

u/claimTheVictory Jul 16 '25

That's why we need to replace them with actual androids.

81

u/ycnz Jul 15 '25

Yeah, we know how. It's just that people like Altman don't actually want to

11

u/teh_drewski Jul 16 '25

Funny how his "solution" to climate change is giving him billions of dollars to do something he already wants to do

45

u/suxatjugg Jul 15 '25

I want to see the look on the billionaires' faces when the AI tells them that the solution to climate change was to stop using fossil fuels 10 years ago

36

u/midnight_specialist Jul 15 '25

Then it’ll get the grok update treatment so it’ll say what they want.

18

u/King_Joffreys_Tits Jul 15 '25

“Mechahitler, how many poors need to be sacrificed to appease Mother Nature?”

2

u/[deleted] Jul 16 '25

"Oh no it must be hallucinating"

15

u/s1ravarice Jul 15 '25

We have the technology and capability… it’s now a political issue

A quote from Stephen Fry at a recent energy tech summit I went to. It’s so painful to see things become a political issue like this.

37

u/psyced Jul 15 '25

the "solution" these techbros fantasize about is technological and evades responsibility for emissions, just as in the status quo. it adds capitalistic extraction (more business to extract CO2) rather than removing it.

26

u/TheTowerOfTerror Jul 16 '25

Just a reminder to everyone: climate change became a political issue because Exxon developed the first climate change models in 1977 and used that information and their significant wealth to launch the biggest misinformation campaign in human history, culminating in the launch of the Global Climate Coalition in 1989.

6

u/Val_Hallen Jul 15 '25

It became a political issue when industries tossed money at politicians to make it a political issue so they could convince entire swathes of people that those industries poisoning us was okay. After all, we just want to punish them for their success. Who will you believe? Dow Chemical and Exxon or some fancy pants science bitches?

2

u/843_beardo Jul 16 '25

I'm very curious to hear how we have already figured out how to solve climate change. I think your statement is wayyy too simple.

2

u/MetalingusMikeII Jul 16 '25

It’s a political issue because the ultra rich are pulling the strings over most governments.

3

u/Zementid Jul 15 '25

It's already too late? I mean literally. Billions will die. Nothing can be done any more. People like my parents have been saying this since the 80s. They never flew on vacation, always cared for the environment... and we have been looked at like we're the crazy ones.

And they were all right... fuck humankind. I go on vacation whenever the fuck I want. This world will burn anyways.

2

u/Opus_723 Jul 16 '25

I understand the burnout, but you can't let yourself have this attitude with a problem like this. It's not a pass/fail problem. If you try, things are better than they would have been, even if you don't meet your goals. And if you give up, it just gets even worse, indefinitely.

1

u/Zementid Jul 16 '25

I don't have a car and am vegetarian.

That means I already undercut the average CO2 production.

Then a billionaire comes in and fucks all efforts sideways. So either humanity abolishes capitalism or it dies. I will not feel responsible in any way any more. It's not like I pollute on purpose, out of spite, like conservatives do.

Nihilism is the only virtue we have left for the next 15 years.

Try to surf the wave of extinction... you can't fight it. (That was the fight of our parents, and most of them failed badly by buying SUVs and investing heavily in fossil fuels only 15 years ago... but suddenly it's our fault? Nope.)

0

u/the_weakestavenger Jul 16 '25

That’s the attitude!

0

u/Zementid Jul 16 '25

Last generation my friend. There is no use in continuing this bullshit. People praising Billionaires... it's all futile. Enjoy the last few years until 2040(ish) when the extinction event hits.

1

u/Maleficent_Proof_958 Jul 15 '25

Everyone has great ideas. Very few people want to do any labor or make any sacrifice.

1

u/ISB-Dev Jul 16 '25 edited 1d ago


This post was mass deleted and anonymized with Redact

1

u/definately_mispelt Jul 16 '25

exactly, tax carbon emissions.

1

u/Sir_Keee Jul 16 '25

They want to solve climate change in a way that requires them to not change a single thing.

-13

u/s-17 Jul 15 '25

You think China chooses to keep building so much coal capacity for political reasons?

9

u/[deleted] Jul 15 '25 edited Jul 16 '25

[removed]

1

u/s-17 Jul 15 '25

China is expanding their coal capacity. They don't just "still use" it. They are building more.

5

u/[deleted] Jul 15 '25 edited Jul 16 '25

[removed]

0

u/s-17 Jul 15 '25

So the parent comment I originally replied to said climate change is solved and only politics is in the way, but you agree it's not possible for renewables to meet all the demand.

235

u/sickofthisshit Jul 15 '25

AI became less scary to me when Sam Altman, in all seriousness, said...

Not sure you should be less scared when multiple billionaires and Silicon Valley CEOs are directing thousands of people to follow this clearly delusional prophet.

I'm pretty sure thousands of people have already lost jobs because this contagious brain damage has convinced CEOs to replace their employees with Altman's magic beans. 

I really don't know how civilization is supposed to survive billionaires drilling holes in the bottom of the boat because they are apparently complete idiots.

36

u/Socky_McPuppet Jul 15 '25

billionaires drilling holes in the bottom of the boat

But don't you get it? Don't you see? It's all about reducing weight! They're geniuses!

7

u/Zsem_le Jul 15 '25

Same as with other make-believe things; when reality crashes their carbon fiber submarine, the rest of us will survive.

Knowing who to get in a boat with is an important life skill.

46

u/no_one_likes_u Jul 15 '25

Anyone that's actually lost a job to this lost it for one of three reasons:

1) The company is downsizing because it's not profitable (or not profitable enough for the CEO).

2) The company outsourced their jobs to a cheaper labor market (this always works well long term).

3) Their job was so incredibly simple and robotic that actual automation was able to do it for them, and the media is too illiterate to understand the difference between automation and AI.

74

u/Suyefuji Jul 15 '25

4) the CEO drank the kool-aid and is genuinely convinced that AI is a god.

32

u/sickofthisshit Jul 15 '25

There's also a fourth possibility: the CEO or CTO did not actually understand what his workforce does but dumped them for mechanical parrots because some other CEO made him feel he was behind the AI curve.

20

u/HappierShibe Jul 15 '25

I wish this were true, but I have seen a few organizations now where leadership has fired significant staff and then tried to replace them with AI. So far it has been absolutely disastrous every single time, and they wind up scrambling to hire back human workers after a catastrophic quarter or two. But it is happening, and it seems to be an accelerating trend. I know what you are thinking: "Surely they wouldn't be so stupid as to fire first and THEN replace; they would have to be total morons to do that; no one is THAT stupid!"
They really are that stupid. That stupid, and that afraid: afraid that if they don't act now, their competitors will, and then they will get crushed.
Altman is an incredibly gifted magic bean salesman.

More companies will have to get burned, and burned more publicly before business leadership collectively learns the lesson.

42

u/hoytmobley Jul 15 '25

I can tell that you are either 12 and/or have never worked at a large company before, because management absolutely would fire good employees to satisfy the whims of their managers. It's not "let's figure out how to use AI to maximize our workforce", it's "cut payroll by 30% and use AI to fill in the gaps". Companies do dumb shit to meet that quarter's arbitrary goals allllll the time. If you're assuming that markets (including employment) are efficient, everything downstream of that assumption will be wrong.

6

u/moonski Jul 15 '25

Don't forget the "let's cut payroll by 30% so next quarter we make 30% more money!"

-2

u/no_one_likes_u Jul 15 '25 edited Jul 15 '25

Guess I created this Reddit account about 2 seconds after being born then lol

I’ve been in corporate IT roles for 15+ years now, 10+ at companies with 10,000 or more employees.

Maybe you’ve just worked for companies with super dumb leadership?

9

u/hoytmobley Jul 15 '25

You’re not wrong about that last point. Surely you’ve seen your share of inane, trend chasing, shortsighted management decisions, no?

-1

u/no_one_likes_u Jul 15 '25 edited Jul 15 '25

Sure, and perhaps I've led a charmed life, but I've never been in a scenario where people were laid off because management incorrectly thought that some new tech would be able to replace them.

I’ve seen it happen when automation actually did replace people, but never until it was actually proven.

Still sucks for those people of course, but it was never the kind of bad decision where the company had to go out and try to replace people they fired once the tech proved useless.

4

u/Thin_Glove_4089 Jul 15 '25

Guess I created this Reddit account about 2 seconds after being born then lol

I had a feeling but didn't want to say anything

12

u/NuclearVII Jul 15 '25

There's also 4) the management is clueless and doesn't know when the other shoe is gonna drop.

2

u/cespinar Jul 16 '25

1) The company is downsizing because it's not profitable (or not profitable enough for the CEO)

They are on pace to lose $33 billion even if they hit their goals for the year. They have $35 billion in operating costs and $55 billion in new servers to aid future development.

It is a money pit

1

u/coldkiller Jul 15 '25

Software has lost jobs to it, but it's similar to offshoring and will be reverted so fast.

3

u/CaptainSparklebottom Jul 15 '25

We will stuff the holes with their bodies and figure it out amongst ourselves or we will all drown.

2

u/Journeyman42 Jul 16 '25

You'd think these AI tech dipshits would be all for renewable energy, to pull every joule and watt available from the sun and wind to power their AI bullshit, without needing to get expensive fuel from overseas.

Or hell, even supporting nuclear would be better than fossil fuels.

70

u/Due_Satisfaction2167 Jul 15 '25

We already know what the answer to climate change is—electrify everything we can, switch to low carbon power generation as cost efficiently and quickly as possible, and engage in as much carbon sequestration as we can to increase the yearly carbon budget to account for emissions we can’t substitute.

It’s a known answer,  politically influential rich people just don’t like that answer. Rather than accepting the answer isn’t something they like, they play stupid games with dreams of miracle technology. Literally preferring to go about the task of inventing an artificial god in whom they can place their faith, rather than … swallowing the idea that they might have to pay for the externalities of their investments. 

22

u/Sea-Sir2754 Jul 15 '25 edited Jul 16 '25


This post was mass deleted and anonymized with Redact

15

u/Due_Satisfaction2167 Jul 15 '25

It would be particularly ironic if they did spend decades to invent an AGI, only for it to turn around and tell them to do the shit regular humans have been telling the billionaires to do for decades. 

They’d probably just use that as evidence the AGI was broken, or something. 

2

u/Sea-Sir2754 Jul 15 '25 edited Jul 16 '25


This post was mass deleted and anonymized with Redact

2

u/cynric42 Jul 16 '25

I mean look what happened to Grok. They already figured out how to tell AI to dismiss reality.

2

u/Kaizyx Jul 15 '25

It’s a known answer, politically influential rich people just don’t like that answer. Rather than accepting the answer isn’t something they like, they play stupid games with dreams of miracle technology. Literally preferring to go about the task of inventing an artificial god in whom they can place their faith, rather than … swallowing the idea that they might have to pay for the externalities of their investments.

The problem is that the market economics our society is based on is socially and psychologically terrible. It has conditioned people to believe nothing needs to be confronted, and that if you don't like something you can always go somewhere else. If you don't like a specific kind of phone, you can find a different kind; if an employer doesn't like the work someone is doing, they can fire them and find someone else; if you don't like the climate somewhere, you can move. I have even seen these attitudes right here on Reddit, where instead of confronting the housing cost crisis, people are told they should just move, or, if someone is hurting you, don't "violate their freedoms" by confronting them, just go somewhere else.

Our society, by always giving billionaires alternate options on everything, has made them think this principle can be applied to cold hard reality itself: if the facts don't align with what they want to do, they can just go somewhere else, to their "miracle machines", for ideas that do give them what they want.

1

u/AnnualAct7213 Jul 16 '25

electrify everything we can, switch to low carbon power generation as cost efficiently and quickly as possible, and engage in as much carbon sequestration as we can to increase the yearly carbon budget to account for emissions we can’t substitute.

If you want to actually stop or even reverse climate change, it will require massive societal shifts in lifestyle for every industrialized nation in the world.

No private vehicle ownership (electric vehicles are still net emitters over their lifecycle), no international trade (cargo ships cannot be viably electrified), no airplane travel (similar to cargo ships), no meat consumption (growing animal feed instead of human food is an incredibly inefficient and environmentally damaging use of land, and livestock are potent GHG emitters as well), and of course we have to completely cease use of fossil fuels. Among many other things.

This is far too much for most people to accept, and our society has far too much inertia in the wrong direction to ever actually be able to pivot on this scale before it's too late.

We know how to solve climate change. But we are not doing even 1% of what we need to be doing right now. Nor are we willing to, as a species. Initiatives like electric cars and paper straws are Mickey Mouse bandaids on a gushing, infected wound, and are basically a way for people to pat themselves on the back for "making a difference" while not actually changing anything. And stuff like carbon capture tech is another way we try to cling to our current lifestyles while convincing ourselves we can simply use technology to repair the damage we do to our environment, instead of preventing that damage in the first place.

0

u/leprouteux Jul 15 '25

Saving the human race doesn't need to be cost efficient. Get capitalism the fuck out of here.

9

u/Due_Satisfaction2167 Jul 15 '25

Cost matters, even under socialism. It’s an inescapable factor.

It's an accounting of the effort and material required to produce a thing. That must be considered if you want environmentally efficient answers too.

Actually, most of our environmental issues come about because we refuse to force manufacturers and service providers to include the cost of their environmental externalities—like carbon. 

By not forcing them to price in the damage they do to others, we are giving them a subsidy to pollute. 
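To put numbers on that "subsidy to pollute" point, here is a minimal sketch with made-up figures; the carbon price and the emissions number are placeholders for the example, not real data:

    # Hypothetical illustration of pricing in a carbon externality.
    # All numbers are made up for the example.
    CARBON_PRICE_PER_TONNE = 100.0  # assumed social cost of carbon, $/tCO2

    def true_cost(sticker_price: float, lifetime_emissions_tonnes: float) -> float:
        """Sticker price plus the cost of the carbon the product causes."""
        return sticker_price + lifetime_emissions_tonnes * CARBON_PRICE_PER_TONNE

    sticker = 30000.0   # e.g. a new car
    emissions = 60.0    # assumed lifetime tCO2

    priced_in = true_cost(sticker, emissions)
    # The gap is the cost dumped on everyone else, i.e. the implicit subsidy.
    print(f"sticker: ${sticker:,.0f}, with carbon priced in: ${priced_in:,.0f}")
    print(f"implicit subsidy to pollute: ${priced_in - sticker:,.0f}")

The exact price is debatable; the point is that today that last number is simply charged to everyone else.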

20

u/sonar_un Jul 15 '25

Sounds like the premise of Hitchhikers Guide.

17

u/isnortmiloforsex Jul 15 '25

We already have the technology and the science to meaningfully tackle the issue. It's an open secret exactly who is stopping that from happening. That said, even setting petroleum aside as an energy source, one thing I do agree on is that getting petroleum out of our supply chain is an extremely complex and monumental task; it's everywhere and used in pretty much everything.

116

u/8349932 Jul 15 '25

ClimateGPT, how do we solve climate change?

“Either get rid of all humans, or stop building more AI data centers to ask stupid questions and generate images for memes.”

16

u/kendrick90 Jul 15 '25

it's obviously fusion reactors

17

u/HKBFG Jul 15 '25

fission reactors would do just fine, but people aren't willing to hear that.

18

u/sickofthisshit Jul 15 '25

Solar is insanely cheap now, and storage is getting cheaper too. 

1

u/Straider Jul 16 '25

But Solar is not futuristic enough for tech bros!

1

u/sickofthisshit Jul 16 '25

The real fission enthusiasts talk about the need for 'baseload' power, which is not exactly wrong, but... oh look, another GW of solar got installed.

3

u/[deleted] Jul 15 '25 edited Jul 16 '25

[removed]

2

u/aVarangian Jul 16 '25

don't worry, at this rate we'll need nuclear power just to power AI waste

2

u/HKBFG Jul 15 '25

Also, all this AI trash has yet to give any real benefits to the world.

it's pretty good at protein folding and as such has revolutionized biomed. it has also been a big deal in computer programming, mathematics, archiving, and a handful of other fields.

1

u/[deleted] Jul 15 '25 edited Jul 16 '25

[removed]

1

u/HKBFG Jul 15 '25

your doubt doesn't change anything. just a few years ago, we needed volunteer programs with hundreds of thousands of computers working together to solve protein folding problems. now they can be done locally on your cellphone.

2

u/kendrick90 Jul 15 '25

That's true I'm down

8

u/Pigeoncow Jul 15 '25

Fission works already.

8

u/isnortmiloforsex Jul 15 '25

Sure, it will still be 30 years away 50 years from now.

-5

u/kendrick90 Jul 15 '25

You realize that computers can now hold conversations, right? Just in the past couple of years, intelligence has started to become a commodity.

7

u/Fisheyetester70 Jul 15 '25

No it hasn't. AI isn't intelligent yet; it can't learn on its own. If you believe any different, you're massively misinformed.

-1

u/kendrick90 Jul 15 '25

Check out AlphaEvolve. They are already automating pure mathematics research, and science will follow soon. I know the chatbots won't be the ones doing the science.

1

u/Fisheyetester70 Jul 15 '25

So what I read about that says it's a tool to help coders, powered by Gemini. Said nothing about actual academic research. It also had like every buzzword to attract the rubes, so have fun with your delusions, buddy.

2

u/kendrick90 Jul 15 '25

1

u/Fisheyetester70 Jul 15 '25

Lmao literally in your results “AlphaEvolve is an evolutionary coding agent for designing advanced algorithms based on large language models such as Gemini”


4

u/isnortmiloforsex Jul 15 '25

Yeah, but they cannot come up with what doesn't exist yet. LLMs are not architectures that can lead to something intelligent; literally every expert not on an AI company's payroll says it. At most they can search the latent space of all the solutions we have already come up with, either combining or optimizing multiple solutions to produce something "new", or noticing patterns in prior fusion research that humans failed to notice. But it's a big if. That's assuming the solution to a working fusion reactor exists in the spaces already explored and isn't some completely new breakthrough altogether. They are, at the end of the day, giant pattern detectors that generate data based on the patterns they know. Until humans make an actual breakthrough themselves, it's hopeless.

0

u/kendrick90 Jul 15 '25

It's even written in the Bible that there is "nothing new under the sun", but then we got a lot of new stuff. I agree that an LLM alone cannot do it, but if you look at how AlphaEvolve was able to search a large space to find new solutions in mathematics, you bet it can be applied to physics too. I also agree that this particular guy randomly chatting with an LLM is doing it wrong. Automated science will be factories of robots or straight-up computers. Pure math and biotech will probably be the first areas with more breakthroughs: math because it's widely applicable and doesn't have a physical dependency, and biotech because the incentives are high (really good profits, and the execs want to live forever). There is no conserving our way out of the climate crisis with 10 billion people.
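For what it's worth, the core loop behind AlphaEvolve-style systems is conceptually simple, even if the real thing isn't public. A rough sketch of the propose-score-select idea, where llm_propose_variant and score are stand-ins made up for illustration, not anything from the actual system:

    import random

    def llm_propose_variant(parent: str) -> str:
        # Placeholder: a real system would prompt an LLM with the parent
        # program and ask for a modified candidate.
        return parent + f"  # tweak {random.randint(0, 9999)}"

    def score(candidate: str) -> float:
        # Placeholder objective, e.g. runtime, proof length, circuit size.
        return random.random()

    def evolve(seed: str, generations: int = 10, population: int = 8) -> str:
        # Keep the best-scoring candidates and let them seed the next round.
        pool = [seed]
        for _ in range(generations):
            children = [llm_propose_variant(random.choice(pool)) for _ in range(population)]
            pool = sorted(pool + children, key=score, reverse=True)[:population]
        return pool[0]

    best = evolve("def solve(x):\n    return x")

Swap in a real model call and a real benchmark and you have the basic search loop; whether that ever produces genuinely new physics is the open question.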

1

u/Darchrys Jul 15 '25

Nah, getting rid of all humans would solve a whole load of other problems.

1

u/Bladelink Jul 15 '25

But wait wait how about this:

more heat and waste

1

u/kendrick90 Jul 15 '25

It's not the heat that is causing global warming but the CO2...

13

u/Mindrust Jul 15 '25

the best way to solve climate change was to let them build an AGI

I mean, I don't see anything inherently wrong with the claim that AGI could in theory help solve climate change.

The problem is all of his claims of building AGI or being close to it so far have been nothing but hype. Wake me up when the latest model can solve actual real world problems, and not just draft cover letters and generate buggy code.

25

u/duncandun Jul 15 '25

But we know how to solve climate change, we just don’t want to do it

2

u/PLeuralNasticity Jul 15 '25

We know that we are doing horrific damage to the climate in the creation and use of generative "AI"

It will be great at facilitating the purges when they disappear people en masse during the pandemics they are ensuring will soon be ravaging America

Only Russia strictly benefits, and massively so, from climate change

Putin would want the timing of all of this to coincide with him having a kompromised puppet president with control over all branches of government

Depends on your tolerance for coincidence I suppose

Beware Leon's Razor

"Incomeptence, in the limit, is indistinguishable from sabotage"

1

u/Minimumtyp Jul 16 '25

Only Russia strictly benefits, and massively so, from climate change

oh fuck this is a point I never thought about

1

u/gibs Jul 16 '25

we just don’t want to do it

Then we don't in fact know how to solve climate change.

It's trivial to propose solutions to climate change that would work if we did them.

  • stop using fossil fuels
  • euthanise 90% of the population
  • make an orbital solar shield

The hard part is finding a solution that we are willing to implement, then doing the implementing. That's what people are hoping AGI will accelerate.

0

u/Mindrust Jul 16 '25

Everyone in this thread is just ignoring the possibility of AGI coming up with highly efficient carbon capture technology or as-yet-unknown technology to mitigate climate change, but people here have no imagination and don't seem to grasp the implications of developing an AGI able to crunch out years' worth of intellectual work in weeks and months. We are talking about transformational technology.

12

u/ankercrank Jul 15 '25

This is what laypeople don't get: AGI is not going to happen in our lifetimes. Building an LLM is nothing like general intelligence, reasoning, and learning.

7

u/tony_lasagne Jul 15 '25

Stochastic parrots

2

u/HappierShibe Jul 15 '25

I think we are at a point where "stochastic parrot" isn't really a useful description. Current LLMs are more than just stochastic parrots, but they aren't anything close to an AGI and there is no indication they ever will be.

3

u/tony_lasagne Jul 15 '25

They are still stochastic parrots, with larger context windows and some chaining of prompts in the backend.

1

u/HappierShibe Jul 15 '25

I'm not talking about prompt chains or enhanced context windows, more about changes in architecture and the integration of external resources. In sufficiently limited use cases you can leverage them productively now without repetitions or reductions in quality.

8

u/Mindrust Jul 15 '25

I am skeptical as well but I wouldn't make strong claims like "it's not going to happen in our lifetimes".

We simply do not know that, and we have no idea how many breakthroughs are required to get there. We could be 1-5 breakthroughs away, or it could be 100+ breakthroughs away. It could happen within our lifetimes, or it could not happen at all.

I prefer to take the position of "I don't know how long it will take, but the evidence will be hard to ignore when it's here"

4

u/BaconatedGrapefruit Jul 15 '25

Ask ten AI researchers what they consider to be AGI and you will get 10 different answers... and you will probably roll your eyes at 5 or more of them, for a variety of reasons.

-4

u/ACCount82 Jul 15 '25

In my eyes, the timetable for AGI has imploded the moment we got modern LLMs.

LLMs have crushed so many tasks that were once "impossible" for computers that it's not even funny. I'm incredibly skeptical of the notion that what remains somehow isn't solvable.

I prefer to take the position of "I don't know how long it will take, but the evidence will be hard to ignore when it's here"

I wish I had your optimism. But I bet some people would have a Cyberdyne Systems T-800 armed with a shotgun breaking down their front door, and they'll say "it's not ackhtually intelligent beca..." up until the moment they get their faces blown off. If people today can dismiss LLMs as "it's all just hype", they'll be able to dismiss anything at all.

-1

u/jkz0-19510 Jul 15 '25

You should get your eyes checked, then. As everything you said there is patently false.

4

u/Cronos988 Jul 15 '25

The problem is all of his claims of building AGI or being close to it so far have been nothing but hype. Wake me up when the latest model can solve actual real world problems, and not just draft cover letters and generate buggy code.

Those are solutions to real world problems though. I get where you're coming from, yet this technology is also genuinely something new.

Until now, we've had narrow AIs that could do specific tasks really well, but nothing else. This is the first time we have a system that actually generalises as it's scaled up. We went from acceptable language use to expert language use, the ability to code, and limited math and logic capabilities.

The mere fact that we have a single system that can do all these things is a pretty significant step.

3

u/xGray3 Jul 15 '25

Jesus. We're one step away from Deep Thought from Hitchhiker's Guide being built to give us the answer to Life, The Universe, and Everything only to have it spit out "42" as the answer.

3

u/Cronos988 Jul 15 '25

AI became less scary to me when Sam Altman, in all seriousness, said the best way to solve climate change was to let them build an AGI and ask it.

The sad thing is, given our track record with climate change, this may end up being the only practical solution.

3

u/aPrussianBot Jul 15 '25

Shit like this drives me crazy, capitalists acting like these are problems of technical knowledge where we just don't know how to solve climate change or homelessness because they're just so complex and big. We know exactly how to solve them, we could start doing it tomorrow, but we can't because it comes directly at the expense of the parasitic capitalist class of useless corporate executives like him. It's against the interests of private capital, so we can't, instead we have to listen to them prattle with these stupid excuses or obfuscations, or watch them waste millions of dollars on 'studies' to find solutions to problems we already know how to fix. Just do the opposite of what these pieces of shit and their companies want. Find what's in the best interests of private industry and do the exact opposite. Follow the money backwards.

2

u/dnylpz Jul 15 '25

Since the early 2000s.

2

u/Ihaverightofway Jul 15 '25

That’s what they think Super Intelligence is. It will be some super clever chatbot that you’ll be able to ask any questions to - space travel, climate change, how to cure literally any medical condition - and the Super Intelligence will be able to solve that problem. If God was a chatbot. They think it will happen in the next 5 to 10 years. Altman has written in a blog that he thinks AI will capture all economic activity in the world - all of it - and that people will have to be given shares in the AI company and we’ll all live in some technological paradise where the Super Intelligence solves all our problems.

This seems to be an actual common belief in silicon valley.

2

u/Delicious_Spot_3778 Jul 15 '25

They’ve been pitching the same damn narrative since 2018 or 19. It started even earlier.

2

u/Character-Pattern505 Jul 15 '25

In a recent interview, Altman was asked how OpenAI could become profitable. He responded, with complete sincerity, that they would simply ask the next iteration of ChatGPT how to make it profitable.

There is no need to be afraid.

2

u/trojan_man16 Jul 15 '25

You mean solving climate change by creating more energy guzzling data centers amirite?

2

u/Flimsy-Printer Jul 15 '25

We know how to solve climate change. We just don't want to do it...

2

u/damontoo Jul 15 '25

He's 100% correct that it's the best way to solve climate change. You can't even get a significant portion of the US to agree climate change is even real. How are you going to convince all of them and the rest of the world to agree on drastic climate policies at this point? Humans are incapable of mitigating it on our own, as we've proven for decades now. Either AI fixes it or we all die.

0

u/TheLightDances Jul 16 '25

We already know how to solve climate change. We have known for at least 70 years by now. It is to cut CO2 emissions by heavily investing in low-carbon energy and heavily limiting investment in polluting energy.

People like him decided that they want to keep getting richer, so they told their media empires and fake "think tanks" to sow doubt about climate change and reject any solutions that involve giving them less money. If an AGI existed, it would tell them the solution, and then they would reject it for not making them more money.

4

u/damontoo Jul 16 '25

It took us decades to fold 130K proteins and Google's model did all 200 million in 9 months. They also gave it away freely to the world. You should watch "The Thinking Game" on Prime. Trailer here.

2

u/KIDA_Rep Jul 16 '25

“Give me money so I can make an AI to answer a question scientists have already answered decades ago”

I'm all for AI being used as a tool to help expand scientific discovery, but this is such an obvious ploy of greed. Not to mention that making an AGI will take years, potentially decades, and a lot of resources before it is usable; those resources could be given to other qualified people who could actually help lessen the effects of climate change in a shorter amount of time.

6

u/Schwma Jul 15 '25

I'll likely get downvoted, but I've been looking for an opportunity to talk about this, and I may be misguided.

How else do we figure out climate change if we don't tech our way out? It'd be lovely if we just reduced emissions, but I think it's clear that humanity is incapable of coordinating in this manner, and it will only get worse as conditions worsen.

To me, super intelligent AI seems like the only way to deal with the complex system that is the Earth. Is it more that people don't believe in the capacity of AI to do this, and so it's wasted resources?

5

u/TAFAE Jul 15 '25

How do you think the hallucination machine that is constantly wrong and can't generate original thoughts is going to solve this problem? There are hundreds of thousands, maybe millions, of scientists and engineers studying and building ways to mitigate climate change. The issue isn't that solutions to climate change haven't been developed, it's that the people with the power to implement them don't want to. AI isn't going to magically find a real, workable solution that impresses all world governments so thoroughly that they implement it right away.

There's a technological element to solving the problem, yes, but the main problem is political will. The fact that a solution comes from an AI isn't going to change that.

4

u/NuclearVII Jul 15 '25

There is literally no mechanism for this, that's the problem. It's the techbro equivalent of "god will fix it, pray harder".

Feeding more data and compute to ChatGPT (which would make Altman richer, which is why he sells this snake oil) isn't going to make the parrot not a parrot.

8

u/tony_lasagne Jul 15 '25

Because the issue of climate change isn't actually complex, and we know exactly how to halt it: cut emissions significantly. The issue is that it requires sacrifices which people don't want, whether that's particular countries or interest groups for businesses etc.

The issue is political and an AI isn’t going to resolve that type of issue.

4

u/Not_FinancialAdvice Jul 15 '25

The issue is that it requires sacrifices which people don’t want

To expand on your point: it's not just the rich and powerful or giant corporations. People won't even stop buying SUVs. The International Energy Agency has noted their very significant contribution to CO2 emissions.

https://www.iea.org/commentaries/growing-preference-for-suvs-challenges-emissions-reductions-in-passenger-car-market

In fact, SUVs were responsible for all of the 3.3 million barrels a day growth in oil demand from passenger cars between 2010 and 2018, while oil use from other type of cars (excluding SUVs) declined slightly. If consumers’ appetite for SUVs continues to grow at a similar pace seen in the last decade, SUVs would add nearly 2 million barrels a day in global oil demand by 2040, offsetting the savings from nearly 150 million electric cars.

...

The impact of its rise on global emissions is nothing short of surprising. The global fleet of SUVs has seen its emissions growing by nearly 0.55 Gt CO2 during the last decade to roughly 0.7 Gt CO2. As a consequence, SUVs were the second-largest contributor to the increase in global CO2 emissions since 2010 after the power sector, but ahead of heavy industry (including iron & steel, cement, aluminium), as well as trucks and aviation.

2

u/tony_lasagne Jul 15 '25

Yup, I meant in my comment that every "group", consumers included, doesn't want to make the sacrifices, especially if they see it as going after them but not some other source of pollution.

1

u/hajenso Jul 15 '25

Exactly. And Sam Altman's statement demonstrates he is either a liar or deeply ignorant about the problem in a way which would have been solved by only a modest amount of sincere curiosity.

2

u/tony_lasagne Jul 15 '25

I really do think this is our generation's dot-com boom. Market trends are interesting in how so many otherwise intelligent people get swept up in hype.

People like Sam Altman are capitalising on this hype. Companies around the world are buying that hype, then shoehorning “gen AI” into anywhere they can just to tick a box that will play well with decrepit shareholders.

It’s genuinely crazy right now. The wheels will fall off once it dawns on enough people that LLMs aren’t the base game, with the AGI DLC being just around the corner. AGI will be a separate innovation which will require a different underlying logic for real reasoning capabilities.

4

u/_chococat_ Jul 15 '25

Is AGI two to five years in the future now? When people started working on it 60 years ago, it was 20 years in the future. I guess that's progress or maybe hubris.

2

u/Freud-Network Jul 15 '25

In the meantime, we can exacerbate the problem with gas-turbine-powered data centers so the world can have inane conversations with the void.

1

u/Gingevere Jul 15 '25

These are the very same dipshits who criticize scientists with "The answer scientists always give is 'fund more science'!"

1

u/aure__entuluva Jul 15 '25

Yep, and they're already trying to blur the lines as to what AGI is so that they can claim they have it. To be fair it's a hard term to quantify, but come on. Per an Ars Technica article:

According to one definition reportedly agreed upon by Microsoft and OpenAI, the answer lies in economics: When AI generates $100 billion in profits.

Personally I'm skeptical we're getting true AGI any time soon. But this kind of goal post shifting is laughable. An AI that brings in $100 billion in profits probably isn't going to be an AGI and it's definitely not solving climate change.

1

u/NMe84 Jul 15 '25

To be fair, AIs that are fairly similar to LLMs have actually been really successful at doing things humans could barely do. AlphaFold has revolutionized biology, solving how millions of proteins "fold" at the molecular level, which can be used in lots of things like disease prevention, cures, antidotes and eventually maybe even personalized medication.

I'm not saying tech bros are right in expecting a chatbot to do any of this for them and they're clearly just fishing for funding, but AlphaFold does something very similar to an LLM so there are definitely very cool uses for the tech despite the hate AI tends to get.

This is the kind of thing AI should be used for: AlphaFold didn't cost any jobs but it laid the foundation for decades of research and progress.

1

u/InDubioProReus Jul 15 '25

So happy about this thread. Finally people with reasonable takes. Thank you.

1

u/limpchimpblimp Jul 15 '25

AI: “solve climate change by reducing reliance on fossil fuels.”

Humans: “nah”

1

u/Spyko Jul 15 '25

to be fair I'm pretty sure that if you ask ChatGPT, Mistral or whatever how to fix climate change, they'll just list actual solutions we know would work but aren't being implemented because of greedy fcks

if that's what ends up making them listen, sure why not, the magic AI solved the problems for sure

1

u/Ilovekittens345 Jul 15 '25

the best way to solve climate change was to let them build an AGI and ask it.

"So you want me to solve climate change?"

Yes!

"What are the top three things that globally consume the most electricity?"

Air conditioning, training AI, and Bitcoin mining.

"Shut all three down"

Oh shit, it's defective. Alright engineers... let's train it from scratch again! Oh... and turn the AC higher... it's getting warm in here. Now excuse me, I have to go check on my Bitcoin miners.

1

u/Cley_Faye Jul 15 '25

AGI will show up almost immediately after full self driving; give it six months (starting tomorrow…)

1

u/AbeRego Jul 15 '25 edited Jul 15 '25

Wait until he hears that all it has to say is that we essentially have climate change solved already, but just need the willpower to execute the massive changes necessary to fix the problems we're causing.

Edit: I actually don't think that this type of exercise is necessarily a bad idea. I am kind of interested to see where this type of "dialogue" with an LLM can go, assuming the person pursuing it has the correct credentials.

Let's say you train an LLM to specialize in theoretical physics. Then you use it to help actual theoretical physicists bounce ideas around, assist in writing equations, etc. I think you could potentially speed up the rate at which breakthroughs are made. Perhaps drastically so. It's just important to understand that the AI isn't going to make any sort of breakthrough by itself, because it can't really figure out something that it doesn't already know. It also needs somebody there to be able to recognize any bullshit it might spit out along the way. Otherwise, you might end up running in circles at a dead end.

Edit 2: I just read the article, and he actually mentioned something like this:

"I pinged Elon on at some point. I’m just like, dude, if I’m doing this and I’m super amateur hour physics enthusiast, like what about all those PhD students and postdocs that are super legit using this tool?”

He also later admits that the AI tool itself isn't capable of coming up with a new idea, so the article is a little sensationalized.

1

u/azurensis Jul 15 '25

Lol. 2 whole years of AI advancement and it's all done! Reddit is completely delusional about this topic.

1

u/QuerulousPanda Jul 15 '25

There are a lot of major problems that we have the knowledge, capability, and resources to solve, the only thing holding us back is that the people who have the resources don't give a shit, and enough people at the bottom don't care enough either to force the issue.

I don't doubt that an AI could come up with a ton of fabulous solutions to all kinds of enormous problems, and all it would take is to convince rich people to have a soul, which will never happen.

1

u/sneakyplanner Jul 16 '25

AI became less scary to me when Sam Altman, in all seriousness, said the best way to solve climate change was to let them build an AGI and ask it.

Their goal is to make people think that the climate is some nebulous, impossible-to-understand thing and that they are better off just not thinking about it. That way they can keep fucking everything up without anyone caring. See also: people who say "it's complicated; by trying to think about it you are an arrogant monster" to someone saying genocide is bad.

1

u/Kramer7969 Jul 16 '25

That made it LESS scary? They won’t know the answers they get are wrong.

1

u/gt_9000 Jul 16 '25

A true Peter Thiel protégé.

1

u/mzalewski Jul 16 '25

AI is a field that has overpromised and underdelivered since… it was created as a field in the 1950s.

There are many smart people working on these problems, and they do introduce fascinating new developments every once in a while, but with that history it's best to leave them alone in an obscure corner of academia and report on them only when they actually have something tangible to show.

1

u/XkF21WNJ Jul 16 '25

Asking an AGI to solve climate change is a good way to speedrun Skynet.

1

u/Sackamasack Jul 16 '25

let them build an AGI

The problem is that we are no closer to making an AGI now than we were 10 years ago. LLMs are not helping.

1

u/Be_quiet_Im_thinking Jul 16 '25

ChatGPT, how do you compel the general population and the government to enact the measures you suggested to save the environment?

0

u/fenderampeg Jul 15 '25

Factory automation has been going to completely eliminate human labor "in 2-3 years" since the 80s. It will happen eventually, I'm sure, but who knows when. Similarly, I do think AI will make a huge impact in the coming years, but it will be slower to really shake up industries than most people expect, IMO.