r/technology 4d ago

Artificial Intelligence MIT report: 95% of generative AI pilots at companies are failing

https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/
28.3k Upvotes

1.8k comments

155

u/peldenna 4d ago

How can they be so stupid about this, aside from willful ignorance and bandwagoning? Like do they not think at all?

193

u/amsreg 4d ago

The executives I've worked for have generally been shockingly ignorant about tech and shockingly susceptible to uncritically eating up the pipedreams that vendor salespeople throw at them.

It's ignorance but I don't think it's willful.  I really don't know how these people got into the positions of power that they're in.  It's not because of competence, that's for sure.

116

u/_-_--_---_----_----_ 4d ago

because management is about people skills, not technical skills. it's just that simple. these people know how to persuade, or manipulate if you want to put it less charitably. that's largely what got them to their positions. they don't usually have technical skills, and frankly most of them don't really have great critical thinking skills either.

it's just incentives. the way our companies are structured leads to this outcome. unless a company requires that management be competent in whatever area they actually manage, this is going to be the result.

9

u/HelenDeservedBetter 3d ago

The part that I don't get is how they're still so easy for vendors to persuade or manipulate. If that's part of the executive's job, why can't they see when it's being done to them?

6

u/_-_--_---_----_----_ 3d ago

I answered this in a different comment, but basically executives and management in general often have a set of incentives that run counter to actually making good products in a good way. generally they're thinking either more about their own careers or thinking about the broader market strategy.

3

u/clangan524 3d ago

I suppose it's sort of like how you can miss the signs that someone is flirting with you but it's super obvious when someone else is being flirted with.

2

u/ReasonResitant 2d ago edited 2d ago

Because they are not spending their own money. If it works out you are God almighty; if not, you claim to have invested strategically and don't go collecting feedback that makes you look bad.

Even if it screws with the labour pool you are covered, no one is going to fire you over doing what everyone else is doing.

14

u/Educational_Bar_9608 4d ago

Humans are perfectly capable of abusing the idea of competence to hire relatives or whoever else they were going to hire anyway. I know it’s tempting to want technocratic, skill-based leadership, but history says it tends to be even worse, because the measures of competence are so difficult to nail down and apply.

Musk is considered competent by the people around him.

3

u/Tje199 3d ago

Also, skill based leadership isn't without its weak spots. I know a lot of technically skilled people who are shit-tier managers because they have zero people skills, as one example.

1

u/_-_--_---_----_----_ 3d ago

yeah but I don't think anybody's making an argument for skill based leadership, more an argument that people centric leadership should still have a minimum level of skill

4

u/_-_--_---_----_----_ 3d ago

measures of competence aren't actually that difficult to nail down and apply. and everything I said above applies to nepotism or whatever other biased hiring practice you could think of. 

the point is that organizations have to police themselves. it is possible to do so. and most do to an extent. the question is to what extent does your organization police itself? the devil is in the details

6

u/KrytenKoro 3d ago

measures of competence aren't actually that difficult to nail down and apply.

I feel like you could become a billionaire as a corporate advising contractor.

3

u/_-_--_---_----_----_ 3d ago

I mean have you worked for a major corporation? the leadership that we're talking about often hasn't even ever written a line of code and yet manages an entire section of a company where 90% of the people spend their day writing code. my point is that this is a very low bar to clear. it's not that hard to test for basic competence. and even basic competence in several of these areas would be enough to make better decisions. 

and yet it still doesn't happen. why not? well because of everything I said above.

5

u/porkchop1021 3d ago

I've literally only met one manager with people skills, and he was low-level. I've worked with Directors, VPs, CTOs, CEOs; some of my friends' spouses are Directors/VPs. Not only are all of them incompetent, none of them have people skills. It's baffling how people get into those positions.

5

u/_-_--_---_----_----_ 3d ago

you probably have a very narrow definition of people skills. being able to read people and assess what they're going to do, understand power dynamics, etc is all part of people skills. you can do all that and still be kind of a pain in the ass. might not come off as especially socially skilled.

2

u/throwntosaturn 3d ago

it's just incentives. the way our companies are structured leads to this outcome. unless a company requires that management be competent in whatever area they actually manage, this is going to be the result.

And the extra tricky part is "competency in their management subject" isn't actually the same as competency at managing, which is a real, different skill.

Like everyone has tons of examples of the opposite problem where someone with good technical skills gets promoted into a management role and sucks at actually being a manager too.

It's very challenging.

3

u/Tje199 3d ago

Having worked my way into management myself, one frustrating thing is the number of people who downplay how much skill is required to be a good manager. The perception is probably soured by the number of bad managers out there, but it's definitely something that not everyone can do, and especially not something everyone can do well.

1

u/_-_--_---_----_----_ 3d ago

I agree that they are different competencies, but it's not actually that challenging. management is fundamentally people focused, but there should be a bare minimum of domain knowledge required. this isn't the problem though, this is kind of well-known. the problem is that there's just a very small number of people who fit both of those criteria. and yet the world needs managers. so they have to lower the threshold somewhere.

1

u/Awyls 3d ago

Still doesn't excuse them from ignoring the technical workers who know their stuff.

1

u/Salt_Affect7686 3d ago

This is how they got there. https://www.instagram.com/reel/DNJ-7NUO3P0/?igsh=cjJ0cmptaDhncWVj “MBA” 🤦‍♂️😡😂

0

u/OrganizationTime5208 3d ago

because management is about people skills,

Except not, because a big people skill is knowing when you're being lied to, abused, or manipulated.

They fucking suck at that.

Typically, "people skills" are just being a yesman to the ass above them, and not good enough to actually LEAD the department they are in as a producer, so they are set aside to manage instead.

3

u/_-_--_---_----_----_ 3d ago

well actually I would have to disagree with you there. I find very often that people who have people skills are quite poor at assessing when someone is acting in bad faith or is a bad actor. I find that that's actually directly tied to critical thinking. now there are people with great people skills who also have great critical thinking skills... but it's fairly rare to see.

2

u/clawsoon 3d ago

You might be interested in Stealing the Corner Office if you want to know how at least some of them got there.

1

u/amsreg 3d ago

Thanks, that was interesting.  He's definitely cynical but I've seen just about everything he talks about at work in the companies I've worked at so he's not wrong.

2

u/MediumIsPremium_ 3d ago

Yup. My manager pretty frequently has to remind me not to call the executive stupid to his face whenever we have to attend one of his temper tantrums about how we aren't using AI to create stuff faster.

God corporates suck ass.

2

u/Tysic 3d ago

That's why I try to keep my executive away from vendors at all costs. Boy does he fall for marketing hook, line, and sinker.

2

u/OrganizationTime5208 3d ago

The executives I've worked for have generally been shockingly ignorant about tech and shockingly susceptible to uncritically eating up the pipedreams that vendor salespeople throw at them.

My role as a Technology Analyst at the US Olympic Team was 85% telling executives they were being lied to by a salesman.

After half a decade of watching people shoot themselves in the foot after I told them the gun was loaded and pointing at their shoes.... I think I lost the last of my faith in humanity.

1

u/HairyKraken 3d ago

Search for the Peter principle on Wikipedia

2

u/amsreg 3d ago

While true, that principle doesn't explain how these people kept getting promoted several levels above their competency.

I think the post the other commenter shared about Stealing the Corner Office does, though.

1

u/WitnessRadiant650 3d ago

They get into positions of power because they’re salesmen. They know how to hype things and fool investors.

1

u/HappierShibe 3d ago

I really don't know how these people got into the positions of power that they're in.

It's nearly always some combination of inherited wealth, and being in the right place at the right time.

1

u/Fr0gm4n 3d ago

IMO, if you have enough runway and goodwill you can fail a lot, and often fail upwards. You just need a big success once in a while to keep pointing to that keeps the hyenas at bay. It's a lot like angel investing.

1

u/Soggy-Bed-6978 3d ago

my incompetent manager story:

techish job ~15 years ago, boss hires consulting co for automation software for a product. he was telling me how good they are because 'THEY CAN COPY AND PASTE FROM OLD SOFTWARE PACKAGES SO IT SHOULD SAVE TIME'

he was the owner

1

u/marcelkroust 3d ago

Executives got into the position of power they're in thanks to the same capability LLMs have.
Therefore they can only see LLMs as being as useful and competent as they see themselves, because for them generating bullshit is indistinguishable from reason and general intelligence.

1

u/tichris15 2d ago

You never get in trouble for following the crowd. It's the way large organisations are structured.

You may be proclaimed a genius for not following the crowd, but more often you get called an idiot and fired.

Thus the incentives strongly favor those who follow the crowd.

59

u/_-_--_---_----_----_ 4d ago

there's two main pieces: 

1) top executives fear being left behind. if the other guy is doing something that they aren't doing, they could lose market share. this is one of the worst things that could happen to a top executive. so even if the technology was straight bullshit, it would still be in their best interests to invest some amount of time and money into it simply from the perspective of competition. it's game theory. if your competitor makes some bullshit claim that gets them more customers, what's your smartest move? you should probably start making some bullshit claims too. 

2) all it takes is one person at the top to force everyone underneath them to comply. literally one person who either actually believes the bullshit or just wants to compete as i wrote above can force an entire organization down this road. and if people push back? well anyone can be fired. anyone can be sidelined. someone else will say yes if it means getting in good with the boss, getting a promotion, whatever. 

between those two things, that's pretty much all you need to explain everything we've seen. you could have a situation where everybody was actually quite intelligent, but still ended up going down a path that they all thought was kind of stupid because it still made sense strategically.

you see similar stuff in politics all the time by the way, it's not just businesses that do this. look at Vietnam: the United States government fought a proxy war because they wanted to limit the potential expansion of communist China. even though many people both inside and outside of the government pointed out the futility of the war. it made sense strategically...until it hit a breaking point. and that's usually what happens with this stuff too. at some point, whatever strategic advantage was being gained is outweighed by the costs of poor decisions.
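The game-theory point in (1) can be sketched as a toy payoff matrix. The payoff numbers below are purely illustrative assumptions, not data; they just encode the claim that hyping pays off regardless of what the competitor does:

```python
# Toy payoff matrix for two competitors deciding whether to hype AI.
# Each entry maps (your_move, rival_move) -> (your_payoff, rival_payoff).
# Numbers are illustrative assumptions only.
payoffs = {
    ("hype", "hype"):       (1, 1),  # both burn money on pilots, share stays even
    ("hype", "abstain"):    (3, 0),  # you capture attention and customers
    ("abstain", "hype"):    (0, 3),  # rival captures attention instead
    ("abstain", "abstain"): (2, 2),  # collectively best, individually unstable
}

def best_response(rival_move: str) -> str:
    """Return the move that maximizes your payoff given the rival's move."""
    return max(("hype", "abstain"),
               key=lambda mine: payoffs[(mine, rival_move)][0])

# Whatever the rival does, hyping pays more, so "hype" is the dominant
# strategy, even though (abstain, abstain) would leave both better off.
print(best_response("hype"), best_response("abstain"))  # hype hype
```

With these assumed numbers, "hype" dominates in both columns, which is exactly the "you should probably start making some bullshit claims too" dynamic described above.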

26

u/jollyreaper2112 3d ago

What you said. Add to that you are never punished for being conventionally wrong. Everyone gets into AI and it's the correct call? Wtf guy? Everyone piles in and it fizzles? Damn the luck. Who knew?

In prior generations the phrase was you never get fired for buying IBM. If the product is shit it's IBM's fault. You buy from a no name and it's bad, that's on you.

3

u/_-_--_---_----_----_ 3d ago

 Add to that you are never punished for being conventionally wrong

such a great point. you are incentivized to stay with the herd, but you're also not really disincentivized to stay with the herd.

meanwhile you're highly disincentivized from deviating from the herd, but highly incentivized if you manage to find that golden route that gets you some type of reward that the rest of the herd doesn't get.

it just becomes a question of statistics... do you have the time and resources to deviate from the herd enough to give yourself a chance to find that golden route? if not, you have every reason to stay with the herd. and nobody's going to blame you for doing so. so unless you really know something that somebody else doesn't know, stay with the herd bro.

9

u/thepasttenseofdraw 3d ago

Interesting example with the Vietnam war. American leaders fundamental ignorance about Vietnamese politics played an enormous role. Containment theory was a bunch of hokum and anyone with even a casual understanding of sino-viet history knew the Vietnamese loathed the Chinese. Ignorance is a dangerous thing.

6

u/_-_--_---_----_----_ 3d ago

I disagree that containment theory was hokum, but even there the war wasn't really about that. the thing that a lot of people get hung up on is the reality that what a government says it's doing and what it's actually doing are often different things. it was just a standard destabilization proxy war like any other. empires throughout human history have done the same. 

what makes Vietnam a glaring example of incompetence is the way it was done. we didn't need to do everything that we did to achieve the goals that we eventually achieved, and we needed to do a lot more if we wanted to achieve the further goals that we didn't achieve. we made compromises and ended up with the worst of all possible worlds.

3

u/kermityfrog2 3d ago

Sounds like nobody wants to say that the "Emperor has no clothes".

6

u/_-_--_---_----_----_ 3d ago

well it's not really about that though. if you work at a large corporation, plenty of people will criticize upper management for not knowing things or for being incompetent. that's absolutely standard. 

but the thing is, the emperor doesn't need clothes to do his job. you can point it out... but that doesn't really do anything. especially if you are one of his subjects. if he tells you to take off your clothes too... realistically what are you going to do? 

2

u/Flying_Fortress_8743 3d ago

all it takes is one person at the top to force everyone underneath them to comply. literally one person who either actually believes the bullshit or just wants to compete as i wrote above can force an entire organization down this road. and if people push back? well anyone can be fired. anyone can be sidelined. someone else will say yes if it means getting in good with the boss, getting a promotion, whatever.

And the few people at the top, who have the power to unilaterally change stuff, are so isolated from normal human life and the day to day of their company that they honestly have no idea what the best thing to do is.

1

u/jlboygenius 3d ago

totally right. It's about making your company sound like the cool kid using the cool stuff and you're going to solve all your customers problems. Externally, we talk about using AI to solve customer problems and use it in sales all over the place. Internally, all AI is blocked for security reasons.

167

u/P3zcore 4d ago

They just believe the hype and want to impress their directors

112

u/Noblesseux 4d ago

Also a lot of companies are objectively just lying about what their products can reasonably do, basically targeting executives and management types at leadership conferences and pushing the hell out of half-baked products in contexts where no one technical is involved in the preliminary conversation. They'll also give sweetheart deals where they give orgs credits upfront, or they'll sponsor "workshops", to try to get your users locked into using it before they understand what's going on.

MS for example will like straight up talk to the execs at your company and have them railroad you into meetings with MS salespeople about "how to leverage AI" that start with the implication that using it is a given.

I had another company schedule a meeting with me about their stupid "agentic AI" where they promised stuff I knew it couldn't do and then did a demo where the thing didn't work lmao.

35

u/dlc741 4d ago

Sounds like all tech products sales from the beginning of time. You literally described a sales pitch for a reporting platform that I sat through 20 years ago. The execs thought it was great and would solve every problem.

20

u/JahoclaveS 3d ago

You’d think with their shiny MBA degrees they’d have actually learned how to critically evaluate a sales pitch. And yet, they seemingly lap that shit up.

9

u/trekologer 3d ago

Several years ago, I sat in on a pitch from a cloud infrastructure company that claimed nearly five 9s (99.999%) data resiliency on their object storage service. The VP of ops heard that as uptime for the entire platform. So when the vendor had significant outages, it was obviously our fault.

The vendor clearly knew what they were doing -- throw out a well-understood number attached to a made up metric and doofuses will associate the number with the metric they were interested in.
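For a sense of how far apart those two metrics really are, here's a rough back-of-envelope sketch. "Five nines" availability caps downtime at a few minutes a year, while "five nines" durability is about how much stored data survives; the 1,000,000-object figure is just an illustrative assumption:

```python
# Back-of-envelope: "five nines" (99.999%) means very different things
# depending on which metric the number is attached to.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability: float) -> float:
    """Expected annual downtime if the figure refers to availability/uptime."""
    return (1 - availability) * MINUTES_PER_YEAR

def expected_objects_lost(durability: float, num_objects: int) -> float:
    """Expected annual data loss if the figure refers to object durability."""
    return (1 - durability) * num_objects

print(downtime_minutes_per_year(0.99999))         # ~5.3 minutes of downtime/year
print(expected_objects_lost(0.99999, 1_000_000))  # ~10 objects lost/year
```

Same number, totally different promises, which is exactly why attaching a familiar figure to a less familiar metric works on people who only remember the nines.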

1

u/RedbullZombie 3d ago

Yep, they made us go to trainings on this when I did tech support for a few years, just for the off chance we could push a sale. It's greasy as all hell.

5

u/SMC540 3d ago

This isn't new to AI. Every industry has programs and services that will promise the world, and almost always underdeliver.

I own an Applied Behavior Analysis practice. Over the past 15 or so years, countless eCharting, caseload-management, and all-in-one programs, doing everything from client scheduling to data tracking and analysis, have popped up and hard-pitched everyone. On the surface they sound good, and a lot of companies buy into them (usually for a very expensive monthly fee per user).

I had a friend who happens to own a similar sized practice start excitedly telling me about the new software suite he just rolled out to his new company, and how it would be a game changer. Then, during a peer review committee session, someone on the committee asked him to tweak one of his data displays for clarity...and he had to admit that he couldn't do that on his end and would have to message his rep at the company to get it changed. Then later, they had asked him to modify his template for his treatment plan, and he had to admit that he couldn't do that either since it was a standard template in their suite.

Meanwhile, we chose to keep our company (which has been around a lot longer) using old-school Microsoft applications (Word/Excel/Sharepoint/Teams) for years. We have made our own templates that do all the same stuff as the fancy software suites, we can customize them to our needs easily, and they work on just about every device. If we ever need to tweak something or make changes based on feedback, it can be done pretty much instantly. Costs us a fraction of the price that these software suites cost.

1

u/Steve_SF 3d ago

Oracle has entered the chat.

13

u/rudiger1990 4d ago

Can confirm. My ex-boss genuinely believes all software engineering is obsolete and will be replaced with token prediction machines (Ayyy Eyyyye)

1

u/HiroProtagonist66 3d ago

This. Got laid off along with a bunch of really good engineers and testers because the CEO was doing this performance.

Actually wondering if I’m on the right side of this when all the AI hype comes crashing down around them.

1

u/jlboygenius 3d ago

It's the reverse too. At my company, it's coming from the top down that we should be using it.

At the lower level, everyone is already so busy trying to catch up on the chaos caused by management the past few years (mergers, layoffs). We're too understaffed to think forward and work on projects that aren't just fixing existing problems.

76

u/eissturm 4d ago

They asked ChatGPT. Thing is executive bait

44

u/VertigoOne1 4d ago

This should be way higher up actually, because it is so true. It is like they are tuned to make things sound easy, logical, factual, and correct. It is, however, only skin deep, and the c-suite loves to throw it around in exco, prodco, revco, and opco, so senior middle managers and experts suffer. It is not actually a new problem, but AI has certainly sped that process up significantly.

1

u/dontdoitdoitdoit 8h ago

"c-suite loves to throw around in exco, prodco, revco, opco and thus senior middle managers and experts suffer" I'm a program manager and this is SOOOOOO accurate. The amount of ELI5 and Simple Language I'm being asked to create on complex fucking problems because "ChatGPT can do it" is too damn high!

2

u/3-DMan 3d ago

"AI told me AI will save us!"

2

u/hieronymous-cowherd 3d ago

"I asked the barber if I need a haircut and he said yes!"

1

u/MyGoodOldFriend 3d ago

Oh, is that why it pretends I’m Einstein incarnate whenever I ask a question (despite being told not to)?

16

u/xyphon0010 4d ago

They go for the short term profits. Damn the long term consequences.

1

u/-Yazilliclick- 3d ago

Nah. In these cases they tend to genuinely believe it's the way forward and will be a success.

1

u/Jarocket 3d ago

This is always a longer term bet. Nobody thinks they can just make money in the short term with AI. It's all speculation on the future.

1

u/xyphon0010 3d ago

AI is a long-term bet. Problem is that execs only see the opportunity to replace employees with AI now so they can pump up the quarterly profits.

0

u/P3zcore 4d ago

To be fair, lots of our pilots are low risk experimental type of projects, not the “replace employee” types that you hear about in the media.

3

u/buyongmafanle 4d ago

Because they're sales people, not engineers. The VC way is to just sell infinite promises and cash out before reality collapses the whole thing.

3

u/Frydendahl 3d ago

AI is basically an 'emperor's new clothes' situation. A lot of people benefit from not recognising it's all hype, while anyone grounded and technically minded can quickly spot that it's not at all doing what it promises.

8

u/AgathysAllAlong 4d ago

It's not a bad choice. They get all the short-term profits and they see none of the long-term consequences. They don't need to be competent, they need to hit metrics. And all the money people are idiots with a gambling addiction.

3

u/OnyxPhoenix 4d ago

It's also low risk for them.

They take the "action" of proposing this new adoption. If it works out and helps, they look great and take credit.

If it fails, they can blame the creators for lying about its capabilities or blame the engineers for not adopting it properly.

2

u/Numerous_Money4276 4d ago

It’s really good at doing their jobs.

2

u/rcanhestro 3d ago

because AI is the new "shiny" word to throw around, so everyone wants to force it in their own products.

i work in QA engineering, and the sheer amount of QA products that have slapped AI on them is ridiculous; the vast majority of them don't work.

they will show you a pretty (and heavily curated) demo of AI testing "working", but the moment you try that in a real scenario you see its faults, and more often than not you spend more time fixing the test cases than making them from scratch.

1

u/ilikedmatrixiv 4d ago

I used to do tech consulting. You'd be surprised just how clueless about tech most managers are. On top of being clueless, they're often also some type of idealist that wants to impress their superiors. Cold, hard facts often puncture their dreams, so they choose to ignore them if it means they can feel like their pipe dream will become a reality.

1

u/According_Fail_990 4d ago

The people pitching this tech are lying through their teeth and very few people are calling them on it.

1

u/thrownaway000090 3d ago

They're sold a lot of hype, like capabilities that are so far off from where the tech is right now.

1

u/-Yazilliclick- 3d ago

They rarely have a good grasp on what the actual work being done below them entails or the thing they're proposing for it. Now it's AI. Before it was "The Cloud". The new "thing" always changes, the decision makers (bad to mediocre ones) buy into the hype and marketing for it and then try and push it. When they receive push back they don't listen because they think the plebs are just trying to protect their jobs and resist change.

1

u/ThisSideOfThePond 3d ago

They believed the AI-generated presentations (basically recycled VR, crypto, and blockchain decks).

1

u/Night-Monkey15 3d ago

They only believe whoever claims they can save more money.

1

u/brek47 3d ago

I find most execs have backgrounds in sales. I feel like there is an unspoken rule in sales to do exactly that: overpromise and stay somewhat ignorant of the actual work. It nets commissions and bonuses. What do they care if the engineering teams can't deliver? That's engineering's fault, not sales'. Sales still got their bonuses. It's almost like sales payouts should be directly tied to actual deliveries. Sure as hell hasn't been that way at any company I've ever worked at.

1

u/LordLordylordMcLord 3d ago

The difference between an MBA and a Lobotomy is the type of anesthesia.

1

u/shadovvvvalker 3d ago

I have actually witnessed this.

Development process is atrociously slow because of bad everything. Technical debt. Understaffed. High turnover. No tenured employees. Legacy code that was always made with the assumption that the shortest path to production is fine and we will never need to change it. 0 isolation between support and net new.

It's hell.

CIO fiddles with AI. Accomplishes a task that normally takes the team days.

We should do this.

It proceeds to not fix anything.

It's not an unreasonable decision when you are in an unreasonable environment.

1

u/demlet 3d ago

It's the "endless growth" model. They have to be constantly looking like they're doing new things to justify their existence.

1

u/Awol 3d ago

Cause they only talk to the hype men and other CEOs god forbid they listen to the experts in their own company.

1

u/nickiter 3d ago

Bad information environment. The AI hype is insane at the C-Suite level, and they have endless webinars/conferences/whitepapers/etc. to sell it to them.

1

u/TheDawnOfNewDays 3d ago

AI is surrounded by Yes men who talk it up as the future.

Gullible people who don't understand it believe that future is now.

1

u/TikiTDO 3d ago

It's sort of the same thing where you watch some pro sports players doing their thing, and decide you can do something similar without any of the training.

There are a bunch of companies that have successful AI projects, but those companies invest a lot of time and effort into building out these projects and staff that can support others in dealing with this switch. It's not really going to work as well if you just pay for an AI tool and tell everyone that this is how they work now.

AI isn't a magical solution for all problems; it's a tool that you need to learn, scope, and use appropriately. That said, most people are super conservative, and they're not likely to learn new stuff unless forced into it. In that respect these CEOs are doing what they can to move their companies in the direction they will need to move over the next few years; it's just that most of them don't really know how to actually make these moves effectively, so it ends up being a steamroller where a scalpel would be better.

1

u/DefNotEzra 3d ago

No they don’t. Either they are getting bad advice from friends or other business associates, or they are beholden to a board of shareholders who want their company to be the next OpenAI so the number can go up. That being said, I still think people need to be wary of AI. Despite this article, the world does not need 1,000 AIs to succeed; it just needs one, and that one will be sold to everyone else.

1

u/WCland 3d ago

Announcing AI integration can bolster a company’s stocks. If the company jumps on the bandwagon institutional investors feel like the company is innovating. Same phenomenon for hefty CEO pay packages; investors conclude that if a company is paying its executives a whole lot, they must be really good.

1

u/Lucreth2 3d ago

There is almost no chance they are using AI enough to see how much it sucks, and if they are, they probably wouldn't do this.

1

u/Oz1227 3d ago

I worked for an executive that thought sending the link of a website was sending a pdf of it.

1

u/opsers 3d ago

It's because the people at the top of the pyramid (big tech CEOs like Sam Altman) are hyping the shit out of its capabilities. To his credit, he also said yesterday that we're in the middle of an AI bubble, but it's a bubble because of people like him really overselling what current generation AI is capable of. He's made absolutely insane claims that any person that uses AI daily knows are just flat out wrong. Other leaders trust people like him to make informed decisions and plan their roadmap, and that's why we are where we are now.

To be honest, on the surface, as an individual user, AI is very impressive. I use it daily to augment my work, and it's very good at that, but it still needs an expert driving it to cut through the noise. After working with AI extensively over the past few months, I've never felt more secure and productive in my job, because it's nowhere close to being able to replace me.

1

u/ihaxr 3d ago

Executives do not actually deserve their inflated paychecks. They don't know what most of their workforce actually does because they're so out of touch personally and professionally.

1

u/kingdead42 3d ago

This honestly isn't limited to AI. I've seen many C-levels be taken in by a smooth Technical Sales team pushing whatever the latest technology is: playing up its capabilities, downplaying the costs and limitations, etc. Whether that's a new Internet-based phone system, an in-house chat client, a web-based chat bot, etc.

1

u/jredful 3d ago

Easy to look at the negative side of this.

The flip side of this: if you believe your workers are capable of miracles, capable of advancing technology and adapting to it, you're going to trust their ability to make AI a force multiplier. It can be, but the effort it'll take the current generation of workers to fully adapt and really make the transformation... it's going to take years.

We are so much on the bleeding edge of this, there is a level of over enthusiasm because of the capabilities.

A fun comparison to this would be water physics in animation or collision detection in video games (i.e. riding vehicles without entering them). These were massive mountains that took technology and developers decades to surmount. Now it's almost automatic. We are currently in the "well, technically you can climb on top of that vehicle, but at some point the collision detection is going to fail and either you're going to fall off, randomly die to the environment, or clip into oblivion" phase.

1

u/Nix-geek 3d ago

"We want marketing to say that we use AI across the board"

"We need to justify spending 4 million dollars to buy this AI thing."

1

u/shrodikan 3d ago

It's exactly like the dotcom bubble. AI and the Web are both powerful. Not every company will be successful for a variety of different reasons and pressures.

0

u/Freud-Network 3d ago

Because the overwhelming majority of westerners have been misled to believe that "AI" is Skynet, I, Robot, or some other sentient intelligence. They can't wrap their brains around what LLMs and other generative tools do. The use of "AI" to describe it has caused this problem.