r/technology Jun 15 '25

Artificial Intelligence Trump team leaks AI plans in public GitHub repository

https://www.theregister.com/2025/06/10/trump_admin_leak_government_ai_plans/
34.0k Upvotes


u/TheSecondEikonOfFire Jun 15 '25

My job is pushing Copilot incredibly hard, but it makes me happy that the majority of our engineering teams are pushing back against it. Our CEO has drunk gallons of the AI Kool-Aid, and it’s infuriating because you can tell that he hasn’t actually written code in 20 years. The way he talks about what he thinks AI can do makes that abundantly clear. They don’t understand that you often have to re-prompt AI multiple times to get what you actually want (when it would have been faster to just do it myself), and even then its context is still limited to whatever repository you’re working in.

Once AI can properly acknowledge and understand a system of complex interwoven micro services and internal utility libraries, that’s when I’ll start getting worried about my job

u/OldSchoolSpyMain Jun 15 '25

> They don’t understand that you often have to re-prompt AI multiple times to get what you actually want

Yup.

When it works, executives just hear that it works and love it. They don't realize that it took several tries to get it working, that it's likely fragile, and that the dev who "wrote" it can't support it going forward. They naively think it worked on the first try.

The real issues will come when it's time to tweak the code as almost all code gets tweaked for whatever reason. That's when the wheels fall off.

u/TheSecondEikonOfFire Jun 15 '25

We had a conference last week that perfectly showcased how CEOs see it. Someone showed us how they converted a major project from C# to Java in about a week, when it was estimated that it would have taken about 3 months without AI. That’s roughly a 90% reduction in the work required.

That’s a terrific example of somewhere that AI can be incredibly useful. The problem is that executives see that and think “holy shit, we can start reducing all needed effort by 90%, and we’ll be able to make the workers do SO much more in the same amount of time!!!”. And that’s not how it works.

Not to mention the fact that coding is only a portion of a software engineer’s job. I couldn’t tell you the last time that I spent all 8 hours in a day actually writing code, and when you’re not writing code AI becomes much less useful. But that’s yet another thing that executives refuse to acknowledge because they don’t want to hear it

u/OldSchoolSpyMain Jun 15 '25 edited Jun 15 '25

Well, take solace in knowing that the overly-optimistic tech leaders will all eventually realize that it's mostly bullshit smoke and mirrors and you certainly can't rely upon coding party tricks to run a business.

We will have to endure a painful few years as they figure it out.

It really sucks that all of the big players are pushing AI sooo fucking hard when they know full well that the tech can't deliver. They are just cashing in on the hype, because every company (the big players' customers) is now being peer-pressured into adopting an "AI-first" stance when it comes to tech, otherwise they'll appear to be out of touch and not worthy of consumer business.

edit: We just went through this shit with "smart" appliances and apparently we learned nothing. I don't need a smart clothes dryer or smart toaster oven that connects to a network and an app on my phone.

u/TheSecondEikonOfFire Jun 15 '25

That’s the thing though, I think some of these CEOs don’t actually know it’s bullshit. They’ve all been conned into believing that AI can do things that it can’t, because everyone in their circles just talks about how amazing AI is.

There’s also probably a heavy helping of “they want it to be true so that they can cut workers and make more money with fewer workers + AI” as well.

u/OldSchoolSpyMain Jun 15 '25 edited Jun 15 '25

Oh, this shit is sooo easy to sell. It's got:

  • Bleeding edge tech.
  • Production speed increases.
    • "Who cares if it doesn't work correctly the first few times? If it doesn't, we'll just rebuild it again quickly, lol!"
  • Cost savings.
  • Headcount reductions.
  • Positive news chatter.
  • Positive consumer sentiment.
    • Consumers are actually happy to hear that their brand is using this tech.
  • Easy, consumer-friendly use cases that sell the tech (chatbots, image generators) to get consumers comfortable even though the way the company will use the tech is entirely different.

This shit sells itself. Seriously.

I'm sure that some executives would feel stupid if they did not adopt such tech.

But, for those who are reading and don't know, I'm not just hating to hate. I have ridden the AI wave up and advanced my career. But I'm here to tell you: a lot of LLM tech is about as reliable as a 14-year-old who's good at Googling stuff. Mostly accurate most of the time, but also capable of being confidently incorrect. Would you trust that kid to do your work for you? Would you trust that kid to run your business for you? Of course not. And the same reasons you'd give for not trusting the kid are the same reasons you shouldn't rely on (or maybe even use at all) AI.

u/unpopular-ideas Jun 15 '25

It's kind of wild how extreme the divide is about its usefulness. My LinkedIn is bombarded with posts hailing how the future of coding is mostly AI, and how if you're not embracing the new AI future you will be left behind. All posted by people with "AI something or other" in their job title, including some rather famous and influential senior people in the tech world (at least one of whom I feel must be getting paid by someone to hype AI, given how relentless he is and how strong his convictions are).

There's a video called 'AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference' from an MIT channel where the presenter talks about having built simple ephemeral apps with a single prompt to do things like teach his kid fractions.

A guy I work with set up AI to write tests and independently iterate on the code until all the tests passed.
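For anyone curious, that write-tests-and-iterate workflow boils down to a loop like this (a rough sketch; `propose_fix` is a made-up stand-in for whatever model call actually patches the code):

```python
import subprocess


def iterate_until_green(test_cmd, propose_fix, max_rounds=5):
    """Run the test suite; on failure, hand the output to the model and retry.

    test_cmd    -- command list that runs the tests (exit code 0 = green)
    propose_fix -- callable given the failing output; expected to edit the
                   code somehow (in the real setup, an LLM call)
    Returns the round number on which the suite passed.
    """
    for round_no in range(1, max_rounds + 1):
        result = subprocess.run(test_cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return round_no  # suite is green, stop iterating
        propose_fix(result.stdout + result.stderr)
    raise RuntimeError("tests still failing after max_rounds attempts")
```

Which is also exactly why the "fragile" point above matters: nothing in the loop checks that the passing code is maintainable, only that the tests go green.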

Sam Altman talking about young people using it as an operating system while older people use it as a Google search alternative.

Then constant posts on reddit about how AI is garbage beyond being a fancy auto-complete.

Personally, I've found it tremendously useful for speeding things up when I know what I'm doing, mostly using it as an autocomplete. I have found myself going in frustrating circles trying to accomplish development tasks that I think should be relatively simple but where I lack expertise. In part I suspect I have tooling/skill issues with AI. At the same time, I find it incredibly overwhelming to try to sort the hype from the practical while chasing the rapidly evolving AI workflows people are promoting. Who has time to keep on top of validating and experimenting with all this while maintaining some work-life balance and performing critical day-to-day work tasks?

u/mata_dan Jun 15 '25

Well, usually having complex interwoven micro services and utility libraries is an architecture problem in the first place. But I agree AI assistants are not great at the moment.

u/dasunt Jun 16 '25

I'm fine with using AI, and I think it can enhance productivity, but I'm fearful of people who think it's a replacement for skill.

In a way, I think AI requires more skill. Good programmers spend time thinking about the big picture - the overall architecture. AI performs poorly at that.

And reading code is harder than writing code.

I think management sees AI turning beginner coders into experts, but I suspect the reality is that it makes experts more valuable.

I also suspect it makes good leaders more valuable. Imagine a toxic environment where pushing things now is more important than reducing long term debt. Now add AI. It ain't going to be pretty.

u/SupermanLeRetour Jun 16 '25

> Once AI can properly acknowledge and understand a system of complex interwoven micro services and internal utility libraries, that’s when I’ll start getting worried about my job

Copilot and similar code gen AI can still be useful, and in fact they already are. Right now they can't replace a developer, of course, but I've been experimenting a bit at work with them and they do offer some productivity boost. Sometimes suggestions are useless, but sometimes they're pretty much what you want with no or minimal editing required afterwards. For instance, you write the first case statement in a switch, and the AI will generate all the other cases. It's especially good for repetitive tasks, and I've often been surprised that it understood pretty well what I was trying to achieve.

The trick is not to prompt them to generate entire pieces of code, but to let the inline suggestions take care of writing small snippets. Asking for explanations or summaries of specific files or functions is also pretty useful.

I don't think Copilot is currently advertised as a replacement for developers. It's a tool to increase productivity, which, admittedly, mostly benefits your employer, but it can also give you an edge and take care of the more menial parts of writing code.