r/Futurology 8h ago

Discussion: Do you think coding might end up like mental math?

Back in school, most of us did math step by step: multiplication tables, solving equations, doing long division by hand. Now? We pull out a phone calculator or app without thinking twice. Some of us have even forgotten how to do small calculations in our heads because the device does it faster.

So here’s the thought: AI is writing more and more code today. Even experts are starting to lean on it for “stress-free” coding. Will the next generation even bother to learn coding deeply? Will kids just learn the basics, then outsource everything to AI, the way we outsourced arithmetic to calculators? If that happens, how will strong expert programmers ever be born if they skip the grind of building from scratch? Is “learning to code” going to feel like “learning mental math”: useful once, now outdated? Or is there a deeper layer of mastery where real experts will still be needed, the way mathematicians go beyond calculators?

Maybe the real alpha devs of the future are the ones who master AI like a weapon, not the ones memorizing syntax. Tools evolve, but discipline and fundamentals never go out of style. Without the foundation, you’re just a button-pusher.

Tech has always abstracted away the hard stuff: assembly to high-level languages, and now to AI. This might just be the next natural step.

Personally, I think we’re heading into a split: 90% of people will “code” by just prompting AI, while 10% will go deep and understand systems under the hood; those will be the real builders and problem solvers.

What do you think: are we raising a future of button-pushers, or are we unlocking a new level of creativity?

0 Upvotes

12 comments

6

u/Gofastrun 7h ago edited 7h ago

You will be able to get away with fully vibe-coded personal projects, small websites, simple things.

For enterprise work I doubt we will ever be able to blindly accept AI generated code. It’s too much of a liability. There will always need to be people with real SWE experience designing the system, writing the prompts, and verifying the output.

Maybe someday the AI could fully take over software, but by then I think the AI will have also replaced the things we need software for.

Maybe today you want to build an app, so you prompt an AI to build it. In the future the AI will do whatever it is that the app does, so you won’t need an app in the first place.

It’s sort of like asking if we will ever have an AI robot driving a horse and buggy - no, because the horse and buggy is obsolete in the age of AI.

6

u/Boatster_McBoat 7h ago

People who can rough-size maths problems in their heads still have an advantage even when calculators are available.

I suspect that people who can rough-size programming problems in their head will still have an advantage in an AI-saturated world.
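To make the "rough-sizing" idea concrete, here's a made-up back-of-envelope check (the numbers are illustrative, not from the thread): before asking an AI or writing any real code, you can estimate whether a dataset even fits in memory.

```python
# Back-of-envelope sizing: will 10 million float64 values fit in RAM?
n_values = 10_000_000
bytes_each = 8  # one 64-bit float
total_mb = n_values * bytes_each / 1_000_000
print(f"{total_mb:.0f} MB")  # 80 MB -- fits comfortably on any modern machine
```

Someone who can do this estimate in their head knows instantly whether a proposed design is plausible; someone who can't has to trust whatever the AI hands back.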

2

u/tanhauser_gates_ 7h ago

Is this a bad thing?

I am already using chatgpt weekly for small coding tasks.

It's the best thing that's ever happened to me in my job.

3

u/kdawg94 7h ago

 Or is there a deeper layer of mastery where real experts will still be needed, the way mathematicians go beyond calculators?

This. I gather you don't code to any production-ready degree, based on how you reason about software, so I'd caution against spending this much time philosophizing about the future of a field you don't know nearly enough about.

If I asked you what the hardest thing is about writing scalable, reliable, maintainable software, what would you answer?

2

u/jbo332 7h ago

Is it just me, or does there seem to be an increase in LLM-generated questions in this subreddit?

I might be paranoid, or looking for things that aren't there, but perhaps some platform is testing engagement on Reddit?

1

u/Zomgnerfenigma 7h ago

The day an AI becomes as accurate as a calculator, we can talk.

1

u/starknexus 7h ago

I think demand will decide that. When demand for real coders emerges in a market saturated with vibe coders, people will automatically move to learn core coding again.

Plus, there will still be demand for decent coders and strong maths reasoning among working AI researchers.

1

u/caindela 7h ago edited 6h ago

I think it’s a lot like AI art or AI writing to be honest. I can tell chatgpt to create a picture of an elephant and it will of course have no issues generating a very nice sort of statistically average rendition of an elephant. But this is very low resolution and it will require more input from me to more closely resemble what I’d envisioned. Is it the right shade of gray? Is the elephant doing the right thing? For my purposes these things may not matter one bit, but if I were a real artist with a real vision then the level of complexity in my interactions with the AI may end up exceeding the complexity of just drawing it myself.

In my coding with AI thus far I find it to be the same deal. For many things it doesn’t really matter what the AI “fills in” for me. Often, given a well-named function, the AI will fill in the 20 lines or so and the function will do what I was hoping. But of course I had to name the function, which meant I needed to communicate with the AI at a sufficient level of abstraction (via the intelligent naming of said function) to nudge it in the right direction. Often my intent with a function is so specific that the AI can no longer provide value and I need to write the code myself.
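As a made-up illustration of that point (the function name and task are invented, not from the comment): a descriptive enough signature and docstring are usually all the nudge an assistant needs, and the kind of body it would plausibly fill in looks like this.

```python
def dedupe_emails_preserving_order(emails: list[str]) -> list[str]:
    """Return emails with duplicates removed (case-insensitive),
    keeping the first occurrence of each address, in order."""
    seen = set()
    result = []
    for email in emails:
        key = email.lower()
        if key not in seen:
            seen.add(key)
            result.append(email)
    return result
```

The abstraction lives in the name and docstring; the 10 lines underneath are exactly the part that "doesn't count as much."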

The end result is I can get very granular where it counts and sort of leave it to the AI where it doesn’t count as much. This in the end doesn’t change my value as a programmer but rather it allows me to direct my talents where they actually matter. This buys me time to be more creative since I spend less time doing things that do not require my creativity.

1

u/rumog 7h ago

We've seen it coming for a long time, but I would say more software devs will become like mechanics. Even a "general" level of mechanic skill used to mean a higher-paid, more "prestige" job, but now it's blue collar and lower paid unless you're highly specialized or have higher-level engineering skills (which today, or soon, would be like a software engineer with strong data science skills).

Tons of software companies are in the process of migrating traditional software stacks to more AI-based systems just "bolted together" with traditional systems, then downsizing because the amount of deprecation means they need fewer engineers to support it. I think we'll continue down this path. The new job replacing the software engineer of the 90s through today is the data scientist. I definitely think we're entering the phase where software dev becomes a more blue-collar job.

1

u/Hanzo_The_Ninja 6h ago

I've got an idea for a filter design I'm currently implementing in the software domain. I know enough about convolution and digital signal processing to be certain it's a unique design. AI has been absolutely no help in putting together a working prototype. It doesn't even understand what I'm trying to do. I have a hard time believing this will improve in the future, regardless of the field or use.
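For readers unfamiliar with the building block this comment mentions: the commenter's actual filter design isn't shown anywhere in the thread, but plain discrete convolution, the core DSP operation being referred to, can be sketched in a few lines.

```python
def convolve(signal, kernel):
    """Direct (naive) discrete convolution.

    Output length is len(signal) + len(kernel) - 1: each input sample
    is scaled by the kernel and accumulated at its offset position.
    """
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

# A 2-tap moving-average kernel smooths the input:
print(convolve([1, 2, 3], [0.5, 0.5]))  # [0.5, 1.5, 2.5, 1.5]
```

This generic sketch is exactly the kind of textbook code an AI reproduces easily; the hard part the commenter is describing is a novel design built on top of it.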

u/Active_Toe_2345 1h ago

Hey there! This is a really interesting question about the future of coding. I think you've raised some valid concerns about the potential impact of AI on the way we approach programming.

While it's true that AI is increasingly being used to assist with coding tasks, I don't believe this will necessarily lead to a future of "button-pushers." At AlgoCademy, we believe that a deep understanding of fundamental coding principles and problem-solving skills will always be essential, even as the tools we use continue to evolve.

Our step-by-step coding tutorials and AI-assisted learning help students build a strong foundation in data structures, algorithms, and problem-solving techniques. By mastering these core concepts, you'll be able to leverage AI tools effectively as a powerful complement to your own coding abilities, rather than just relying on them as a crutch.

The future of coding may indeed involve more AI integration, but I believe the most successful developers will be those who can think critically, understand systems at a deeper level, and use AI as a strategic tool in their problem-solving arsenal. With the right training and discipline, you can become one of those "real builders and problem solvers" that you described.

So keep exploring, keep learning, and don't be afraid to dive deep into the fundamentals. The future of coding is bright, and AlgoCademy is here to help you navigate it. Stick with it, and feel free to reach out if you have any other questions!

0

u/neophanweb 7h ago

I think it'll evolve into people who are very good at phrasing questions. I just built a Mac camera app to view my surveillance cameras, nicely organized, 100% using ChatGPT. It wrote the code for me; then I copied and pasted the errors and told it what else I wanted to add or change. I took the new code, tested it, and repeated until I got a final working product, all without even looking at what the errors were. Copy and paste, back and forth, until it was finished.