More Developers Not Less: The Future of Coding and GenAI

Intro

If you believe the prevailing sentiment on your LinkedIn feed, GenAI is the beginning of the end for developers and coding.

Since the GenAI revolution, I have been fielding questions like "What does this mean for the future of coders?", "What would you recommend someone studies at university now that coding is dead?", or "Where do you see software development in 10 years?"

Taking a cautious approach to these questions, I usually answer that developers will see an increase in productivity, as GenAI tooling raises the amount of code they can output.

I then envision this progressing to the automation of simpler coding tasks, such as testing, until junior dev tasks are eventually handled entirely by GenAI.

This, in turn, will lead to all keystrokes being made by GenAI, with developers becoming conductors of the orchestra - coordinating GenAI agents against business requirements to release features and create new apps faster than ever before.

And if we look at the landscape today, we are getting there. The release of Kiro by AWS, Claude Code, and the continued development of products like Cursor have led to more GenAI-generated code and increased developer productivity. Vibe coding is taking over as ideas are tested faster than ever before.

This then leads to questions such as "Should I study computer science?", "Is coding dead?", and "What does the future of the software development industry look like?"

In fact, I saw an article on LinkedIn arguing that acquiring a trade would lead to higher income long term than studying computer science.

Perhaps that will prove true. But I don't see it that way. In fact, I see a future where we have more developers than ever before.

To understand this perspective, there are a couple of things we need to consider:

  1. What have been the trends in software development human resources over time?
  2. What does our future world view look like?

Software Development and Human Resource Trends Over Time

Like many people starting out in their career, you gravitate toward someone older on your team - someone you can learn from, who usually has a far more cynical view of the industry after years of experience.

The same held true for me.

I met M whilst I was contracting in Ireland. By the time I met him, M had 30 years of experience in computing and had evolved with the great shifts in technology that many people struggle to keep up with - from punch cards, through the takeover of assembly language, to the arrival of high-level languages like C.

When I met M, he was happily coding - anything but Python was his passion. He could still whip out 8-bit binary and hexadecimal from time to time, like when we were building a data pipeline and the data just wasn't playing ball with the encoding.
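As a flavour of what that looks like in practice, here is a minimal sketch of that kind of byte-level detective work, assuming a feed that claims to be UTF-8 but actually contains Latin-1 bytes (the sample bytes are invented for illustration):

```python
# Sketch: diagnosing a misbehaving text feed by inspecting the raw bytes in hex.
# The sample bytes are invented; a real pipeline would read them from a file or stream.
raw = b"caf\xe9"  # intended to be "café", but encoded as Latin-1 rather than UTF-8

# Dump each byte in hex to see exactly what is on the wire.
print(" ".join(f"{b:02x}" for b in raw))  # 63 61 66 e9

try:
    raw.decode("utf-8")
except UnicodeDecodeError as err:
    print(f"UTF-8 decode fails: {err}")

# 0xE9 is "é" in Latin-1, so decoding with the right codec recovers the text.
print(raw.decode("latin-1"))  # café
```

Nothing magical - but you only think to look at the bytes if you know the bytes are there to look at.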

And it was while thinking about M's journey through his software development career, alongside the seismic changes in technology, that I came to the conclusion that we will see more developers, not fewer, as GenAI capabilities increase.

Let me explain.

When software development started, punch cards were fed into machines that carried out computations. The number of developers was tiny in comparison to the workforce, and the computers were huge - taking up rooms and rooms within buildings while having less computing power than the average mobile phone today.

M had started as a programmer on these old IBM punch card machines. In fact, M used to tell me that the first thing they had to do on a Monday morning was stand the computers back upright after they had toppled against the wall over the weekend while carrying out long-running tasks. No danger of that happening to your iPhone.

As technology progressed, assembly language came to the forefront: developers could now program computers with written instructions rather than punch cards. M witnessed this transition firsthand and wrestled with the change. Instead of punching physical cards, you could write instructions using mnemonics like "MOV" and "ADD" - but you still needed to understand exactly how the machine worked at the hardware level.

Then we saw the emergence of high-level languages like Fortran and COBOL. These languages allowed developers to write instructions much closer to human language rather than machine code or assembly syntax. Suddenly, you didn't need to understand exactly how every processor instruction worked - you could write "ADD A TO B" instead of manipulating memory addresses directly or creating complex assembly routines. This opened up programming to more people and we saw our first significant increase in the number of developers.
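To make that jump concrete, here's a rough sketch of the same addition at each rung of the ladder (the assembly is illustrative x86-style pseudocode rather than any one machine's exact dialect):

```python
# The same computation at three levels of abstraction.
#
# Assembly (illustrative x86-style pseudocode):
#     MOV AX, a      ; load a into a register
#     ADD AX, b      ; add b to the register
#     MOV result, AX ; store the register back to memory
#
# COBOL:
#     ADD A TO B GIVING RESULT.
#
# A modern high-level language (Python):
a, b = 2, 3
result = a + b  # registers and memory movement are handled for us
print(result)   # 6
```

Each rung hides the machinery of the rung below it - and each time that happened, more people could climb on.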

Then came the personal computer revolution and modern languages like C, and later Java and Python. But perhaps more importantly, we saw the rise of Integrated Development Environments (IDEs). These tools provided syntax highlighting, debugging capabilities, and automated many of the tedious tasks that developers had to do manually. What once required deep system knowledge could now be done with point-and-click interfaces and automated assistance.

Each step in this progression - assembly to high-level languages to IDEs - made programming accessible to a broader audience. Each generation of tools abstracted away complexity and let developers focus on solving problems rather than wrestling with the underlying technology.

This is all well and good - but doesn't GenAI spell the end to this trend?

No, not exactly.

GenAI represents the next step in this progression - assembly → high-level languages → IDEs → GenAI. And this time, the programming language of choice is English.

Computing is now more accessible than it has ever been, with most adults carrying around a handheld computer at all times, and GenAI opens up the world of development to even more people, as they can now code in English. The skill now lies in crafting the prompt so that the model's inference matches your intent and you get the output you wanted.
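To illustrate what that skill looks like, here is an invented before-and-after - both prompts are hypothetical, but the gap between them is where the craft lives:

```python
# Sketch: when English is the programming language, precision in the prompt
# does the work that precise syntax used to do. Both prompts are invented examples.
vague_prompt = "Write some code to process my data."

precise_prompt = """
Write a Python function clean_orders(rows: list[dict]) -> list[dict] that:
- drops rows where 'order_id' is missing or empty,
- parses 'order_date' strings (ISO 8601) into datetime.date objects,
- returns the rows sorted by 'order_date' ascending.
Include type hints and a docstring.
"""
```

The first prompt leaves the model to guess at the requirements; the second reads like a specification - which is exactly the kind of clear thinking good developers have always done.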

Does this mean I should no longer learn coding in the traditional sense?

Software and the Future World View

Take a minute and imagine the future. What comes to mind? What do you think the world will look like in 30 years' time? 50? 100? What visions are conjured up?

If we look at sci-fi, we will be in an ever more digital world. AI assistants that carry out our every wish. Holodecks for recreational use. Smart cities that adapt to our needs in real time. Autonomous everything - from cars to entire supply chains.

And what's preventing that from happening as of today?

Well, one of those factors is the rate at which we can produce software without sacrificing its quality.

Creating good software costs money. It takes time. It requires skilled developers who understand not just how to write code, but how to architect systems that scale, how to debug complex problems, and how to build software that actual humans want to use.

But here's where GenAI and coding in English changes everything.

When we remove the barrier of traditional programming languages and open up development to natural language, we don't just make existing developers more productive - we unlock an entirely new workforce.

Suddenly, domain experts can directly translate their knowledge into working software. Business analysts can prototype solutions in real-time. Product managers can iterate on ideas without waiting for development cycles.

This opens the door for more software developers creating better code and leading to an ever more advanced digital age. We're not replacing developers - we're multiplying them exponentially.

Think about it this way: every person who has ever had an idea for an app, a tool, or a solution but couldn't code it themselves can now potentially bring that vision to life. The bottleneck shifts from "can you code?" to "can you think clearly about what you want to build?"

But here's the crucial point - and this ties back to M from the beginning of our story.

Knowing the fundamentals of coding is still essential to being a really good developer in this new age. If you understand first principles - how memory works, what makes code efficient, why certain architectural patterns exist - then you can get to the outcome you're looking for far more quickly.

M could debug that data pipeline encoding issue precisely because he understood the underlying systems. He knew why the data was misbehaving because he had worked with binary and hexadecimal for decades. That foundational knowledge didn't become obsolete when high-level languages emerged - it became his superpower.

The same principle applies today. The developers who thrive in the GenAI era won't be those who blindly accept whatever the AI generates. They'll be the ones who can quickly spot when something doesn't look right, who understand why the AI suggested one approach over another, and who can guide the AI toward better solutions.

So should you study computer science? Absolutely. Should you learn to code the traditional way? Without question.

Because in a world where everyone can generate code in English, the people who truly understand what that code is doing - who can optimize it, debug it, and architect it properly - will be more valuable than ever.

We're not heading toward fewer developers. We're heading toward a world where the best developers become force multipliers for entire teams of AI-assisted creators. And there's never been a more exciting time to be part of that future.