I expect there to be a delay in deployment, but I think OpenAI is ultimately aiming, as a near-term goal, to automate the intellectually difficult portions of computer programming. Personally, as someone just getting into the tech industry, this is basically my biggest near-term concern, besides death. At what point might it be viable for most people to do most of what a skilled computer programmer does with the help of a large language model, and how much should this hurt salaries and career expectations?
Some thoughts:
- It will probably be less difficult to safely prompt a language model for an individual "LeetCode" function than to write that function by hand within the next two years (a rough sketch of what that might look like follows this list). Many more people will be able to do the former than could ever do the latter.
- Yes, reducing the price of software engineering means more software engineering will be done, but it would be extremely odd if that meant software engineer salaries stayed the same, and I expect regulatory barriers to limit how much the software industry can grow into new niches.
- Architecture seems difficult to automate with large language models, but a ridiculous architecture might be tolerable in some circumstances if your programmers are producing code at the speed GPT-4 does.
- Creativity is hard to test, and if former programmers are mostly hired going forward based on their ability to "innovate" or on interesting psychological characteristics beyond being able to generate code, I expect income and jobs to shift away from people with skills but no credentials toward people with lots of credentials and political acumen and no skills.
- At some point programmers will be sufficiently automated away that the singularity is here. This is not necessarily a comforting thought.
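For concreteness, here's a minimal sketch of what "prompting a language model for a LeetCode function" might look like, assuming the OpenAI Python client; the model name, the prompt, and the two_sum example are purely illustrative, not a prediction about which tools will win:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

prompt = (
    "Write a Python function two_sum(nums, target) that returns the indices "
    "of the two numbers in nums that add up to target. Include a docstring "
    "and a couple of example calls."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model name; use whatever is current
    messages=[{"role": "user", "content": prompt}],
)

# The generated code still needs a human to review and test it, which is
# part of why "safely prompt" is doing a lot of work in the bullet above.
print(response.choices[0].message.content)
```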
Edit: Many answers contest the basic premise of the old title, "When will computer programming become an unskilled job?" The title of the post has been updated accordingly.
The most accurate answer is also the least helpful: none of us really know. Guido van Rossum has an opinion about GitHub Copilot in this interview:
but he's really just talking about what LLMs can do now, not what they'll be able to do in five or ten years.
Chris Lattner has an opinion about Software 2.0 here:
but Software 2.0 isn't really the same thing as what you're asking about. More info about Software 2.0 here:
and if you watch Chris Lattner and Lex talk for a little while longer, you'll see that Chris has no idea how you could tell a computer to build you a webpage with a red button using just text, and he admits that it's outside his area of expertise.
I bring up these examples mostly to illustrate that nobody has any clue. Sam Altman addresses the topic the most out of all the people I've linked, in this video:
and the TLDR is that Lex and Sam both think LLMs can make programmers 10x more productive. Sam also thinks that instead of hiring 1/10th the number of programmers, we'll just end up with 10x more code; he thinks there's a "supply problem", i.e., not enough software engineers to go around.
One thing I would advise is to make yourself more than just a software engineer. Lex says in his talk with Sam that he's not worried because he's an AI guy, not just a programmer. You might want to learn more about how AI works and try to get a job in the space and ride the wave; or learn about information security in addition to software engineering (that's what I'm doing, in no small part because of the influence of a one-on-one chat with 80,000 Hours); or learn a lot about oceanography or data science or something else in addition to software engineering.
Then I'd also just say that we have no idea, and if anyone says they know, they really don't: look, a bunch of smart people have discussed it and they have no clue either.