I have been using GitHub Copilot as part of my daily job for over a year. TL;DR: in three years I expect some improvement, but not beyond simple functions.
Right now, Copilot is most useful for converting data between formats and writing out boilerplate. It is surprising how often in software development (especially server-side) you need to change the shape of data. Essentially, as long as there are established patterns, it is helpful; however, do not expect it to write software for you any time soon.
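To make the data-shaping point concrete, here is the sort of boilerplate I mean (a made-up sketch; the field names and the `rows_to_response` helper are invented for illustration, not from any real codebase):

```python
# Hypothetical shape-conversion boilerplate: turning flat database rows
# into the nested structure an API response expects. This is the kind of
# mechanical, pattern-heavy code that autocomplete handles well.

def rows_to_response(rows):
    """Group flat user rows by account and nest the users under each account."""
    accounts = {}
    for row in rows:
        acct = accounts.setdefault(
            row["account_id"],
            {"account_id": row["account_id"], "users": []},
        )
        acct["users"].append({"name": row["user_name"], "email": row["email"]})
    return list(accounts.values())

rows = [
    {"account_id": 1, "user_name": "Ada", "email": "ada@example.com"},
    {"account_id": 1, "user_name": "Bob", "email": "bob@example.com"},
    {"account_id": 2, "user_name": "Cy", "email": "cy@example.com"},
]
print(rows_to_response(rows))
```

Nothing clever is happening here, which is exactly why a model trained on established patterns can fill most of it in.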
So far these systems are still fairly narrowly scoped. Copilot can write a simple function, but I haven't seen it create abstractions. It doesn't really have an understanding of the code, and even now it isn't very good at matching parentheses or brackets.
Now, I don't expect the Copilot of three years from now to put me out of a job, but I do expect it will do more of the typing for me. I think I'm still going to have to convert business decisions into the right abstractions, but I hope I'll be writing fewer tests by hand.
Until then, it’ll continue writing plausible nonsense, which sometimes happens to be useful.
The flurry of news about ChatGPT (and my recent experimentation with GitHub Copilot) has got me thinking about the effect natural language models are having on software engineering. AI-assisted development is improving at a blistering pace, and not just the models themselves – people are figuring out ways to use AI for everything from explaining exceptions to writing terminal commands.
I want to hear people's thoughts on where this is going. What might software development actually look like in 3 years, concretely? Who knows, maybe it will be replaced by prompt engineering. And will this decrease the demand for technology workers, or increase it? AI capabilities are difficult to predict, so I'm also curious about what devs should start learning to stay ahead of automation.