https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/
GitHub Copilot is powered by OpenAI Codex, a new AI system created by OpenAI. OpenAI Codex has broad knowledge of how people use code and is significantly more capable than GPT-3 in code generation, in part, because it was trained on a data set that includes a much larger concentration of public source code.
Will Copilot or similar systems become ubiquitous in the next few years? Will they increase the speed of software development or AI research? Will they change the skills necessary for software development?
Is this the first big commercial application of the techniques that produced GPT-3?
For anyone who's used Copilot, what was your experience like?
This will probably make the already-bad computer security/infosec situation significantly worse. In general, people pasting snippets they don't understand is bad news; but at least in the case of StackOverflow, there's voting and comments, which can catch the most egregious stuff. On the other hand, consider the headline snippet on Copilot's landing page:
Everything below the function prototype was AI-generated. And this code has TWO security vulnerabilities! First, it uses an http URL rather than https. Second, if the input string contains newlines, it can put values into fields other than the ones intended. In this specific case, with the text-processing.com sentiment analysis API demo, there's not much you can do with that (the only documented field available other than `text` is `language`), but if this is representative, then we are probably in for some bad times.

This part I think is not quite right. The counterfactual jim gives for Copilot isn't manual programming, it's StackOverflow. The argument is then: right now StackOverflow has better methods for promoting secure code than Copilot does, so Copilot will make the security situation worse insofar as it displaces SO.
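The field-injection failure mode jim describes (user input interpolated, unescaped, into a form-encoded request body) can be sketched roughly like this. This is a hypothetical Python reconstruction of the general bug class, not the actual Copilot snippet; the function names are made up for illustration:

```python
from urllib.parse import urlencode

def build_body(text: str) -> str:
    # Vulnerable pattern: raw string interpolation. Separator characters
    # in `text` (e.g. "&" or a newline, depending on the body format)
    # let a caller smuggle in extra fields like `language`.
    return f"text={text}"

def build_body_safe(text: str) -> str:
    # Safer pattern: percent-escape the value, so "&", "=", and newlines
    # survive only as data, never as field separators.
    return urlencode({"text": text})

malicious = "great product&language=dutch"
print(build_body(malicious))       # text=great product&language=dutch  (injected field)
print(build_body_safe(malicious))  # text=great+product%26language%3Ddutch
```

The fix is mechanical, which is part of the worry: nothing in the generated snippet signals to a reader who doesn't already know the escaping rules that anything is wrong.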