What OpenAI's GPT-3 means for developer jobs
Written on 2nd Aug 2020
If you have been on Twitter, or anywhere on the internet for that matter, over the last few weeks, you will have come across the cryptic abbreviation "GPT-3", which stands for "Generative Pre-Training (of language models)", version 3. While its predecessors GPT and GPT-2 were far less capable and therefore less impressive, version 3 has caught on in the development world. You might have seen examples where GPT-3 generates React components or SQL queries from nothing but a plain-text prompt. Developers fear for their jobs, but should they?
While the examples mentioned above are impressive, there is a difference between these demos and the reality of using an AI model to build complex applications. Models such as GPT-3 tend to misinterpret user input and can therefore head in a completely wrong direction very quickly.
Product owners will replace developers with AI
While this claim looks reasonable on the surface, it quickly becomes clear that a product owner or project manager will hardly be capable of using a tool like GPT-3 to replace developers. It is difficult enough at times for a developer to understand and work through product requirements set by non-technical product owners, so how should an AI algorithm decipher these requirements?
If you look closely at the examples given by OpenAI and the first public beta users, you will notice that the input needs to be written in a "sterile" syntax that is easy for a machine to parse. It personally reminds me of the Gherkin syntax used for BDD tests in certain environments. To give GPT-3 instructions it understands and executes correctly, you need to be a developer, or at least have a certain level of coding experience. GPT-3 therefore becomes just another layer of abstraction on top of programming languages. Given the way it appears to work at the moment, a certain level of standardisation in the input would also be required to make the output predictable.
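To illustrate what such "sterile", machine-friendly input might look like, here is a minimal sketch of a few-shot prompt in the style of the early English-to-SQL demos. The example pairs and the `build_prompt` helper are my own hypothetical constructions, not part of OpenAI's API — the point is only how rigid and pattern-like the instructions need to be.

```python
# Hypothetical helper that assembles a few-shot prompt for
# English-to-SQL generation, in the style of early GPT-3 demos.
# The model is expected to continue the final "SQL:" line.

EXAMPLES = [
    ("Fetch the names of all users.",
     "SELECT name FROM users;"),
    ("Count the orders placed in 2020.",
     "SELECT COUNT(*) FROM orders WHERE year = 2020;"),
]

def build_prompt(request: str) -> str:
    """Turn a plain-English request into a structured few-shot prompt."""
    lines = []
    for english, sql in EXAMPLES:
        lines.append(f"English: {english}")
        lines.append(f"SQL: {sql}")
    lines.append(f"English: {request}")
    lines.append("SQL:")  # left open for the model to complete
    return "\n".join(lines)

print(build_prompt("List all products that are out of stock."))
```

Writing example pairs like these is exactly the kind of work that already requires a developer's eye for structure and edge cases.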
GPT-3 is a fancy new interface for generating code (and of course other output; I am focusing on the implications for developers here, so I won't go into other use cases in depth). In my opinion, it will not replace developers anytime soon. Instead, it will become another tool we can use to generate code.
A great use case for GPT-3 would be generating a lot of the boilerplate code we have to write on a daily basis, in places where snippets and frameworks are not convenient because a more custom approach is required. Instead of fearing for our jobs, we should think of ways to use a technology like GPT-3 to our advantage.
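As a sketch of what "GPT-3 as a tool" could look like in practice, here is a hypothetical boilerplate generator. The `complete` parameter stands in for whatever completion backend is used and is stubbed out here so the example runs on its own; the function names are my own assumptions, not any real API.

```python
from typing import Callable

def generate_boilerplate(complete: Callable[[str], str], spec: str) -> str:
    """Ask a completion model for boilerplate matching a plain-text spec.

    `complete` is any prompt-in, text-out callable -- e.g. a thin
    wrapper around a hosted model. The result should still be
    reviewed by a developer before it lands in the codebase.
    """
    prompt = (
        "Generate only the code, with no explanation.\n"
        f"Task: {spec}\n"
        "Code:\n"
    )
    return complete(prompt).strip()

# Stub backend so the sketch runs without any external service.
def fake_model(prompt: str) -> str:
    return "class UserRepository:\n    pass\n"

print(generate_boilerplate(fake_model, "A repository class for User records."))
```

Framed this way, the model is a code-producing dependency that a developer integrates, prompts, and reviews — a productivity tool rather than a replacement.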
Lastly, product owners who dabble with GPT-3 will soon realise that it is not a replacement for developers, just as no-code solutions such as Webflow and Wix won't replace developers at companies with advanced requirements. My conclusion, therefore, is that our jobs are secure for the foreseeable future.