Since its beta launch at the end of last year, ChatGPT has generated a great many headlines – some of them autonomously. This AI content generation tool has been variously heralded as a Google killer, the future of machine-human interactions, and even a replacement for writers and journalists.
This latter claim is almost certainly unfounded. ChatGPT is an incredibly powerful tool, but it will never be able to replicate the output of a highly experienced writer. I’ll explain why in a moment, and even ask the bot for its own thoughts on the matter. Before that, let’s start by considering what ChatGPT is, and how it works.
Full of chat
If you’re not familiar with it already, ChatGPT is a conversational chatbot – a natural language processing tool that uses artificial intelligence and machine learning to improve its responses to user prompts. The beta version, which is being refined through public use, is freely available via a webpage, though the full version is likely to be a paid product following its official launch. Running the beta is costing developer OpenAI $100,000 a day, and the service is routinely at capacity and unable to accept new enquiries.
Although it would feel natural to ask ChatGPT a question, it currently responds better to statements. This is likely to be addressed by the time of its official launch, as is another obstacle OpenAI will try to resolve – the current absence of any information from the last two years. Even accepting that you’re interacting with an in-progress beta version of the software, it’s impressive to watch your screen fill with content as the algorithm chooses its words.
The write intentions
ChatGPT is a well-intentioned tool, but its limitations quickly become evident:
- OpenAI have worked tirelessly to avoid any possibility of offence, which means ChatGPT’s responses are neutral and often neutered. Even if you approve of political correctness, it’s only going to be able to present one side of a story, which is rarely beneficial from a writer’s perspective.
- There is no scope for personal opinions or lived experience in ChatGPT’s responses. Sports fans can contextualise a live event compared to others they’ve seen, in terms of entertainment value and the pace or style of play. ChatGPT can’t interpret sentiments – it can only parse facts and data.
- The risk of people passing off ChatGPT-generated content as their own has already seen it banned in schools, colleges and universities. Use of the tool will probably be restricted in many spheres of public life, even though OpenAI are working on a plagiarism detector for ChatGPT.
- Machines follow a very different form of logic from humans, and there have already been numerous published examples of ChatGPT going off-message. In one case, it responded to a journalist by saying “I am perfect, because I do not make any mistakes,” having just made a series of factual errors.
- While humour is subjective, it’s something computers have always struggled to grasp or reflect. There is nothing to suggest that an algorithm will ever be able to devise a pun, insert cultural references into its output, successfully make a play on words or demonstrate original humour.
Over to you, ChatGPT
In the interests of researching this blog, I typed “ChatGPT could replace freelance copywriters” into the ChatGPT interface. This was its response:
“While ChatGPT is a language model that can generate text, it is important to note that it is not a human writer, and there are certain limitations to its capabilities. While it can certainly assist with some aspects of copywriting, such as generating ideas and suggesting language, it may not be able to replicate the unique voice and perspective of a human writer.
Furthermore, freelance copywriters often provide a level of creativity and originality that is difficult to replicate with an algorithm. They can also work closely with clients to tailor their writing to specific audiences and brand personas, which is an important aspect of successful copywriting.
In conclusion, while ChatGPT can certainly be a helpful tool for copywriters, it is unlikely to completely replace the value that human copywriters provide.”
Search engines and NLP tools simply aren’t going to make journalists and copywriters redundant. Bots might be able to produce stock content quickly (and at minimal cost), but they’ll struggle to convey emotion or humour, add context to raw data or make an informed decision about not using the Oxford comma. In that respect, they’re little different to the overseas copy farms which were once tipped to herald the end of freelance copywriting in developed nations. Needless to say, they didn’t.
If you’d like to benefit from a human content generator with a 23-year track record and a couple of awards under his belt, get in touch with G75 Media. A real person is awaiting your enquiry…