
The importance of editing AI content

Generative AI arrived in the public consciousness a couple of years ago with all the subtlety of a car bomb. Its exponential growth has been driven by three factors – AI companies’ desperation to grow at a quicker rate than their competitors, staggering sums of external investment, and a desire to be part of a rapidly evolving market which as yet has no boundaries or limits.

Along the way, generative AI has caused much mirth with its hallucinations and absurdities. It’s briefly titillated us on video platforms like the short-lived Sora, where everything you saw was entirely fictional. And it’s also upended the jobs market, with companies abolishing everything from entry-level roles to advanced positions previously requiring decades of experience. Some analysts predict a high percentage of the world’s jobs may be lost to AI, yet in contrast to previous technological revolutions, there seems little scope for new jobs to be created as old ones are rendered obsolete.

A copywriting canary in the coalmine

As a copywriter, I’ve been at the sharp end of this. In 2024, five of my clients ceased providing work to freelancers because of AI and search engine changes. In 2025, I lost another three. The impact on my bottom line has been dramatic. So when I was offered the chance to edit AI content for a major automotive client, it felt like a valuable opportunity to examine my ‘competition’.

There is no doubt that AI engines like Claude, Perplexity and Jasper can generate reams of highly technical copy far quicker than any writer. I’ve also acknowledged in a previous blog how effective Microsoft Copilot can be if it’s dealing purely with content inside your company’s data ecosystem. The problems come when AI engines start drawing information from the wider internet, with all its inconsistencies and inaccuracies, and presenting it as fact.

In the process of editing AI-generated articles, I’ve seen some truly bizarre issues arise:

  1. An AI engine kept trying to apply the same Q&A questions to different topics, leading it to pose increasingly absurd questions which it was then unable to answer.
  2. One AI-generated paragraph made perfect sense, right up until it hallucinated something that had never happened in the middle of a sentence.
  3. An AI article stated quite confidently that something which took place five years ago had first happened in autumn 2026.
  4. Another AI article made something up – but then included a link to an unrelated Wikipedia page, which made it look at first glance as though this statement had been corroborated.

I could give many more examples, but it might give away who I’m working for, so I will draw a discreet veil over my other findings.

Why does this matter?

Humans make mistakes, and even authoritative platforms like Wikipedia (a favoured resource of AI engines) contain mistakes. People aren’t infallible, so why should we expect machines to be?

The short answer is that machines should be logical, which hallucinations clearly aren’t.

The longer answer is that the engines present their content as authoritative, with no need for human input. Yet when a broadsheet newspaper makes a mistake, IPSO forces it to publish a correction. ChatGPT and its ilk have no reason to care if they make a mistake, but if you publish that content and get in trouble for it, it’s your reputation on the line. Don’t expect the AI company whose content was responsible to submit a witness statement on your behalf in court.

For this reason, sense-checking AI-generated content is essential before any business publishes anything on any public or client-facing portal. Human oversight from a qualified and experienced writer will also ensure a number of other benefits, beyond accuracy:

  1. Content can be restyled to match corporate house writing styles, from language and use of key terminology through to consistent punctuation and formatting.
  2. An editor can add in some of the things AI engines can’t generate – humour, personal opinion, emotion, lived experience.
  3. AI copy is prone to repetition, often making the same point two or three times with very similar (or even identical) wording. Human oversight will identify and remove unnecessary duplication.

It’s also worth considering that AI engines are engaged in plagiarism and copyright theft of digital content on a scale which makes Napster and its peer-to-peer ilk resemble a few people trading cassette tapes. Innumerable lawsuits are underway around the world, brought by enraged content creators ranging from writers and artists to film studios and media outlets. Legal wheels grind slowly, but these cases may prove to be cataclysmic for generative AI brands. Some firms are already hundreds of billions of dollars in the red due to start-up costs, plus the vast energy and environmental costs of running all the servers and processors needed to meet daily demand for their services.

The best things in life are rarely free

You might wonder why you should pay a writer to edit something AI has produced for free, but that’s the point. If you’re not paying for something, it’s unlikely to be high quality – and it won’t be free for much longer anyway. Even if generative AI companies successfully defend every legal case in the next ten years (which seems inconceivable), the incalculable sums of money being invested in generative AI tools will eventually need to be repaid as dividends or profits. The only realistic way for this colossal debt to be settled is if AI companies start charging for the use of their tools. And once you have to pay £1,000 a month for a subscription, it’d be cheaper to use a freelance writer.

In the meantime, if you’re thinking about commissioning AI content for your business, ensure it’s accurate and authentic by asking G75 Media to provide our award-winning editing services. Get in touch with us to discuss a quote.

How Microsoft Copilot can help small business owners

Microsoft’s track record with new technologies and product launches has been patchy over the last four decades. For every Windows XP or 11, there’s been a Vista or Windows 8. For every success like Word or the Edge browser, Microsoft has produced a HoloLens or a Windows Phone. And while Microsoft has shrewdly acquired some very influential companies – LinkedIn and Activision to name just two – it has also sunk huge sums into the likes of the Invoke smart speaker, Nokia Lumia smartphones and the Yammer social network tool.

When Microsoft unveiled Bing Chat two years ago, there was a muted response to this new generative AI chatbot, especially since Bing has always been a poor relation to the all-conquering Google search engine. However, having rebranded Chat as Microsoft Copilot, the software has evolved to be far more powerful and effective. It now offers genuine benefits to small business owners and entrepreneurs, so if you don’t yet appreciate Microsoft Copilot’s talents, it’s worth reading on…

Flying high

Microsoft Copilot is essentially an automated solutions platform which aims to resolve user queries. For instance, Microsoft 365 Copilot is integrated directly into the suite’s applications and utilities. If you want to find out how many fields in an Excel spreadsheet incorporate a particular term, you could type in “How many fields have Open status in column D?”. Within a few seconds, Copilot will create a formula which resolves this query, before displaying an Insert Cell button. Clicking it inserts the formula and instantly displays the number of fields marked as Open across that particular column. Equally, it can also adjust formatting, create graphs, identify trends and summarise data.
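To make the above concrete, the formula Copilot typically generates for a query like this is a simple COUNTIF. Here’s a minimal sketch of what that formula amounts to, mimicked in plain Python over a hypothetical status column (the data below is invented for illustration):

```python
# In Excel, Copilot would insert something like: =COUNTIF(D:D, "Open")
# This helper mirrors COUNTIF's case-insensitive text matching.

def countif(values, criterion):
    """Count the entries equal to the criterion, ignoring case and
    surrounding whitespace, as Excel's COUNTIF does for text."""
    return sum(1 for v in values if str(v).strip().lower() == criterion.lower())

# Hypothetical contents of column D
column_d = ["Open", "Closed", "open ", "Pending", "OPEN", "Closed"]

print(countif(column_d, "Open"))  # prints 3
```

The point of the sketch is that Copilot isn’t doing anything magical here: it’s translating a natural-language question into a one-line spreadsheet formula you could have written yourself, which is exactly why its output is worth checking before you rely on it.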

Copilot performs other tasks which are more commonly associated with generative AI, such as image creation. Log into Microsoft Designer, ask it to create an image and then type in a particular search string – a faulty broadband router in a domestic home, for instance. After around fifteen seconds of processing time, four broadly similar images will be displayed of a router covered in red warning lights. These images can then be edited or downloaded as the user wishes.

Growth and costs

The real breakthrough for Microsoft Copilot was the company’s decision to incorporate it into Windows 11, positioning it on the Taskbar and ensuring a Copilot icon is displayed in flagship Microsoft packages like Word and PowerPoint. Because it was built on the pre-existing GPT-4 large language model developed by OpenAI (a company Microsoft has invested heavily in), Copilot was able to hit the ground running and gain advocates at a startling rate.

Like many software packages, Copilot operates a freemium model. It’s possible to access basic features free of charge or pay to unlock a Pro subscription which offers the latest features and the ability to create a custom chatbot. There are chatbots dedicated to travel, cooking and personal fitness, while Copilot can discuss websites as you browse them. It can serve as a translator, source information from the internet, or even check product inventory and shipping data when you grant Copilot database access.

Should I be concerned about Microsoft Copilot?

There are legitimate concerns about the removal of human involvement in automated processes, especially when Copilot drafts an email on your behalf by scanning previous emails, giving you the option to vary the tone of the new email depending on how assertive (i.e. angry) you want to sound. It’s one thing letting AI summarise a Teams call, but it’s quite another letting it produce corporate communications. Microsoft is unlikely to accept responsibility if your company issues a statement, email, report or spreadsheet with Copilot-generated inaccuracies in it. Clients will also take a dim view of receiving AI-generated content, especially if it contains mistakes.

Hallucinations remain a problem for AI models, while their attempts at political correctness have hitherto resulted in images of black Vikings, or seen foodbanks listed as tourist destinations. Some might dismiss these errors as teething troubles, but there is a very real issue of incorrect AI-generated results being fed back into the source material for these AI engines, creating a negative loop of increasingly wayward output. This is going to become more problematic as the large language models powering AI engines run out of existing internet content to plagiarise, and as publishers add anti-scraping tools to their websites to prevent new material being pillaged in the same way. Some believe generative AI will improve its quality in the coming months and years, whereas others argue it’s already peaked.

Don’t approach Microsoft Copilot thinking it can do your job while you do something else – such attitudes could land you in a great deal of trouble. But if you’ve ever found yourself thinking “there must be an easier way to do this” while using a Microsoft package or utility, Copilot may be the answer to your frustrations. And if you don’t like the idea of generative AI speaking on behalf of your brand or business, you could always use the traditional method of employing a freelance writer to handle your content, editorial and journalism needs. Get in touch with G75 Media to see why even the best generative AI platforms will never match the nuance, humour and lived experience imbued into our award-winning copywriting services.

* No generative AI was used in the making of this blog.