AI writes fast. It also fakes fast.

If you haven’t used ChatGPT’s Deep Research tool yet, get ready to have your hair blown back, and then set on fire.

On Saturday morning, I asked the tool to do some research for an article I was writing on the benefits of private debt for institutional investors. I gave it an earlier draft, then clicked “Send.” In 13 minutes, it cheerily produced a 3,000-word research paper, complete with copious sources, tables, pull quotes, and a full bibliography. It cited only reputable sources like PitchBook, Deloitte, and McKinsey.

I copy-pasted that paper into a Google Doc, downloaded it, and fed it into a custom GPT designed to write articles for the client, complete with their brand guidelines and corporate details. In about two minutes it created a 1,200-word blog post that, for all intents and purposes, outperformed anything a human writer could produce without spending two days on craft and invoicing over $1,000.

But there was one glaring problem.

The research was a joke.

The Deep Research AI gave me copy like:

According to CIO Nicole Musicco, the move was driven by the ability to deliver “steady distributions, real downside protection, and better relative value than equity.” The company reallocated billions and sent a clear signal that institutional portfolios are rethinking how they can drive innovation and income at the same time.

That’s not bad. If a human writer gave me that, I wouldn’t blink. But when I asked for a source link, the custom GPT pointed the finger at the research document. And the sources referenced by the research document didn’t contain any such quote. Nor did I find it anywhere online.

The same thing happened with four other direct quotes and exact statistics in the same article section. When I asked ChatGPT what happened, it apologized and hedged. In other words, ChatGPT’s vaunted Deep Research tool had made up its quotes and data.

Used by a trusting editor, AI has the potential to destroy your brand credibility overnight. In 2024, a reporter at the Cody Enterprise in Wyoming resigned after AI he used to draft stories fabricated quotes and details. The fabrications were outed, and he admitted to using AI. But neither he nor his paper knew the stories contained fake material when they ran. Their only real crime was trust.

Thankfully, working with the GPT and clarifying my needs, I was able to retool my own article with verified sources and accurate pull quotes and data. And it didn’t take two days, or even half a day. It was more like half an hour.

There are two key takeaways here.

First, for content managers, marketing managers, and editors: Vanilla AI almost always hides a poison pill. You can’t, can’t, cannot trust that what it gives you is true. You must put an expert writer in the process. Someone who knows how to find and verify information from respected sources and how to weave it into a compelling narrative with the right voice, tone, and level of engagement. That person needs to be there to guide the AI, catch it when it wanders off the rails, and show it what to do instead.

Second, if you’re a writer who is horrified (like most writers I see online today) by the rise of AI and the death of professional writing, you have a clear and exciting way forward. I think it’s more than a little glib and insensitive to say you can get a job as a “prompt writer.” I think that cheapens the years (and possibly decades) of work you’ve put into learning your craft.

I also think there’s a real danger of experienced writers deciding their skills have been rendered null and void. That’s anything but true.

The real path forward is to be the expert guiding the machine at every step along the way. In my view, you likely won’t command the same rate you used to for a 1,500-word article, but you might not need to.

Writers, and especially freelance writers, have always targeted a certain hourly rate. I believe that by working as the expert human in the loop, you can still earn that rate, guiding AI tools to produce high-quality copy, and potentially higher-quality copy, faster than was possible before AI came along. Writers are increasingly wary of AI taking their jobs, and publishers are understandably excited by the efficiencies it offers. But both parties should fight hard to put a human expert in the driver’s seat.

How are you using AI to enhance your content creation, and what role do you see for human experts?

Tom Gerencer

Lead GPT Trainer and Editorial Director, Wetware
