To Katrina McKinnon, artificial intelligence-powered writing tools are only as good as the operators who prompt them. And the best “prompt engineers” will be liberal arts folks and storytellers, not mathematicians.
McKinnon founded CopySmiths, a blog-writing agency, three years ago. She and I first spoke shortly after that launch, long before the rise of AI tools. We recently caught up, this time addressing writers for hire in a ChatGPT world.
The audio of our entire conversation is embedded below. The transcript is edited for length and clarity.
Eric Bandholz: What do you do?
Katrina McKinnon: I’m the founder of CopySmiths. We are an Australia-based blog-writing agency for ecommerce stores. The company is three years old.
We work with writers worldwide, primarily in Kenya. They’re willing and hungry for work. Unfortunately, Kenya has very high unemployment.
Bandholz: Are artificial intelligence tools impacting your business?
McKinnon: Just months ago, I was wary of AI. We had never used AI tools like Jasper or Frase. We once used Surfer SEO, but it left a fingerprint and created an expectation that content is easy.
Now we handcraft all content, but we use AI for ideas. We can direct it, for example, “Give me an outline on how to clean a fountain pen.” It’ll do that quite well, although there are minor details it doesn’t understand.
Another good use is complex product descriptions. We have a client that sells headlamps and head torches for mining activities. If you’re going into a quarry, an explosive environment, you don’t want your head torch sparking a fire. ChatGPT is brilliant at teasing out all the details. You can paste the product info into ChatGPT, and within seconds it will spit back, “This one’s got three batteries. The light lasts 24 hours.”
I don’t see ChatGPT becoming a search engine. It’s more of a discovery and research tool. We don’t use it to produce blog copy because it doesn’t go into the depth needed to show authority on a topic. We use it for summarizing and learning.
It’s good for creating tables. You can direct it, “Make a table comparing these vitamins and their effects on my skin, hair, and overall well-being, and tell me the milligram dosage.” It can create quite complex tables with very different relationships in the data.
However, the facts are wrong 80% of the time. So it requires extensive human oversight.
Bandholz: Can Google distinguish between AI and human-written articles?
McKinnon: Yes, it can. A Google employee recently tweeted that AI content is acceptable in the right scenarios. So you wouldn’t publish an opinion piece straight from ChatGPT, but you could certainly use it for a straightforward FAQ, such as how to replace the batteries.
ChatGPT is one of many AI writing tools. Perplexity.ai, for example, is a mix between Google search and Bing chat, functioning as a search engine that accurately cites sources.
Google search is broken. Its results are crammed full of low-quality Forbes articles. I spoke yesterday with a merchant who paid a Forbes author to include the company’s shoes in an article. So it’s pay-to-play. That’s not a practice I’m interested in.
Hopefully Google will evolve from a boring, dumb search engine.
I’m excited about the coming need for AI operators, which I call prompt engineers. Many marketers are quantitative and mathematical. They run pay-per-click, “data-driven” campaigns. That’s how they grow ecommerce revenue.
With prompt engineering, it will be the storytellers — liberal arts majors and communicators — who control the language models because they naturally think creatively with words. Over the next few years, we’ll see linguistically creative founders build massive businesses.
Bandholz: Where can our audience follow you?