Greyhound Literary

On AI: Losing a Human Touch - Sam Edenborough writes

This article was first published in The Bookseller on 18 April 2023

I avoided the temptation to ask ChatGPT to write this piece, but I couldn’t resist asking it to do some creative writing to see for myself what sort of literary chops it has. The prompt “write a short story featuring a small goat and a complicated marriage” produced a genuinely amusing tale, with dialogue and characterisation. I was shocked by its fluency and coherence.

What are the implications of this technology for the publishing business? Will the art of researching and writing books be assisted or replaced by clever code? Will literary translators be made redundant? Can I outsource writing book pitches and finding good comp titles? Will editors and scouts begin to rely on AI reports on manuscripts under consideration?

With one major exception that I will come to shortly, I am leaning towards “no” in answer to all these questions because the natural language models currently being developed are trained on pre-existing material (including, regressively, their own output and that of other models). While they are already very good at producing plausible-looking content, they are incapable of genuine invention. The term “Artificial Intelligence” used to describe a natural language model is misleading because it encourages us to think of the system as having some sort of self-awareness. It does not. That is a feature of AGI (Artificial General Intelligence), which remains a distant goal, well beyond the current reach of the tech companies. ChatGPT’s and Bard’s outputs are the blended product of vast data sets and extremely clever modelling of the way that language works - but these systems have no ability to test whether their output is factually correct or meaningful, let alone beautiful. There are many ways in which these tools could be useful to a writer or a publisher, including summarising text and structuring arguments. But for now at least they are only tools, with significant flaws and limitations. They are not engines of invention.

There are intriguing questions relating to copyright law and the ownership of the texts that natural language models produce. I was relieved to find that the Terms of Use for OpenAI, which created ChatGPT and GPT-4, assign ownership of the copyright in their models’ output to the user. This seems fair. If the user enters an original text as a prompt (in which they own the copyright as a matter of course), then the response from the chatbot should be their property, too.

The legal situation becomes more complex if a user copy-pastes all, or part, of a copyright work into the prompt. When I tried this with the first stanza of ‘The Waste Land’, the bot correctly identified and attributed the poem, and gave a coherent commentary on the text. But OpenAI’s systems can’t recognise every copyright work (not least those published after 2021, which is the horizon for ChatGPT’s dataset ‘memory’ of the world). The potential for parasitic, derivative or even infringing ‘new’ texts being created via prompts featuring in-copyright material is real. ChatGPT happily produced a 400-word poem when I prompted it to write one “featuring substantial elements of The Waste Land by T.S. Eliot”. The result dutifully included gobbets of the original, regurgitated in doggerel worthy of William McGonagall: “The sun beats down on this barren land / Where dead trees offer no shade / and the dry stones offer no helping hand / The Waste Land has us all enslaved.”

There are murkier issues here. What use does OpenAI make of in-copyright books in order to train its natural language models? Has it acquired a licence to use any? I have not been able to find definitive answers to these questions, though I can guess what they are. After substantial lobbying from the tech industry, the UK government recently proposed an exception to copyright law that would allow unlimited, free data mining of copyright material, including for commercial purposes. Happily, the creative industries pushed back hard, and it seems that a more balanced policy approach will be taken, recognising the importance and role of licensing in data mining activities and the rights of creators.

Translation software, on the other hand, poses a more significant threat to translators’ livelihoods in the nearer term than chatbots do to authors’ and publishers’ incomes.

I can foresee a dismal future in which the act of book translation is free, instantaneous, and trivially easy for a reader to deploy. In some ways it’s here already, although I don’t think many people would tolerate the experience of reading a whole novel that’s been run through DeepL or Google Translate.

The art of literary translation is as challenging a task for a software system as the art of writing the original text, in terms of the cognitive skills required, and only AGI will be able to do it as well as a human can. But we can’t be sure to what extent readers will prize the human touch. It won’t be long before ‘AI’ translation software can produce a text that is sufficiently tolerable for a significant number of consumers who will not care that it is less well written in their language than the original. This tipping point will determine how much publishers will continue to invest in human translators.

Recently I heard a rumour that one global publishing group is already using translation software to help them publish books for which they’ve acquired the rights in other languages. This prompted me to update our agency’s boilerplate translation licence to require the publisher to appoint a suitably qualified human translator to carry out the work. I wonder how long it will be before a publisher tells me that this requirement is unacceptable.


Here is ChatGPT's composition:

Write a short story featuring a small goat and a complicated marriage

Once upon a time, there was a small goat named Gracie. She lived on a farm with her owners, a married couple named James and Sarah. James and Sarah's marriage had become complicated over the years, with both of them feeling stuck in their routines and resentful of each other's behavior.

One day, while Sarah was out in the fields tending to the crops, she noticed Gracie watching her from a nearby fence. The goat's soft brown eyes seemed to be looking straight into Sarah's soul, and she felt a sudden urge to confide in the animal.

"Gracie, I feel like my marriage is falling apart," Sarah said, tears welling up in her eyes. "James and I just can't seem to communicate anymore, and I don't know what to do."

Gracie bleated softly in response and nuzzled her head against Sarah's leg. As Sarah stroked the goat's soft fur, she felt a sense of comfort and understanding that she had been missing in her relationship with her husband.

Over the next few weeks, Sarah found herself spending more and more time with Gracie. She would talk to the goat about her problems, and Gracie would listen patiently, her warm presence soothing Sarah's troubled heart.

Meanwhile, James began to notice a change in his wife. She seemed more distant and preoccupied, and he felt increasingly frustrated and angry with her. One day, he came across Sarah and Gracie sitting together in the field, and he exploded.

"What are you doing, Sarah? Talking to a goat like it's your therapist?" he yelled.

But Sarah didn't back down. "Gracie may be a goat, but she's also a friend," she said, her voice surprisingly steady. "And right now, I need her more than I need you."

James stormed off, but Sarah and Gracie remained together, their bond growing stronger with each passing day. Eventually, Sarah made the difficult decision to leave her husband and start a new life with Gracie by her side. As she drove away from the farm, she looked back at the little goat in the rearview mirror and knew that everything would be okay, as long as they had each other.