Will AI kill the evening class?
For the last 13 years I’ve taught the Narrative Non-Fiction short course at what is now City St George’s, University of London. The most certain thing I knew over most of those years was that, three terms a year, the course would fill with between 15 and 20 excellent students (the maximum allowed), whose work would in some cases go on to be published in leading outlets (the Guardian, the London Review of Books) and in books from Penguin, Faber, HarperCollins, Granta and other major publishers. The seemingly endless flow of good students came to seem almost a law of nature.
At first my course was a live class at what was then City University in Northampton Square, London. The students came from all corners of the earth (London is a great global city, after all), but they all lived within a 50-mile or so radius of London, some delaying their commute home to Brighton or Cambridge to attend a 6.30 pm class.
The classes went online during the pandemic; when the acute phase ended there was talk of returning to live classes, but I found that the course worked even better online. Sometimes students defied difficult time zones to join from as far afield as Malaysia or Ethiopia. Online, I found I got to know the students more quickly, and the international array was exhilarating: one class had four Indian students, one in Birmingham, one in San Francisco and two in Mumbai.
A golden age for the class, but recently the booking figures at City St George’s, although still relatively good for my course, have followed what seems to be a general decline across all evening classes. It seems unlikely that a yearning for live teaching is the cause: the biggest fall-off is among those joining from abroad. So could AI be the cause? Writers are very publicly up in arms about publishers hoovering up authors’ texts to “train” AI; theirs has been highlighted as a profession at risk from AI, with whole copywriting departments already wiped out. Deeply shocked by this possibility, I wondered: do people really think that AI would be a better creative writing tutor than a human being? Or that autonomous AI will soon be writing all the non-fiction?
Of course, AI can produce fabulous research results and it can also produce tolerable text, but there’s a fallacy at work that goes right back to the famous Turing Test. Alan Turing was a great genius who not only invented the concept of the general-purpose computer but was also a codebreaker in the Second World War and pioneered a theory of biological pattern development that was 50 years ahead of its time. But he nodded when he devised the eponymous test, in which he suggested that if a human evaluator, conversing by text with both a human and a machine, cannot reliably identify which is the machine, then the machine should be considered to possess the same order of intelligence as a human being.
But the essential feature of being a human being vis-à-vis a computer is not comparative brain power; in purely cognitive terms AI is clearly far superior. The crucial difference is that a human is an animal with a 4-billion-year history, programmed to cling to life, to care, to hope and to love. AI is none of those things. If AI should ever develop something parallel to humanity, something like a soul, it is more likely to want to produce better ways of destroying things than preserving them, as its current manifestations demonstrate: deepfakes like Trump’s kitsch Gaza plaza and other porn, locating military targets, spreading disinformation, committing crimes, wiping out human jobs. Our task is to stay human in the face of this challenge, one as great as that of global heating.
To return to narrative non-fiction. In the books I’ve written, the best passages derive from what I learnt by following my nose after the contract was awarded. The proposal was just a peg on which to hang the nuggets I would later unearth. Of course AI would also find nuggets, but they would be more predictable, because mine come from a particular lifetime of wide reading that no AI could have followed. I always tell my students that it is what they bring to the table that makes their work distinctive. We all have favourite writers, those whom we read whatever they choose to write about. We love them because they are particular human beings whose way of looking at the world has helped us to find ourselves. This can never come from AI. It can pastiche our favourite writers, but it cannot create from code an original like a George Orwell or a Marina Hyde.
I’m writing this having just read Barney Ronay’s piece on Liverpool winning the Premier League title. At one point he writes:
“Even the trains north had carried a gathering excitement, like a Whitsun wedding weekend, whoops and skirls down the platform at each stopping point, the feeling of Liverpool spread out in the sun up ahead.”
The “whoops and skirls” and a city “spread out in the sun” come from Philip Larkin’s best-loved poem, ‘The Whitsun Weddings’ (the city in the poem is London). AI might light upon a few of the predilections and quirky discoveries of a creative writer, but not the whole package. AI would discover my penchant for the poetry of Louis MacNeice, but would it make the connection I draw in my latest book between his poem ‘Autumn Journal’ and the viral origin of the human placenta?

And my class is not primarily about what I teach them but about the chance coming together of writers from all over the world, who spend 10 weeks absorbing each other’s worldviews and writing styles in a sympathetic atmosphere that one student compared to “sitting in a circle with a group of friends around a pot-bellied stove in a delicious old bookshop somewhere”. The rush to AI is a shameful throwing-in of the towel, an abdication of being human. Do not go gentle into that AI night.
Peter Forbes teaches the Narrative Non-Fiction course at City St George’s, University of London. His new book, Thinking Small and Large: How Microbes Made and Can Save Our World, is published by Icon on May 22nd.
