Earlier this week I took part in a workshop on AI and creativity hosted by the Institute for Ethics in AI at Oxford University. To sum up the key challenge facing creatives metaphorically: there's an industry which wants to steal all our weapons and then use them to kill us. The stealing is that generative AI is building its models by scraping data from the Internet, including the works of creative people, without paying them. It's then using this to generate content that could potentially put a lot of these creative people out of work.
There’s a lot that has been said on these issues by people much more informed about the legal and technical aspects than me. One of those is Beeban Kidron, one of the organisers of the conference, who is doing a great job in the House of Lords trying to shape forthcoming government legislation. Still, after a day of very stimulating discussion I came away with five thoughts that I think deserve further consideration.
1. To be ahead of the curve, we have to realise we’re always going to be behind it.
The debates we're having about AI are very much framed by the nature of the latest large language models (LLMs) of generative AI. Even this is a hugely fast-moving field. At the workshop, many people raised the problem that nobody knows what the next kind of AI to come along will look like. While we're trying frantically to design regulations to deal with the LLMs, it could well be that by the time they are in place the next lot of AI has already arrived and is radically different. So we could solve the problem of data scraping only to find that next-generation AI doesn't need to do it.
I’m neither a futurologist nor a computer programmer, but it seems clear enough that in an age of very fast-developing technology, we can’t rely on a legislative framework which requires us to come up with very precise rules for how to regulate something in all the details of its actuality. What we really need are rules that are more similar to principles found in constitutions, or human rights and equality law. In other words, we need to decide at a more general level what it is we allow artificial systems to do and what we don’t, and these principles must have a legal force which means that they instantly apply to whatever the new technology is. We can’t anticipate the specifics of future technology, so we shouldn’t legislate for them.
2. The end result may be inevitable, but how we manage the transition is not.
Worries about AI taking work from creatives are often seen as a Luddite response to unstoppable technological change. On this analysis, AI is going to be writing music, composing novels and creating artwork, and we will just have to get used to it. (I partly have: the image for this article is AI-created.) But that stops the conversation too quickly. When there are major transitions in society, it matters how we handle them. Think, for example, about the transition away from heavy industry which began in the 1970s and was accelerated in the 1980s under the Thatcher government. It was inevitable that a lot of these industries would contract and even close. But the manner in which that transition was handled was brutal. Communities were smashed apart, left without alternative work or sufficient social support. It could have been managed differently. Similarly, the debate now should be not so much about what we’re going to end up with, which is undoubtedly more computer-generated content, but how that transition is handled, not just for creatives, but for anyone whose trade or profession is threatened.
3. The plunder of creatives is nothing new
The worry about creatives not being paid needs to be put in context. It is hard to imagine any deal with the big tech firms in which most individuals would receive more than a pittance. The closest model we have for such a system is the public lending right scheme, which pays authors a royalty on books loaned out by libraries. While the best-selling authors can earn over £6,000 from this, most annual payments are comfortably in three figures. The amount that creatives would get for use of their works by generative AI would probably be no more, and quite possibly a fair bit less.
The issue of payment is a sore one, not because writers are being deprived of great riches, but because we live in a culture that systematically undervalues the work of creatives financially. We are constantly expected to work for little or nothing, often on the promise of “exposure”, “publicity” or “networking opportunities”. This is galling when the organisations that ask us to work for free pay everybody else involved. Television is worst for this. Crews, cleaners, presenters, camera operators and so on all get paid, but guests on shows are often expected to be so thrilled to get their mugs on screen that they will appear for a train ticket and a stale sandwich. This is when the average income for a full-time writer is around £7,000 a year.
So the money we’re not getting from AI is just a drop in a wider ocean of exploitation. It is not the main problem with AI, but it does shine a light on the way in which creatives are not paid properly in general.
4. Fears of human redundancy are exaggerated.
A lot of people have faith in the idea that human creativity cannot be entirely replicated by machines. Some think this is because of an irreducible human soul or spirit, while others think it is simply because AI can only build on what humans have already done, and so will always be playing catch-up. I wouldn’t want to bet on this. But even if AI could come up with something genuinely original and brilliant, I don’t think we’d be out of work. The value we place on creative work is not just its value as a piece of content, independent of its creation and creator. Think of food. I could buy you a cake from a bakery which would beat any cake I could bake for you myself in a blind tasting. But a lot of us would prefer to have that home-made cake. The story of its creation matters. You can’t literally taste the love in a cake, but you can appreciate it, and knowing it is there can make you value the cake more. In the same kind of way, I can imagine a world in which there are some excellent pieces of art, music and literature produced by AI which we enjoy. But we would still very much value works created by humans. They give us a connection with each other that we won’t allow to disappear.
5. AI could make us up our game.
The challenge of AI is also a challenge for creatives to be more creative. The uncomfortable fact of the matter is that a lot of creative work is highly derivative. Indeed, some of it you might even call algorithmic – produced to a formula. Take most books on popular psychology. Every chapter begins with an anecdote or vignette, then moves on to look at (usually the same old) studies, and ends by coming back to the story that started it with a fresh lens. Or think about podcast formats, with their chatty co-hosts, drawn-out suspense, and generic mood-inducing music. When AI becomes able to do these things very well, it will just expose what we should have noticed already, which is the formulaic and uninteresting nature of a lot of creative work. So AI could be a spur to creatives to do better.
To meet the challenges of the future, we need to identify them. It’s very easy to assume we know what the biggest issues are. But very often, what is most salient is just what seems most immediate or obvious. As ever, we need to question our questions, if we are to find the right answers.