Contempt, Not Infringement: What AI Took From Creative Life

On a wet Tuesday in March, in a rented rehearsal room above a kebab shop in Peckham, a four-piece called the Fen Wardens are arguing about whether to put their back catalogue on Suno.

Not on Suno as in upload for streaming. On Suno as in feed to the machine. Suno, the Boston-based generative music company, offers, through various licensed partners and less-licensed side doors, the ability to spin up new tracks in a recognisable style from a handful of text prompts. The Fen Wardens, who have spent eight years building a modestly devoted audience around a sound they describe, with some embarrassment, as “drone folk for people who can't sing”, know that somebody, somewhere, has almost certainly already fed their stuff to something. You can hear it, their bassist says, in the tracks that keep surfacing on certain playlists: the same sustained open fifths, the same hesitant vocal attack, the same way the reverb tails get cut off a fraction too early. Not their songs. The grammar of their songs.

The question on the table is whether they should, at this late stage, formally submit to a licensing scheme that would pay them something per play in exchange for the right to have been trained on. It would mean a few hundred pounds a month, maybe. It would also mean, as the drummer puts it, “signing the paperwork on the burglary after the fact”.

They vote three to one against. They then argue for another forty minutes about what to do instead, and eventually order more coffee, and nobody really knows. The room smells of damp coats and amplifier dust. Outside, the traffic on Rye Lane thickens into evening. Inside, four people who have spent roughly a decade of their working lives writing songs that sound like no one else's are trying to decide what it means that an algorithm has absorbed their particular strangeness and turned it into a style preset. It is not, quite, an existential crisis. It is something worse than that, because it has no clean edges. It is an unsettling.

Multiply the Fen Wardens by every working creative on the planet and you have the shape of the 2026 cultural mood.

The Lawsuits, and the Bigger Question Underneath Them

The legal front is now so crowded it has begun to resemble a weather system. The New York Times' infringement suit against OpenAI and Microsoft, filed in late 2023, survived OpenAI's motion to dismiss in March 2025 and has since ground through a discovery war of such intensity that Judge Sidney Stein of the Southern District of New York ordered, in an affirmation of an earlier magistrate's ruling, that OpenAI hand over a sample of twenty million anonymised ChatGPT conversation logs to the plaintiffs. OpenAI had wanted to select a handful of conversations implicating the plaintiffs' works. The court said no. Summary judgment briefing has concluded. A trial looms.

In June 2025, in the Northern District of California, Judge William Alsup handed down the first substantive American ruling on whether training a large language model on books constitutes fair use. His answer, in Bartz v. Anthropic, was a carefully qualified yes: ingesting legitimately acquired books to train Claude was, Alsup wrote, “exceedingly transformative”. But he drew a hard line at the pirated sources, the LibGen and Books3 mirrors from which Anthropic, like most of the industry, had helped itself in the earlier, messier years. That part, Alsup ruled, was not fair use. By August, Anthropic had agreed to pay roughly $1.5 billion to settle the class action, with about $3,000 per book flowing to the authors of some half-million works. It is the largest copyright settlement in American history. It also neatly split the future of the question: train on what you've bought, and you may be protected; train on what you stole, and you will pay.

On the other side of the Atlantic, the UK's High Court delivered its own first-of-its-kind judgment in November 2025 in Getty Images v. Stability AI, and rejected most of Getty's copyright claims on the narrow ground that the trained model weights of Stable Diffusion were not themselves “copies” of the training images, and that the training itself had not occurred on British soil. Getty salvaged a limited trademark win. The broader question, whether scraping copyrighted images to train a generative model is lawful under the Copyright, Designs and Patents Act, was not answered, because the court said it did not have to answer it.

And then there is Google. In January 2026, Hachette Book Group and the educational publisher Cengage filed a motion to intervene in a proposed class action alleging that Google had ingested their books and textbooks into its Gemini models without licence or consent. It was, in copyright terms, a comparatively narrow move. In cultural terms, it was a thunderclap, because it dragged the biggest, quietest player in the training-data story into the same dock as OpenAI and Anthropic. David Shelley, the chief executive of Hachette, gave a long interview to Fortune that ran the week before this article went to press. The headline, in the kind of flat declarative font Fortune reserves for what it considers the real story, read: Who owns ideas in the AI age?

Shelley's answer, extracted from a longer and more patient conversation, was characteristically British in its understatement. Copyright law, he argued, is not broken. It is a very old, very well-tuned instrument. It needs “a slight evolution”. The end state, he said, is one where the people who have the ideas get to benefit from the ideas. That is the bargain, the compact, the deal.

The journalist who wrote the piece noted, without editorialising, that the CEO of one of the Big Five publishing houses had effectively become the public face of a creative-industry legal strategy. The quiet part had been said aloud. The question was no longer whether the AI companies had an obligation to ask. The question was what kind of civilisation you get when the answer is consistently, reflexively no.

What It Feels Like From Inside the Work

Every piece written about the lawsuits inevitably leaves out the thing that is actually happening to people.

The thing that is actually happening is a low, persistent weirdness. It is the session musician in Nashville logging into a stock music marketplace and finding an AI-generated track credited to “Artist” in her exact idiom, down to the pedal-steel inflections she has spent fifteen years refining, priced at the royalty-free equivalent of two pounds fifty. It is the illustrator in Brighton who, having removed her portfolio from every platform she could find after the Stable Diffusion scrape, opens a children's book in Waterstones and spends twenty uncomfortable seconds staring at an interior illustration that has her colour palette, her line weight, her characteristic trick of drawing rabbits with slightly too-large front paws, and wondering whether she is being paranoid or whether she is correct. It is the technical writer whose Stack Overflow answers, rewarded with internet points over a decade of unpaid labour, now surface inside a coding assistant that is being sold to her own employer as a replacement for technical writers.

None of these are lawsuits. None of them are falsifiable in any clean way. But they are the texture of the moment, and the texture is what the reporting keeps missing. Creative people are not primarily upset that their work was used. They are upset that they were not asked. The asking is the thing. The asking is most of what the bargain was.

Publishers can frame this in the language of licences and rights holders, because that is the language they have. Musicians can frame it in the language of mechanical royalties and neighbouring rights, for the same reason. But when you talk to working writers, painters, game designers, session singers, open source maintainers, translators, voice actors, documentary researchers, the language they reach for is smaller and older and more awkward. They talk about being taken for granted. They talk about the feeling of walking into a room where a conversation is already under way about you, and realising the conversation has been going on for years.

There is a word for that feeling, and the word is not “infringement”. The word is “contempt”.

The Compact That Nobody Wrote Down

The implicit bargain of cultural production has never been written down in full, because if you tried to write it down it would sound either sentimental or self-important, and it was the kind of bargain that could only work if everyone involved pretended not to see its edges. Broadly, though, it went like this.

You made a thing. The thing belonged to you, in a rough and contested sense, for long enough to matter. If anyone wanted to use it, they had to ask. The asking might be formal, a rights clearance letter from a publisher, or informal, a friend in another band wanting to cover your song. Either way it conferred a small dignity on the maker, a recognition that the thing had not simply fallen out of the sky. In return, you did not charge too much. You let schools teach your work. You let libraries lend it. You let cover bands play it in pubs for beer money. You let fanfiction writers do terrifying things to your characters in the knowledge that the terrifying things were love. The system leaked at every seam, and the leaking was the point. It was a commons protected by a fence that nobody checked too carefully.

Inside that fence, a whole ecology of intermediate institutions made creative life materially possible: small presses, writers' rooms, workshops, residencies, studio darkrooms, fanzines, open-mic nights, reading series, folk clubs, scratch nights, the back rooms of pubs and the front rooms of community centres. Nobody inside those rooms thought of themselves as maintaining a civilisation. They thought of themselves as paying the rent. But the cumulative effect of their improvisation was a civilisation, or at least the small, bright, warm portion of one that most people mean when they say “the arts”.

The AI training regime, as practised through the long grey years before 2024, did not break any specific clause of that bargain. It broke something smaller and more corrosive: the habit of asking. The habit was load-bearing. The habit was most of what dignity meant. Once you get into the practice of taking without asking, because the taking is so diffuse and so cheap that the asking has become economically irrational, you have changed what it means to make a thing and show it to anyone.

Shelley's framing, ownership of ideas, is a lawyer's framing. It is not wrong. It is also not where the damage is. The damage is that every working creative in 2026 now makes decisions about what to put into the world while running a continuous background calculation about what will happen to the work once it is out there. The calculation is not paranoid. It is correct. It is also corrosive to the conditions under which good work gets made.

Motivation, and the Floor Underneath It

Psychologists who study creative motivation tend to draw a line, usually in apologetic dotted pen, between intrinsic and extrinsic drivers. Intrinsic means you make the thing because making it is the point. Extrinsic means you make the thing because making it leads to something else: money, attention, tenure, a book deal, a festival slot. The standard finding, repeated in enough studies that it can fairly be called consensus, is that people do their best creative work when intrinsic motivation is primary and extrinsic reward is a floor rather than a ceiling. The floor matters. Nobody, or nobody sane, writes a novel because it will make them rich, but plenty of people would not write a novel if it guaranteed they would be poorer for having done so.

The interesting thing about the floor is that it does not have to be high. It has to be real. It has to be the kind of thing that lets you tell yourself, without lying, that the hours you are putting into the work are not purely a tax on your other life. A small press advance. A Patreon that covers studio rent. A grant that lets you take four weeks off the day job. Enough, in aggregate, to keep the calculation on the right side of ridiculous.

Here is the worry. The specific way the AI industry has gone about its business, scraping, training, releasing, marketing, and then lawyering its way through the consequences, has not collapsed the ceiling. The ceiling is still there. A small number of creative people, the ones already at scale, the ones with lawyers and agents and standing to negotiate licensing deals, are arguably going to do fine. What has collapsed, or is collapsing, is the floor. The floor was always held up by the thousands of small, unglamorous payments that flowed through the intermediate institutions: the stock-library cheque that kept the illustrator's lights on, the library lending rights payment that kept the novelist in Biros, the session fee that kept the singer eating. Those payments are now competing, directly, with outputs generated from models that learned how to generate those outputs by ingesting, without permission, the lifetime work of the people whose floor has just dropped.

It is not true that the AI companies intended this. It is also not particularly relevant that they did not intend it. The thing has been done. The question is what happens next to the people who made the substrate.

In the pessimistic reading, the intrinsic motivation holds up for a while, because it always does. The work is the work. Then, over a longer horizon, the attrition sets in. Not a dramatic exodus. A slow leaking away of the marginal cases, the people who were just about managing, the ones whose commitment required a background plausibility that the work could be, sometimes, paid for. They stop taking the commissions. They stop sending the pitches. They get other jobs, and tell themselves they will come back to it on weekends. Some of them do. Most of them do not. The culture does not collapse. It thins.

Thinning is harder to see than collapse. It is also harder to reverse.

Communities of Practice, and Why They Matter More Than the Lawsuits

If the lawsuits are the surface of this story, the deeper, slower story is happening in the communities of practice that sustain creative life, and whose collapse or survival will shape what the next twenty years of culture actually feel like.

Start with fanfiction. Archive of Our Own, the volunteer-run fanfiction repository, had its public scraping incident back in the early 2020s, when it emerged that its archive had been hoovered up into several large training datasets. The response from the community was, famously, to treat the problem as primarily cultural rather than legal. Writers posted warnings, added deliberate nonsense tokens, set up opt-out campaigns, and, in a few corners, simply locked their work behind registration walls. The interesting part is what happened to the culture behind the walls. Fanfiction communities, historically one of the most generous and promiscuously sharing spaces on the open internet, started, for the first time in a generation, to feel private. Not secretive. Private. The distinction is subtle and enormous.

You can see the same thing in the open source software world. GitHub's Copilot, trained on the public corpus of open source code, set off a long argument about whether software licences that required attribution had been silently invalidated by the training process. The argument is still grinding through the courts. Culturally, though, the argument was already over by the time it started. Maintainers of public repositories began, quietly, to audit what they were willing to put into the commons. Some moved to more restrictive licences. Some started charging for access. Some, the ones whose politics had always inclined them towards openness, made peace with the fact that their work was now training machines and carried on. But the unreflective generosity that used to characterise the culture, the assumption that throwing your code over the wall was a contribution to a shared good, became harder to sustain. The shared good felt less shared.

Then there are the small presses and indie music labels and regional theatre companies and local newspaper arts desks, the institutional capillaries without which creative life does not move. These are not, on the whole, places with lawyers. They are places with one and a half staff members and a kettle. Their response to the AI training regime has largely been to ignore it, not because they do not care, but because the operational cost of caring is higher than they can bear. Several of the people running these institutions, when asked what they thought about any of this, gave some version of the same answer: we are too tired to be angry about it, and even if we were angry we would not know who to be angry at.

That is not resignation. It is triage. And triage, over time, is how capillaries close.

Workshops and apprenticeships, the traditional routes by which craft is passed between generations, are also struggling. Not because the teaching has got worse. Because the people who would otherwise be teaching, the mid-career professionals whose income and attention would be going into those rooms, are now under the kind of economic pressure that makes unpaid mentoring feel like a luxury. The tutors at a reputable London illustration school, speaking on background, described a noticeable fall in applications over the past eighteen months. The trend is not catastrophic. It is, again, a thinning.

And in music, below the level of the big lawsuits and the Universal-Udio settlement and the Warner-Suno partnership, there is a quieter conversation about the session musician layer, the thousand invisible players whose takes are the substrate of commercial music, and who have spent the last two years watching their demo work disappear into generative tools without any compensation mechanism that any of them can see. The Musicians' Union in the UK has been collecting reports. The reports are repetitive. They describe the same small dignity being taken, in the same small way, a thousand times.

This is the thing that neither copyright law nor the current framing of the lawsuits is equipped to see. Creative life is not, for the most part, a matter of famous authors and named illustrators and platinum-selling artists. It is the dense mesh of people working just above and just below the water line, whose labour is load-bearing for the visible culture but whose names never appear in court filings. When the floor drops on them, the lawsuits are too late.

Possible Futures, Some of Them Useful

There are, roughly, five things that could happen next. Most of them will happen in some degree, to different populations, at different speeds. None of them alone is sufficient.

The first is licensing. The Anthropic settlement, the Udio-Universal deal, the Warner-Suno partnership, and the emerging Google intervention are all variations on the same idea: the training data gets paid for, retroactively or prospectively, through some structured arrangement between rights holders and model developers. This is the future the publishers want, and it is almost certainly the future that the law, after enough grinding, will deliver. It is not the future the smaller creatives will particularly benefit from, unless the licensing schemes are designed with unusual care to flow money down the long tail. The default of big licensing deals is that the big players get paid. The Fen Wardens do not.

The second is collective bargaining. Unions and guilds, which had begun to organise around AI issues before the lawsuits even started, are now pressing for the kind of sector-wide agreements that treat training data as a bargainable object rather than a scraped commodity. The Writers Guild of America's 2023 contract was the template, and its AI provisions, negotiated in the aftermath of a strike most people thought was about something else, turned out to be load-bearing in a way nobody fully appreciated at the time. Variations on that approach are working their way through SAG-AFTRA, through the Authors Guild, through the European federations of translators, and through the musicians' unions. Collective bargaining will probably do more concrete good for the marginal cases than any lawsuit, because it forces the negotiation to happen at the level of the labour rather than the level of the individual work.

The third is the opt-out registry, the technical fix the UK government flirted with during its text and data mining exception consultation. The government's original preferred option, a broad TDM exception with rights-holder opt-out, was eviscerated in the consultation response, with eighty-eight per cent of respondents backing a requirement for licences in all cases and only three per cent backing the government's position. The March 2026 progress report effectively shelved opt-out as the preferred option, though nobody thinks the idea is dead. Opt-out registries have an obvious appeal: they seem to give creators a switch. The problem is that the switch only exists for people who know the switch exists, and the people who most need protection are the ones least likely to hear about the scheme before their work has already been ingested. Opt-out, in the absence of a robust opt-in default, is a solution that works best for the people who need it least.

The fourth is a new patronage economy, which is the optimistic way of describing something that is already happening, unevenly, on Patreon and Substack and Bandcamp and the direct-to-audience platforms that have been quietly absorbing the refugees of the legacy creative industries. The patronage model is not new. What is new is the scale at which it is becoming necessary, and the extent to which it requires creatives to become their own marketing departments, customer service agents, and community managers. The work of sustaining the work has, for many, become more time-consuming than the work itself. This is bearable for a subset of temperaments and impossible for others. It favours the extroverted, the photogenic, and the voluble. It punishes the people whose contribution to culture was to sit in a room for ten hours a day being quiet.

The fifth, and this is the one most people are reluctant to say out loud, is retreat. A return to analogue, semi-private, and deliberately offline spaces. The vinyl resurgence is not a coincidence. Neither is the small but persistent wave of writers who are deliberately keeping certain projects off the web entirely, circulating them only through physical printings and invitation-only reading groups. Neither is the rise of zines, the re-emergence of mail art, the tiny but passionate return of letterpress. None of this is going to become a mass movement. All of it is a signal. When the open commons becomes unsafe, creative life retreats to the rooms where the door can still be closed. The rooms are smaller. They are also, for the people in them, real.

Back in the Rehearsal Room

The Fen Wardens, when I spoke to them a week after their Peckham meeting, had made a decision of sorts. They were going to keep putting the music out. They were going to stop streaming it on the platforms whose terms of service they no longer trusted. They were going to press a small run of vinyl for the next record. They were going to send CDs to a handful of independent radio stations with which they had a personal relationship. They were going to play more live shows, including the kind of tiny, uneconomic shows in village halls and community centres that they had mostly stopped doing in favour of festivals. They were going to use Bandcamp for digital because Bandcamp still felt, to them, like an institution run by people who knew that the music belonged to someone. They were, in short, going to get smaller and more local and more stubborn.

They were not doing this because they thought it would scale. They were doing it because the alternative, which was to carry on as before whilst pretending the bargain had not changed, felt to them like lying to themselves about their own working life. One of them used the word dignity. The others winced slightly at the word, because creative people do not like talking about dignity in public, and then nodded.

What the Hachette CEO said to Fortune is true. The central question is who owns ideas in the AI age. But the question underneath the question, the one the lawsuits are structurally incapable of asking, is whether the conditions under which people are willing to keep having ideas in the first place can survive the next decade of industrial extraction. Copyright law can compensate creators after the fact. It cannot restore the habit of asking. It cannot repair the small dignity of being recognised as the source of a thing. It cannot, on its own, rebuild the capillaries through which creative life actually flows.

What it might be able to do, if the lawsuits keep winning and the settlements keep getting bigger and the unions keep organising and the patronage economy keeps maturing and the capillaries hold, is buy enough time for the culture to work out a new compact. The new compact will not look like the old one. It will probably be more formalised, more transactional, more legible to machines. It will have fewer assumptions baked into it about goodwill and common sense. It will be worse, in the small ways that writing a thing down is always worse than a shared understanding. It will be necessary, in the way that fences become necessary after the first wave of trespassers proves that the old gentleman's agreement cannot hold.

The thing worth fighting for, in the meantime, is the rehearsal room above the kebab shop. Not as metaphor. As literal infrastructure. The room where four people are arguing about whether to sign the paperwork on the burglary is the room where the actual culture is being made, and if the room goes away because the people in it can no longer afford to be in it, no licensing scheme and no settlement cheque and no Fortune profile of a publisher's CEO is going to conjure it back. The thinning, once it has happened, is very difficult to unthin. Capillaries that close do not reliably reopen.

It is easy, in 2026, to mistake the lawsuits for the story. The lawsuits are important. They are also, in the deeper sense, downstream. The real story is the quiet meeting in the rented room, and the quieter calculation that every working creative is now running, every week, about whether the work is worth the work. The calculation has always existed. What has changed is the variable. The variable, for the first time in the history of cultural production, is the machine that learned to do what they do by studying what they did, without being asked, and is now being sold back to their audiences as an alternative to them.

Whether the people who made the substrate stay in the rooms is the only question that matters. The courts will not answer it. The companies will not answer it. Only the makers can answer it, and the way they answer it, one small stubborn decision at a time, is the shape the next culture will take.

The Fen Wardens pressed their record. The room above the kebab shop is still there.

For now, that is how the story ends. Not with a verdict. With a door that has not yet closed.


References & Sources

  1. Ashley Lutz, “Who owns ideas in the AI age?” Fortune, 8 April 2026. https://fortune.com/2026/04/08/hachette-ceo-david-shelley-publishing-google-copyright-lawsuit-ai-llm/
  2. NISO, “Cengage and Hachette File Motion to Join Class-Action Lawsuit Against Google”, February 2026. https://www.niso.org/niso-io/2026/02/cengage-and-hachette-file-motion-join-class-action-lawsuit-against-google
  3. Bobby Allyn, “Judge allows 'New York Times' copyright case against OpenAI to go forward”, NPR, 26 March 2025. https://www.npr.org/2025/03/26/nx-s1-5288157/new-york-times-openai-copyright-case-goes-forward
  4. Bloomberg Law, “OpenAI Must Turn Over 20 Million ChatGPT Logs, Judge Affirms”. https://news.bloomberglaw.com/ip-law/openai-must-turn-over-20-million-chatgpt-logs-judge-affirms
  5. Nelson Mullins, “From Copyright Case to AI Data Crisis: How The New York Times v. OpenAI Reshapes Companies' Data Governance and eDiscovery Strategy”. https://www.nelsonmullins.com/insights/blogs/corporate-governance-insights/all/from-copyright-case-to-ai-data-crisis-how-the-new-york-times-v-openai-reshapes-companies-data-governance-and-ediscovery-strategy
  6. Chloe Veltman, “Anthropic pays authors $1.5 billion to settle copyright infringement lawsuit”, NPR, 5 September 2025. https://www.npr.org/2025/09/05/nx-s1-5529404/anthropic-settlement-authors-copyright-ai
  7. Authors Guild, “What Authors Need to Know About the $1.5 Billion Anthropic Settlement”. https://authorsguild.org/advocacy/artificial-intelligence/what-authors-need-to-know-about-the-anthropic-settlement/
  8. Kluwer Copyright Blog, “The Bartz v. Anthropic Settlement: Understanding America's Largest Copyright Settlement”. https://legalblogs.wolterskluwer.com/copyright-blog/the-bartz-v-anthropic-settlement-understanding-americas-largest-copyright-settlement/
  9. Latham & Watkins, “Getty Images v. Stability AI: English High Court Rejects Secondary Copyright Claim”. https://www.lw.com/en/insights/getty-images-v-stability-ai-english-high-court-rejects-secondary-copyright-claim
  10. Bird & Bird, “Stability AI defeats Getty Images copyright claims in first of its kind dispute before the High Court”. https://www.twobirds.com/en/insights/2025/uk/stability-ai-defeats-getty-images-copyright-claims-in-first-of-its-kind-dispute-before-the-high-cour
  11. RIAA, “Record Companies Bring Landmark Cases for Responsible AI Against Suno and Udio in Boston and New York Federal Courts”. https://www.riaa.com/record-companies-bring-landmark-cases-for-responsible-ai-againstsuno-and-udio-in-boston-and-new-york-federal-courts-respectively/
  12. Copyright Alliance, “Top Noteworthy Copyright Stories from October 2025”. https://copyrightalliance.org/copyright-news-october-2025/
  13. UK Government, “Copyright and Artificial Intelligence” consultation document, December 2024 – February 2025. https://www.gov.uk/government/consultations/copyright-and-artificial-intelligence/copyright-and-artificial-intelligence
  14. UK Government, “Copyright and artificial intelligence statement of progress under Section 137 Data (Use and Access) Act”, 18 March 2026. https://www.gov.uk/government/publications/copyright-and-artificial-intelligence-progress-report/copyright-and-artificial-intelligence-statement-of-progress-under-section-137-data-use-and-access-act
  15. UCL Copyright Queries, “UK government publishes progress statement on AI and copyright consultation”, 23 December 2025. https://blogs.ucl.ac.uk/copyright/2025/12/23/uk-government-publishes-progress-statement-on-ai-and-copyright-consultation/
  16. Fieldfisher, “UK government maintains status quo on AI and copyright, playing the long game on potential reform”. https://www.fieldfisher.com/en/services/intellectual-property/intellectual-property-blog/uk-government-maintains-status-quo-on-ai-and-copyr

Tim Green

UK-based Systems Theorist & Independent Technology Writer

Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.

His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.

ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk