by Debbie Burke
@burke_writer
Almost 20 years ago, a giant communications company decided to outsource its phone customer service to other countries. I learned about this from a friend who worked there. The company announced massive layoffs because overseas labor was cheaper than American workers.
Then, to add insult to injury, those employees whose jobs were being eliminated were required to train their replacements.
Not surprisingly, outsourcing didn’t work out too well. There was massive consumer backlash because customers and the new workers couldn’t understand each other on the phone. But the damage had been done. Thousands of American workers lost their jobs, and the company’s reputation took a hit it never recovered from.
That kind of parallels today’s situation with writers and AI. Our work is being scraped from illegal pirate sites and used to “train” AI to replace us.
Some people joke that AI (artificial intelligence) is “artificial insemination.” Writers are being screwed without receiving any enjoyment. They didn’t even buy us dinner first.
The Authors Guild (AG) has been at the forefront of efforts to protect writers from the unauthorized use of copyrighted works to train AI. In July 2023, it sent an open letter to the CEOs of AI giants including OpenAI, Meta, Microsoft, IBM, and others, along with a petition signed by 15,000 authors. AG also testified before the Senate, decrying the pirate sites that tech companies use to “train” AI models.
The genie is out of the bottle. AI is here to stay. The question now is: can the genie be forced to compensate writers for their words?
Here’s an excerpt from the Authors Guild statement on AI:
“The Authors Guild believes that the right to license a work for AI training belongs to the author of the work unless the rights are expressly granted in an agreement.”
A bill called “The Generative AI Copyright Disclosure Act of 2024” is under consideration by the House of Representatives. The bill only requires disclosure by anyone who uses copyrighted work to train AI; it does not address fair compensation for that use.
Recently, Draft2Digital (D2D) conducted a survey of authors, publishers, and others to determine how they feel about the use of AI and what authors would consider fair compensation for the use of their work. D2D CEO Kris Austin kindly gave permission to quote from the survey results (full results at this link).
Here are some highlights:
1. “Why do authors oppose AI training?”
AI companies are unethical/untrustworthy – 25%
Harms creatives & people – 25%
Ethical Objections to AI – 19%
Other Reasons – 14%
I worked hard for my work and it’s mine – 10%
AI has no place in creative work – 8%
2. “Do authors consider current scraping methods fair use?”
It’s not fair use – 49%
Ethically questionable – 42%
Fair use – 5%
No opinion – 3%
3. “Do authors know that AI companies might be willing to pay for training data?”
Unaware – 57%
Aware – 38%
Unsure – 5%
4. “Are authors interested in the opportunity to sell their AI training rights?”
Yes – 31%
No – 25%
Maybe – 45%
5. “Does it matter to authors how the end product LLM (large language model) will be used?”
Yes, it matters – 76%
Not as long as I am compensated – 22%
No opinion – 2%
The next two questions concern whether authors would consider having their work used in non-competitive markets (uses that would not affect the author’s income) and competitive markets (e.g., an AI-written mystery could sell on Amazon right next to your book, but at a much lower price).
6. “If the use case is non-competitive, will authors consider selling their AI training rights?”
No amount of money will ever be enough – 49.5%
Open to non-competitive opportunities – 50.5%
Would accept less than $100 per book – 11.1%
Only if $100 or more per book – 39.3%
Only if more than $5,000 per book – 14.1%
7. “If the use case is competitive, will authors consider selling their AI training rights?”
No amount of money will ever be enough – 62.8%
Open to competitive opportunities – 37.2%
Would accept less than $100 per book – 6.3%
Only if $100 or more per book – 30.9%
Only if more than $5,000 per book – 15.8%
Here’s a summary of D2D’s position:
D2D’S STANCE
Until we see significant reforms, especially around greater contractual protections and transparency governing use, intellectual property protections, and rights restrictions, Draft2Digital will not offer AI rights licensing opportunities.
· It’s a positive development that AI developers are seeking to pay for licenses
· Better protections are needed before D2D or its publishers can entertain such licenses
· AI training rights are an exclusive, valuable subsidiary right under the sole control of the author or publisher
· The rights-holder deserves full control over decisions related to if, when, and how their books are used or licensed for AI training purposes
· Authors and publishers should refuse AI rights licensing contracts that are opaque, or that provide inadequate protections for author concerns
· AI developers must stop training upon books obtained without the rights-holder’s permission; otherwise, they will face continued reputational harm in the eyes of their customers and the creative community
· LLMs previously trained upon unlicensed content, and the applications built upon them, should either negotiate retroactive licensing settlements with rights holders, or scrap their LLMs and rebuild them from scratch by training upon licensed content only
“At this time, Draft2Digital will not offer AI rights licensing opportunities.”
I believe most authors agree that compensation should be paid and payment should be retroactive to include past unauthorized use.
The devil is in the details.
· How to implement systems that detect/determine use of copyrighted material?
· How to enforce fair use?
· How much are authors paid?
· What if an author doesn’t want their work used for AI training under any circumstances?
The communications company my friend worked for treated its employees shabbily, but at least it told workers in advance that they had to train their replacements.
Authors and publishers were never told in advance. Tech giants simply started using creative works without permission or compensation to the creators. AI-written works now flood an already crowded marketplace. Our incomes suffer.
We study, rewrite, and work hard to create meaningful content and deserve fair compensation.
Those devilish details will be fought out in courts for years to come.
~~~
TKZers, how do you feel about AI’s use of your creative work to train LLMs?
Please share your answers to any or all of the questions.
~~~
Cover by Brian Hoffman
Debbie Burke writes her thrillers without AI.
Fruit of the Poisonous Tree is now available for preorder at this link.