Kaitlyn Dunnett/Kathy Lynn Emerson here. Like many of my fellow writers, I’ve recently been able to check a database to find out if any of my books were used to “train” artificial intelligence to, essentially, replace live writers. Am I overstating that? I don’t think so. Neither do the screenwriters recently on strike, nor their actor counterparts. Neither do several groups of writers who have banded together to launch class-action lawsuits over the matter.
For simplicity, I’ll just use the term AI rather than name companies or programs. In theory, AI is just a tool, but the possibilities for misuse are legion. Students using it to do their homework for them is one that’s been discussed for a while now. At least one program has already been designed to spot AI-generated papers. But the real problem is that AI is unregulated. It is still too new to have checks and balances in place, let alone laws defining its limits.
It’s a big issue, in which I am now reluctantly involved. Without asking my permission, or offering me any payment for their use, the entities behind one or more programs for “creating” written works (business letters to novels to student papers to, potentially, anything involving the written word), fed two of the novels I wrote as Kaitlyn Dunnett (Scone Cold Dead and Ho-Ho-Homicide) and two penned as Kathy Lynn Emerson (Face Down below the Banqueting House and Murder in the Queen’s Wardrobe) into their system. They claim this is “fair use” of that material. Silly me—I always thought “fair use” was a term used to refer only to the fraction of a whole, along the lines of a paragraph or two from a novel, as quoted in a review.
Whether this is theft, piracy, plagiarism, copyright infringement, or fair use, this is a very personal issue to writers like me. Let me pose a hypothetical question. What if an AI user asks an AI program to write a novel “in the style of Kaitlyn Dunnett” and the result bears a strong resemblance to the seventeen novels I wrote under that name? It could very well happen. I already feel violated by the unauthorized use of my words. Creating an imitation Kaitlyn Dunnett novel strikes me as, at the least, theft of my intellectual property. If it’s poorly written, as it’s bound to be, it also tarnishes my reputation as a writer.
As it happens, fellow historical mystery writer Sharan Newman took this hypothetical situation a step further and asked an AI program to write a story in her style. The result was worse than she’d expected, “a pitch, with a hook at the end. But it was in my style and cadence.” It used the names of her series characters and placed them in situations that might have fit into the series, but their backstory was inaccurate. Sharan describes the text as taking “words and places” from her books and adding in some “popular plots.” In a Facebook post last week, she also reported that “this machine imitates my style but NOT my knowledge. Anyone reading one of these pastiches would think I knew nothing about medieval life.” Since she is a medieval scholar as well as a mystery writer, I don’t blame Sharan for being angry, especially after she asked AI for other stories and found they contained even worse mistakes than the first.
I don’t know what’s going to happen over this issue, but I do know that a whole lot of creative people are hopping mad at the thought that their unique output has been appropriated without their consent. Alex Reisner has made a study of the situation for The Atlantic. The two links below will take you to his articles on the subject, but you’ll have to sign in to read them for free. The gist is that he “acquired a data set of more than 191,000 books that were used without permission to train generative-AI systems by Meta, Bloomberg, and others.”
This article gives a brief summary of the findings:
where you can search, by author, to see what books were used in the “training.” It’s slow and has a few glitches, but this is where the two screenshots illustrating this article came from. As an aside, since this is Maine Crime Writers, our late colleague Lea Wait has four of her mysteries in the database. Although she’s no longer with us, her copyright extends many decades after her death, so that’s no excuse for using her work without permission.
This next link is to the Authors Guild’s advice on what writers can do about the situation. Unfortunately, there isn’t much at the moment. The creators of AI can just ignore our letters and complaints until the courts decide the matter.
The two links below will take you to information on the class-action lawsuits and what “fair use” means based on recent court cases.
This second article makes the case that the argument that AI training is “fair use” will likely win in court.
I hope this guy’s opinion is wrong. If it isn’t, I hope that some kind of regulation can be instituted to protect human writers. The screenwriters have a union. Those of us who write fiction and nonfiction for the print/e-book market do not. The Authors Guild is a professional organization, not a union. We can’t strike. We can be made redundant. I very much fear that AI will drive human writers out of writing, since it’s already difficult for us to make a living.
Books will still be published, but of what quality? If AI-generated books are allowed to proliferate unchecked, writers aren’t the only ones who will suffer. Readers will too.
Comments are welcome.
Kathy Lynn Emerson/Kaitlyn Dunnett has had sixty-four books traditionally published and has self-published others. She won the Agatha Award and was an Anthony and Macavity finalist for best mystery nonfiction of 2008 for How to Write Killer Historical Mysteries and was an Agatha Award finalist in 2015 in the best mystery short story category. In 2023 she won the Lea Wait Award for “excellence and achievement” from the Maine Writers and Publishers Alliance. She was the Malice Domestic Guest of Honor in 2014. She is currently working on creating new omnibus e-book editions of her backlist titles. Her website is www.KathyLynnEmerson.com.