by James Scott Bell
@jamesscottbell
This is a long one. Pack a lunch.
I hesitated to write yet another post about AI. The blogosphere is jam-packed with thoughts, opinions, screeds, exaltations, and jeremiads concerning this quantum leap in machine capacity.
For writers, there is no end of discussion on the use (or abuse) of AI in the writing of a book. For me, that consideration boils down to this: AI is an amazingly powerful tool—and time saver—for things like research, brainstorming, book descriptions, synopses, series bibles and so on. But where the rubber meets the road—or the words meet the page—is when the machine writes the actual prose that will appear in the final product issued under the author’s name. That is a line I refuse to cross. Why I don’t is the subject of this post.
Before I get to the main point, allow me a small excursus on writing for money. It’s okay to do that! The pulp writers of old wrote for money, especially during the Depression. They learned the formulae for romance, mystery, fantasy, and adventure. They churned out stories for which they were paid. That’s called free enterprise. The best of them added their unique voice to set them above the average pulpster. You had the sharp leanness of Dashiell Hammett, the thunder and lightning of Robert E. Howard, the cynical snap of Raymond Chandler.
But now AI can churn out formulaic novels with a few clicks of the Return key. Anyone with a modicum of computer sense (not really a “writer” if they don’t do the actual writing) can produce a product that can be sold online. Such a producer might turn out five, ten, fifteen books a month, in the hope that sheer volume will return some hearty ducats.
It’s possible to tell Chat or Grok or some other machine to come up with a complete outline for a fantasy, and then tell it to write scenes “in the style of” Brandon Sanderson or George R. R. Martin or Patrick Rothfuss and boom, a day or two later you’ve got a novel.
That’s legal. But is it ethical? I decided to ask Grok (**irony alert**). Here’s the result. (I’ve redacted author names as I did not secure permission to include them.)
Yes, there are writers using AI to produce novels quickly, sometimes in a matter of days, and some are attempting to mimic the styles of famous authors. This practice is becoming more common as AI writing tools improve, but its legitimacy—both in terms of ethics and quality—is a subject of debate.
Several documented cases highlight this trend. For instance, _____, a science fiction writer, used AI tools like ChatGPT, Anthropic’s Claude, and Midjourney to create 97 mini-novels (up to 5,000 words each) in just nine months. Each book took him six to eight hours, a fraction of the time traditional writing might require. Similarly, _____ generated a novel … in three hours using 66 prompts, winning second prize in a 2023 sci-fi competition.
As for mimicking famous authors, AI tools like NovelAI and Sudowrite allow users to train models on specific styles or prompt them to emulate authors like Agatha Christie or Stephen King. [A 2023 novel] was composed with large language models, drawing on styles of detective fiction writers like Raymond Chandler. However, the results often require human editing, as AI struggles to maintain coherence over long narratives or fully capture the nuanced voice of a specific author.
Is This Legitimate?
The legitimacy depends on perspective:
- Creative and Practical Legitimacy: From a production standpoint, it’s legitimate in that it works—AI can generate readable content quickly. Some authors argue it enhances their creativity, acting as a tool to brainstorm and draft. _____, an indie author, uses Sudowrite to meet tight deadlines, producing a novel every nine weeks. Tools like Squibler and Jasper also market themselves to authors, offering features to generate outlines or full drafts in minutes. Readers often buy these works (selling for $1.99-$3.99), suggesting some market acceptance.
- Quality Concerns: Critics argue the output lacks depth. Significant human curation is needed to stitch AI outputs into a cohesive narrative. Off-the-shelf models like GPT or Claude often produce generic or inconsistent prose when tasked with mimicking complex styles, as noted by writers like _____, who found AI-generated text fell short of his voice.
- Ethical and Legal Issues: The ethics are murkier. Many authors, backed by the Authors Guild (over 10,000 signed a 2023 letter), argue that AI tools trained on copyrighted works without permission constitute theft. If a writer prompts AI to “write like Dan Brown,” it’s likely drawing on his style without consent, raising ethical questions. Legally, AI-generated works face copyright ambiguity—U.S. law traditionally requires human authorship, and the extent of human involvement (e.g., editing vs. pure generation) complicates ownership. There’s also the question of transparency: should authors disclose AI use?
- Cultural Reception: Some see it as a betrayal of the craft. Traditional authors value the slow, personal process of writing, as expressed in a 2023 Los Angeles Times piece, fearing AI erodes the human eccentricity that defines literature. Others view it as a practical evolution, akin to using a thesaurus, especially for indie authors under pressure to publish fast on platforms like Amazon Kindle Direct Publishing.
Conclusion
Yes, writers are using AI to churn out novels in days, sometimes in the style of famous authors, enabled by tools like ChatGPT, Sudowrite, and NovelAI. It’s technically feasible and has a niche audience, particularly in self-publishing. However, its legitimacy is contested: it’s a powerful tool for speed and ideation, but the quality often lags, and ethical-legal concerns linger. For now, it’s a hybrid process—AI accelerates drafting, but human skill is still key to polish and authenticity. Whether it’s “legitimate” depends on your stance: a pragmatic shortcut for some, a creative compromise for others.
Got all that? Want to put out a novel a day? You won’t get arrested. And maybe the actual output, with help, can produce competent fiction.
But I believe there’s something you will miss, something AI can never give you: truly and deeply experiencing the lives of the characters you lovingly bring to life on the page. A machine can’t hand you that feeling; it can only come from inside you, churning through the heart, flowing through your fingertips as you type the words. And when you finally type The End there’s a certain euphoria that is only bequeathed to the writer who has “opened a vein.” The lifeblood of a story can’t be given by transfusion. It has to come from within.
Ray Bradbury wrote in Zen in the Art of Writing:
If you are writing without zest, without gusto, without love, without fun, you’re only half a writer. It means you are so busy keeping one eye on the commercial market, or one ear peeled for the avant-garde coterie, that you are not being yourself. You don’t even know yourself. For the first thing a writer should be is excited. He should be a thing of fevers and enthusiasms. Without such vigor, he might as well be out picking peaches or digging ditches; God knows it’d be better for his health.
I’ve experienced zest with every novel I’ve finished. A few times the experience has been overpowering.
Case in point: I wrote a trilogy, the Ty Buchanan series. Over the course of these legal thrillers there’s a love story. When I typed the last line, the most perfect I’ve ever written (for me, at least), I burst into tears. I mean, just like that first scene in Romancing the Stone where Kathleen Turner, at her keyboard with headphones on, types the last word of her novel. Weeping and laughing, she utters, “Oh, God, that’s good!” It happened to me because I both created and experienced every emotion of every character over a three-book span.
I will not trade away that feeling. Besides, I believe it has value for the reader, too. I believe most readers sense when a book’s been written from a vibrating human heart, or hasn’t. As Carl Sandburg once said, “No tears in the writer, no tears in the reader.”
Secondarily, I’m also wary of too much “cognitive offloading.” Another reason I write the words is to keep my brain in shape. If AI does that for me, my synapses stop firing. It’s like watching pickleball on TV every day instead of playing it yourself. Doesn’t do the body much good, does it? As one source puts it: “The long-term reliance on AI for cognitive offloading could also erode essential cognitive skills such as memory retention, analytical thinking, and problem-solving. As individuals increasingly rely on AI tools, their internal cognitive abilities may atrophy, leading to diminished long-term memory and cognitive health.”
I’ll finish with this. In my favorite movie, Shane, there’s a magnificent moment in the beginning where Shane, the mysterious stranger passing through, has been shown hospitality by the Starrett family—Joe, his wife Marian, and their boy, Joey. After a hearty meal, Shane excuses himself and goes outside. He’s about to express his gratitude without words. For in the yard is a big old stump that Joe has long been chopping away at.
Shane picks up an ax and starts hacking. Joe joins him and the two work into early evening.
They push hard on the stump. It barely moves.
Joe’s wife sensibly suggests they hitch up a team of horses to pull it out. Joe says, “Marian, I’ve been fighting this stump off and on for two years. Use the team now and this stump could say it beat us. Sometimes nothing will do but your own sweat and muscle.”
Joe and Shane lay into that stump and with a final, mighty push, uproot it.
I guess I feel like Joe Starrett. There are some things for which nothing will do for me as a writer but my own “sweat and muscle.”
I’ve gone on too long and I’m still thinking this all through. Please let me know your thoughts in the comments!
Here’s that scene from Shane: