Is your writing original? Probably not… but that’s a good thing (The Lesson movie review)

I love watching movies about writers and writing, so I watched The Lesson one lazy afternoon. It opens with some great quotes on writing and literature.

Here’s the setup: a famous writer hires a tutor for his son. So this wannabe writer (with a photographic memory) comes to live with the famous author, his wife, and his kid. But secretly, the tutor has also written a thesis on the famous author’s work.

At first the famous author clowns it up.

“I know writers who obsess: how do I say something original? Am I saying anything unique? Sorry, there are no new ideas. Most writers are reconciled to that fact, and if they’re not, they pretty well soon will be. Average writers attempt originality and they fail, universally. Good writers have the sense to borrow from their betters. Great writers… great writers steal” (chuckles maliciously).

He starts out like this, and that’s the premise; he’s the bad guy, though.

The tutor starts working for him, lightly commenting on his work, when he notices that the famous author has used a phrase from someone else. It’s an innocuous phrase, but unique enough to stand out:

Empty space and points of light.

But during a late-night theory session, the author launches a defense of literary theft.

“Who owns my words? Words I chose because I believed they bore witness to a truth beyond fiction, because when strung together they accorded to a music only I could hear, that resolved some conflict raging within me. Who owns them? They belong to whoever chooses to read them, and since we are the words we write, so do we. Finders fucking keepers! Only gods come out of the machine. Ghosts stay in them.”

Who owns the copyright to your voice?

People can’t copyright a style or content ideas. I’ve covered literary theft a few times on this blog; I had to boot someone out of my writing group after a friend of mine discovered that person had stolen her book (rewritten it in a different style with the same plot events). And there are courses out there from people telling authors to just take another bestselling book and rewrite it in a different style; terrible advice, but not strictly illegal, and there is very little to be done about it.

And then there were the controversies about authors lifting whole phrases from other books, piecemeal style (these authors blamed ghostwriters and vanished for a while but came back). It’s a confusing, murky issue, and even though it’s easy to be morally outraged, it keeps happening because the risks of discovery or legal consequence are extremely low.

Also, even though the majority of writers are repulsed by the idea of copying (because they are so devoted to originality), professional, working authors acknowledge that all art is copying, and it’s been like that for centuries. Steven Pressfield put up an article a few years ago called “Steal without Shame” – maybe a nod to Austin Kleon’s Steal Like an Artist.

On a personal note, I’m still angry that everyone adores “The War of Art,” which claims “To labor in the arts for any reason other than love is prostitution.” Creatives love this book because it encourages them to follow their bliss and keep doing the work, while ignoring the nuisance that nobody else likes or wants it. It’s created an army of ardent writers who call me a hack because I try to write books people enjoy reading… even though the book itself was stitched into the bestseller it is by an editor, and the author is openly arbitrary about his views on the subject.

Avoiding all external influences is NOT the way to make great art, but we treat outside influence as a vile encroachment on creative purity. (For example, in a meme I just revisited about the 6 types of writers, 5 are “failed” and the only successful one, the one focused on the business, is called “greasy palms.”)

Creative businesses work like this: 99% of people fail because, unlike the greasy palms, they aren’t trying to make things that matter, things that audiences love. They are chasing passion, enjoyment and personal satisfaction. Sometimes, rarely, this accidentally becomes successful (but when it does, it’s because it aligns with what resonates with the general public, or is championed by influencers for their own gain).


1. Artists create (copying subconsciously).

2. Entrepreneurs copy on purpose: they see what works and copy it.

Like the movie says, most creatives start as artists and either fail and quit or pivot and chase trends. They make everything, copying everybody, until they get a unique and strong reaction to their work – and then they keep doing that and it becomes their thing.

There are competent and incompetent entrepreneurs.

But they tend to succeed more than the artists.

It mostly comes down to quality: can you do better with quantity than with quality? Entrepreneurs often don’t have the skills, which means paying for services or hiring ghostwriters, and skill is valuable and expensive because it takes a long time to develop. So the odds skew back towards creatives, on the assumption that writing is hard and you can either write Fast or Good, and these are polar opposites (which, frankly, is ridiculous in itself, because in every other skillset, practice improves both quality and efficiency).

But since writing is hard and doesn’t pay that much, why not chase the dream and spend years getting good? After all, scammers and spammers will never produce content good enough, in massive enough quantity, to be a real threat to real artists.

But what if you could skip all that work and generate good, fast, cheap content?

Well, for the first time ever in human history, now you can.

With new AI tools like ChatGPT, which can copy or clone anybody’s writing style or even visual art, everything is changing fast. Many big creative platforms like Kickstarter are trying to ban AI art or writing, but for the most part, the even bigger design and creative companies have embraced it rather swiftly.

Canva, stock photo sites, Adobe, not to mention Amazon, Google, Facebook and Microsoft, have gotten on board so fast that we already have working AI tools inside the desktop software we’ve been using for decades.

AI will probably replace stock photo sites altogether, removing the need for photographers, stock artists and illustrators; it will also probably replace the need to “photobash” things together in Photoshop, and very soon it’s coming for text and typography, which could make most designers obsolete as well (right now it’s still murky, and you should be careful if you want to use AI art in publishing).

There’s also been tremendous pressure against the use of AI, but unfortunately it always targets individuals rather than institutions. Some recent examples I’ve seen:

  • an author got her first book cancelled because her legit publisher purchased a stock photo from Adobe that was made with AI (not illegal, but unpopular enough to cause an outcry).
  • Jane Friedman had scammers writing crappy books and putting her name on them (“I would rather my books get pirated than this”); the issue here isn’t even necessarily AI-related, since there are always issues with piracy online and it’s very difficult to prove something has actually been written by AI (all current detection tools fail and give false positives). The problem is that Amazon is too slow to take action, and Jane had to make a huge fuss with her platform to see results – something I’ve also had to do – but for writers with no platform, the issues will most likely be ignored.
  • Stephen King came out in favor of AI (not really, he just said he doesn’t see the point of trying to stop new tech); others have argued he’s already too established for it to matter if the AI was trained on his books; or that he writes so fast he doesn’t need it.
  • A free tool that analyzed your writing got so badly trashed it had to be removed from the internet, since apparently it showed an analysis of lots of books that have never been made free and so shouldn’t really have been included. The legalities are murky to me here, since scraping data from the internet is not illegal, and Google has previously won lawsuits over scanning books for Google Books. The only point I hope to make is that it did not get removed through legal channels, and it wasn’t proven to have been actually illegal; it was cancelled due to the moral outcry – which may have been entirely justified on an ethical level – but I’m cautious of internet mobs; though of course authors always have the right to defend their work from all forms of perceived or actual piracy (unless they want to do it through legal channels, because I do not think they have a case here).
  • There have been some big lawsuits, though, for example one led by Sarah Silverman – the claim is that these big AI LLMs have used copyrighted work they got illegally (without paying) and without permission – and that much is true. The question is whether the use of these AI tools falls under fair use, since they allegedly never copy the actual words of those books.

Here is where it gets murky: if I ask ChatGPT to write a nonfiction book in the style of Malcolm Gladwell, for example, it can absolutely do that (and that’s fair use). If I ask it to write a book review of one of Malcolm’s books and include 7 great quotes or excerpts, it can also do that… but that still counts as fair use (or all book review sites would be illegal). If I kept a full copy of a book online that readers could download and access… that would not be OK, but in my understanding LLMs do not do this.

A lot of tools that don’t even use AI are being targeted; but authors are right to be fearful – for example, right now there are tons of spammy, poorly designed and poorly written books dominating Amazon categories (though I’d actually guess they aren’t using AI, or they would have been better). But it is true there is going to be a LOT more stuff out there, some of it of pretty great quality.

This is because most authors’ first attempts at book writing are very poor, almost stereotypically bad. That’s no surprise: it takes time and practice for humans to learn things, and new writers tend to make the same mistakes. 90% of self-published novels are less than mediocre and probably unreadable (which isn’t to say all self-published novels are bad! The successful ones satisfy readers and are pretty great, and their authors had the talent and tenacity to hone their craft to an appropriate level). The problem is that now unskilled writers with ambition can use AI tools to plot, write, and edit their own books and get a lot closer to above-mediocre… and we’re just getting started. In a year, AI will be able to write great books, of better quality than most authors could ever produce in their lifetimes. Yes, it’s terrifying (we went through this a year ago, with the AI art controversy).

There seem to be two camps: the artists who are anti-AI and the techbros who are for it (the former repulsed by business and money, the latter soulless cretins without a shred of artistic integrity or decency). But this is simplistic. In reality, all people are creative; everybody wants to make something great and also hopes to make some money at it, or at least be validated and appreciated for the time and effort they invested.

I’ll admit I’m probably all wrong about all of this, in that my stance has no personal benefit: I’m happy for any tools that make being creative easier and faster because I don’t believe that suffering for art is necessary or that it makes the work better (I used to, of course, but I stopped).

And I’m resigned to the fact that if it isn’t illegal, the market will dominate – and there are far more people without skill who will be using AI tools than there are talented and creative people who will eschew AI in all forms and become “performatively human” – a term I’ve been wrestling with on this blog (basically, should I use AI to write good, useful, well-written articles quickly, or continue my low-traffic, manic rambles of ADHD spirals?). The impulse is to believe that Google will prioritize the obviously human content – full of grammar and spelling mistakes, for example – but that’s not true, because Google responds to user intent and satisfaction, just like any business. If AI is producing good enough content to keep consumers satisfied, consumers will end up driving the legal outcome.

I say “resigned” because it’s not like I trust the law or the people in charge, and maybe all the laws SHOULD be changed to preserve the human race; I just have no faith that will happen, so I think the fight against AI is unrealistic… but I can understand how that reeks of self-serving apathy, and the only right *moral* stance is probably defending human creativity against the machines.

I acknowledge the messy moral quandaries, and as a philosophy (and theology!) major I learned a long time ago to resist fundamental righteousness of any kind, because I have no confidence that I’m ever universally right about the true nature of the universe. Which doesn’t make me amoral either: I believe in not hurting people, providing more good than harm, defending the weak against injustices or power imbalances… the tricky part is that AI tools are kind of like Robin Hood: they take a skillset that took time and intention to learn (one that was mostly the privilege of the leisure classes, until the Romantic revolution for writers and Klimt’s Vienna Secession for artists challenged the idea that art needed time and skill, and insisted it could just be about feeling and emotion) away from the privileged few and give it to everyone. For a lot of people, AI art is absolutely humanity-affirming, because it allows them to be creative and expressive in ways they could never have hoped to be before.

That goodness doesn’t outweigh the potential harm; of course all the AI tools should be remade without plagiarizing from existing copyrighted material, but that’s unlikely to happen because, again, it’s not illegal. Big players with influence will probably do what’s best for their own interests – like Elon Musk, for example, who organized a whole movement to Slow Down these large language models so he could secretly catch up on building his own.

But even if all the AIs are rebuilt with “Safe” material, they will still probably be just as powerful, which means they are still an absolute existential threat to artists and writers.

You absolutely own the copyright to your voice, and you should never steal, and other people should never steal from you, give your work away for free or make a profit from your words. But it’s also true that there will soon be tools that will allow people to do similar things, with much less effort, at a potentially higher quality.

Back to the movie…

[Image: still from The Lesson. Source: Variety]

The author finally has the hired tutor edit his new book and read the manuscript – but the tutor agrees only if the author reads his manuscript as well. Pretty ballsy, but the famous author humors him. The tutor gives him feedback: it could be a masterpiece, but part three, the ending, doesn’t belong; it feels like a different novel.

The author takes umbrage and cuts him down to size:

“You misunderstood; we’re not peers. You’re a proofreader. Also, your novel is trash.”
“That’s years of work.”
“Really? How frustrating. I’ll never get back the hours I spent reading it. Have you considered teaching? I regret to say you cannot write.”

Side note: this is why trading book reviews with other authors is never a good idea; most authors are sensitive, or there’s an unequal bargain where both are expecting only positive comments.

After this fight… the tutor figures out (while fixing computer issues, like tech support) that the dad actually STOLE his dead son’s book (oops, spoilers – the family tragedy is a dead son, an apparent suicide in the lake). The son’s novel was almost done, but the famous author failed to write a satisfying ending for it.

So then the tutor goes scorched earth and deletes the entire book, both the son’s original and the famous author’s edits. Deletes everything. The author freaks out, until the tutor softly recites a passage… he has a photographic memory. Now he’s the hero-savior, and starts dictating or writing the whole thing from scratch, pen to paper.

They work together and bond. But the author checks the tutor’s work and casts doubt: “I didn’t write that! You’re changing my work!”

But by the end, the author is musing out loud and the tutor is editing on the fly; then it goes from the author speaking and the tutor copying, to the tutor dictating and the author repeating after him. A deliberate obfuscation of who is the real “author” of the work.

PS: I love seeing the relationship between editors and authors, even if it’s always sensationalized as the rational, business brain (Apollo) versus the crazy, creative brain (Dionysus) – here’s an article about the movie “Genius,” which explores that theme.

“How do you want to be credited?”

“How about author?”

This demand for recognition, to be credited and referenced as a creator of the work (which at this point was two-thirds the son Felix’s and one-third the father’s, then fixed, edited and rewritten by the tutor), creates the final conflict of the film.

The tutor accuses the father of his crime: that Felix actually wrote it.

There’s a big, murderous fight, in which it’s also revealed that the tutor slept with the man’s wife (a minor side plot with no function other than to allow this final conflict to consume them both in violent rage). The author almost drowns the tutor in the same pond Felix drowned in.

Aftermath: the author himself dies shortly after; honestly, I forget the details.

The wife, it turns out, had secretly hired the tutor to write the ending of Felix’s book after learning the truth. So the wife is the final judge, and she deems his version the superior one. BUT she also erases him:

“You were never here. You signed an NDA; if you refuse, we’ll tell them an obsessive wannabe fan killed him. If I have to, I will make you the villain.” So he leaves with nothing. (But he writes his own novel based on that whole tragic series of events, and it becomes successful.)

“You have talent; you needed a story.”

This line resonated with me the most. The movie is basically about who owns creative content and whether it’s acceptable to borrow or steal; it’s also about erasing creative labor, and maybe even about the worker and gig economy (the people being paid to build all the things rarely get to add their name and own their work; they’re just selling their time, talents and skills).

This super famous author was a cheat and a hack. He also, by the way, told his son his book was terrible, with tough criticism, probably because he was intimidated or threatened by his son’s talent, and that’s why his son killed himself.

But the tutor himself had all the talent and skill and still couldn’t write a good book, because he didn’t have a story, and in most cases a good story is more important than the style or quality of the writing. I’ve had this discussion in writing circles, because most of my friends either hire editors to clean things up or use ghostwriters or co-writers who can write a fast draft, and there’s no consensus on what the *most* valuable part is – the story ideas and editing, or the words on the page. Usually, one writer has to do nearly everything (or they pay through the nose for top skill and talent to help them complete their idea to a satisfactory level).

Readers don’t really care, mostly, about the process of creating the work, as long as the final product is entertaining. There’s an argument going around that humans will pay a premium for human-created work, but I doubt that’s true (and even if it was, scammers would just swear they didn’t use AI regardless of the truth).

Plagiarism is not new, AI just makes it easier

I actually disagreed with this headline as soon as I wrote it. It sounds good, and plagiarism has been around for a while. A few years ago, when one of these crises popped up, Nora Roberts published a call to arms against shady publishing tactics:

“Enjoy it while it lasts, because it’s now my mission to turn over the rocks you hide under, then stomp you deep in the muck you breed in. I swear I’ll do whatever I can, use whatever resources, connections, clout, megaphone I have to out every damn one of you… the culture that fosters this ugly behavior has to be pulled out into the light and burned to cinders. Then we’re going to salt the freaking earth.”

Which is thrilling and compelling (though interestingly, people keep claiming Nora Roberts MUST be using ghostwriters because it’s “impossible” for someone to write so many books of good quality).

Plagiarism and stealing is bad.

But AI, by default, won’t actually reproduce phrases from another writer word-for-word, so now it’s easier than ever for these scammers to write good books of similar quality without just copy/pasting, and of course that’s what they’re going to do… but at that point, if they are using Legit Tools (soon to be inside Microsoft Word, or any common grammar checker, or Canva, or literally anywhere), are they still scammers? What differentiates someone passionately using AI tools to improve their story and craft to match their unique voice and vision from someone cranking out popular books to make money?

Then it becomes all about intention and mindset alone, and not about the creative or productive process. One of my favorite books is actually “How I Became a Famous Novelist” – it’s about someone who writes essays for students and decides to write a bestselling book, so he studies what’s popular and does it. His book is championed, until he finally reveals that even though it was a good book, since he didn’t write it from a place of Pure Artistic Truth or whatever, it’s all fake and he’s a fraud.

Where’s the MORAL truth of this argument?

It’s tricky, because these things are all true:

  • stealing or copying is icky and wrong
  • but 100% of our creativity is trained on all the things we’ve seen in our life
  • and 99% of popular, commercial media uses heavy tropes and templates
  • 99% of writers aren’t successful; 1% gets all the sales
  • THAT 1% is probably not the creative, innovative, new stuff

I made all those numbers up but I doubt they’re far off. Here’s the point:

All authors want to be completely original – and that’s a great moral impulse for passionate creativity – but originality is not what sells, because it’s not what people want. People want, as creativity studies and books have shown for decades, something that’s familiar and old but presented in a fresh new way.

Stealing any actual content and passing it off as your own work is always wrong.

Taking a decent story idea that’s been poorly done and trying to do it better gets a little tricky. If you want to become a great writer, you should be reading: everybody says this. But why – unless the point is to learn and be influenced so that you can write closer to those successful writing styles?

Trying to be original just means not knowing what you’re doing: everything is hard, you’ll get stuck at all the hurdles, and almost all the time your makeshift manuscript will not function as a novel. It will be unreadable – too different from the standards of novel-writing to be fit for public consumption. This is the state of most first-time novels. The writing is poorly edited and the story conflict is just barely there under the surface. Everything is overwritten, confusing and obtuse. It’s physically repulsive to read books like this (but also impossible to avoid – my first novel was garbage, and I received the appropriate amount of scorn and skepticism).

If you intend to write a good book people enjoy, you can skip 90% of the time, frustration and the insanely high bar of finishing a good book on the first try (nobody does this! It’s so much easier to do it on purpose than to learn it all by yourself over several decades). Anyway, that’s a moral decision, but like the movie says, most artists and authors start out that way: they try everything, see what sells, and do more of that… the public directs their course.

Alternatively, we can SEE the paths they chose BECAUSE they succeeded. We have data for how creative success works and what it looks like: we can study the successful people and see how they did things, and this is almost universally true. Van Gogh and Picasso are two quick examples: they ruthlessly copied every popular style until it became something new through synthesis.

Going back to Stephen King… he wrote a short horror story at a time when Hollywood was hungry for horror thrillers. He wrote pulp fiction in an extremely hot market. Then he kept doing that, and became probably the bestselling horror writer ever.

We always hear (and appreciate) writing advice from writers who have made it, but they usually talk about their “process” and avoid mentioning stuff like how they wrote something super popular, on purpose or by accident.

If they couldn’t sell it, if nobody wanted to buy it, they wouldn’t have gotten famous enough to be remembered. They weren’t popular because they were doing something new. They were popular because they intentionally tried to make “cutting edge” popular stuff for direct consumption. They were working artists. This is true for most famous writers and famous artists, but it’s not as inspirational as Romantic advice like Rilke’s:

“Go into yourself. Find out the reason that commands you to write; see whether it has spread its roots into the very depths of your heart; confess to yourself whether you would have to die if you were forbidden to write.”

And if this is true for humans…

Now here is where it gets weird. For the past decade, the internet has mostly been a spamfest of people constantly rewriting and rephrasing their competitors’ articles to rank better and get all the traffic. Books are not the easiest business to go into; it takes a lot of time or capital, probably both. It’s not easily accessible, because it’s fucking hard to write a good book. It’s like the hardest thing you can possibly do.

Not just the writing, but the editing and polishing, endless rounds of revision – it’s like getting educated on the first day of a new job with no training or introduction. Learning everything on the fly. BUT now we have AI writing tools that are insane. This is a very, very new thing, and it’s a big deal.

The problem is: AI writing is already better than most of us, better than 99% of writers. I know that’s a hurtful and audacious claim, but AI can write in the style of professionally published books, and it’s really good. Talent used to take a ton of time and effort to acquire.

But talent just got erased. Low-skill workers can now generate a ton of content that’s professionally written and copyedited. There are millions of great writers now. BUT… remember the other quote: you need a story. Where are new stories going to come from? This is, right now at least, the “human” thing that will become more valuable, the thing machines can’t do. BUT creativity isn’t really about being original, remember? ChatGPT can already handle something like “I need a Star Wars meets Stranger Things YA dark fantasy” and crank things out. It’s smart enough, or soon will be, that if I say “write a bestselling YA book” it will analyze all the bestsellers, read all the reviews, figure out exactly what people want, and actually write a commercial success.
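Just to make the mechanics concrete, here is a minimal sketch (my own illustration, not something from the movie or this post) of what that kind of mash-up prompt looks like when sent through the OpenAI Python SDK. The model name, prompt wording and settings are assumptions for illustration only.

    from openai import OpenAI

    # Minimal sketch of a "style mash-up" prompt like the one described above.
    # Assumes the OPENAI_API_KEY environment variable is set; the model name
    # and prompt text are hypothetical choices, not a recommendation.
    client = OpenAI()

    prompt = (
        "Pitch and outline the first chapter of a YA dark fantasy novel that "
        "blends the found-family feel of Stranger Things with the space-opera "
        "stakes of Star Wars. Keep it tropey and commercial."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; swap in whatever is available
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # a little extra randomness for "creative" output
    )

    # Print the generated pitch/outline text
    print(response.choices[0].message.content)

The point isn’t the code itself; it’s that this is the entire barrier to entry now – one prompt, a few lines, and out comes a competent, trope-aware draft.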

This has actually happened a few times, though I forget the specific details (I’ve written about it before on this blog)… basically, someone working as an editor or something decides to reverse-engineer success, either in YA fantasy or literary fiction… and they do it. They win big awards. They become bestsellers. It’s a legitimate path to success. And yet when you say someone “wrote popular fiction on purpose,” it sounds spammy.

Right now, there’s a big writers’ strike going on. No new shows. Everything halted. Celebrities can’t post about their shit.

Why? They know AI is about to replace them.

They have a point. They are threatened; so is everybody!

But here’s the thing: the big studios also know they have a point.

AI could replace them. So the big companies aren’t going to sign anything that says they can’t use AI and fire all their writers, because from a business standpoint, that’s the smartest thing to do. The writers are doing what’s best for them, the companies are as well.

*The writers’ strike is only partially about AI; it’s also about making sure writers get paid… but I think AI is a large part of the issue, since during the strike companies are probably focused on developing and using AI, firing employees and scaling back – something most companies in all industries will probably do soon.

Legal and moral issues

I sympathize with the writers and with anybody who is anti-AI right now. Everything I currently do to make money online might vanish this year, and I’ll have to find some other way to make a living. I think this could be true for 50% of the workforce… and more once humanoid robots become efficient, which I expect to happen within two years. I don’t know how society functions once AI replaces all those jobs. Yes, it’s terrifying; yes, we should “stop it” somehow (I have no faith in humans that this is actually possible, but ethically, yes, it’s the right call).

Legally, however… even though it’s dodgy as fuck for big models to train on content and then be able to create new art and writing in the style of any writer, I don’t think it’s illegal. I don’t believe it will be “stopped,” because all the big companies have already adapted very quickly (surprisingly quickly). ChatGPT can do a better job than I can on most low-effort content (the very useful but very boring stuff people are actually searching for; ONLY the entrepreneurs focus on this stuff, which is why artists and writers get no traffic).

That means, for example, my little blog on creativity doesn’t get much traffic, because I can’t be bothered to write useful articles and prefer to blog on a manic binge and rant about stuff.

With ChatGPT, I’ve been able to post more content and finish a lot more of the things that are “good ideas” but that I don’t have the time and energy for. They’re now worth creating, because they cost less effort. It’s easier now. It takes less time, skill and talent. That means it makes creativity and creative expression fun and easy, so that everyone can enjoy it – the opposite of the cliché (cringe) that real artists must suffer for their work.

That used to be kind of true, because to get good, you had to grind.*

*Quick Dark Souls reference: video games that are too hard to be enjoyable because they are brutal and difficult and demand a huge investment of time and effort; not suited for casual gamers.

Writing is hard, and most writers fail a LOT before figuring out what people will pay for (and then either doing that, or failing). So, completing a book is very satisfying because it’s so hard, and that’s why everybody wants to write a book but few people do.

Well, it just got easier, and not like 10% easier, like 1000%.

That means everybody might start writing books. Most of them will be artists: they won’t make money or profit. But some people, the 1%, will intentionally use AI to create more of Their Best Work. Most of that will still be crap, made by people without the knowledge and expertise to pull it off. But a lot of it will be mediocre – and mediocre is actually pretty good: as long as it’s readable enough that people keep reading, it can make money.

I feel like the dramatic close to this article would be something like, “Is it OK to use AI tools that steal from other creatives, even if it’s not illegal?” And then the answer would be: only immoral people would choose to do that.

That means if all the moral creatives refuse to adapt and only the entrepreneurs do, then the entrepreneurs will be making all of the content, assuming they can put out 10x as much. Ultimately, the market decides, and if the books are good enough to be enjoyed, then they are providing the experience readers paid for and satisfying customers.

I dislike the whole dichotomy of this discussion, and I’m a little too practical.

  • if everybody loves using a new tool to crank out their creative ideas, and
  • these tools are built into everyday software to make life easier, and
  • the content created achieves its goal (it satisfies the reader), and
  • the companies are not legally required to retrain their models,

Then… resistance to AI tools as a moral choice may be an ineffectual way to positively influence the world culture with your brilliant ideas.

It’s not that I dispute that it shouldn’t be used (given all the reflections about how AI will end society); it’s that I don’t doubt it absolutely will be used, because most people don’t form their entire identity around their personal struggle with creating art, as Originals trying to Do Something New.

Most people just do shit because they have to, or for fun, and most people will do the things they need or want to do in the easiest and most efficient manner (less work, best quality). And not just because it’s easy to use, but because the output is objectively better; it’s brilliant for writing personal statement essays, for example.

The majority of society Doing Things will use AI, because people have stuff they want or need to do but a limited amount of time and energy to get it done. AI will create more and more of the content and media we consume. AI will do most of the work to get us to see or buy that stuff, too. People will no longer be needed and will have to find other ways to make money.

We don’t know what that looks like yet.

Originality is not valuable. There are no new ideas. The movie positioned the author (yes, we’re still talking about The Lesson!) as the villain and killed him at the end. The harsh critic who was a hack tried to take what didn’t belong to him and was punished for it. It’s a moral movie, but it’s not a realistic one.

The book I mentioned earlier, “How I Became a Famous Novelist,” makes the same moral judgment: the protagonist who “faked” a bestselling book by crafting one deliberately, even though he didn’t Believe In It, is eventually chastised and even has a dark epiphany moment where he seems to recognize he’s a fake and a fraud.

Kurt Vonnegut’s ice-nine as an AI metaphor

I’ll leave you with a final visual. Maybe you noticed I swear a few times in this post, something I don’t always do because it alienates my audience. But I’m an adult and I swear. I’m specifically NOT editing myself, because I’m now competing with AI bots and I need to prove my humanity, so I’m making grammar and spelling mistakes (on purpose, I swear!) and casually swearing, because I need BAD content to prove I’m an original creator and not a low-quality clone.

“People will always prefer the human,” people confidently say… but it’s not true. Most of my traffic for this site, after 10 years of blogging, comes from long infodumps I made in ChatGPT with no effort. People may enjoy my writing (doubtful), but they aren’t here for entertainment. People don’t have time to read random rants and musings from some guy on the internet. But they have time to search for answers and solutions. I need to have complex answers to complex situations, or I’ll be replaced. I’m doomed as an editor (probably), but I’ve got another year before I’m replaced as a graphic designer. I can’t compete with ChatGPT on useful informational content; it’s better and smarter.

All I have is my humanity… screaming at the universe, thumbing my nose at the cruel fates in the sky, like the end of the ice-nine story, and like a common theme in gothic literature – Manfred, Moby-Dick, Paradise Lost – the mythical warning about the hubris of technology and personal ability, which was first a great sin but became our new obsession (constant self-improvement without limit or end, a restless Faustian ambition and sense of drive).

Using technology to do something better and faster has always been presented as a moral failing, one that robs the Divine Plan that steers humans towards Truth through pain and suffering (or at least hard work). Knowing your limitations instead of Doing Hard Things was a religious injunction that lost strength at the end of the 19th century (though the battles were fought!).

Ice-nine is the very bad technological idea that will probably end the world, and eventually it does; the last human on earth can do nothing more than commit suicide by embracing it, giving up his humanity and destroying his body, turning himself into an eternal statue of refusal and opposition. AI is the evil that defines us as it destroys us.

And while it’s true that conversations about technology are pervasive in the history of literature, it’s also true that AI is unlike any technological advancement we have ever seen – even though even Hephaestus, the god of the forge, had a literal army of robots to serve him, and the ethics of Icarus’ fatal flight is a common lesson about the dangers of striving for more than we are capable of handling.

Nuclear power, for example, allowed humanity to unleash a kind of destruction we were previously incapable of, but it has also sustained the growing energy needs of many countries for the last 50 years. AI, I hope, will bring lots of new benefits (like making healthcare cheaper and more effective for all) despite its potential harm… but that doesn’t mean it won’t be a brutal adjustment.

I’m swearing in this post so Google knows I’m a real live boy; I’m being performatively human for the robots; I’m writing this long, messy article so Google knows I’m still just a messed-up human posting crap to nobody. But here’s the thing… maybe Google doesn’t care either (trust me, they don’t). They’ll send traffic to whoever has the best, most useful content, judged by time on page and reader engagement… unless someone pays them literally anything to show up before the free content.

They’re a business. They just have to be not so annoying that people stop using them, while still fulfilling their core duty. Google is sending the traffic to my very informative, detailed, high-search-volume, AI-written content, because it’s all about search demand (what people actually want) and not really about what I create or how I created it. What will probably happen is that everybody will make all their content twice as good. No more sloppy, short, crappy respun articles. Tons of insanely valuable and well-made content.

It will be harder and harder for quality to stand out, because quality is a style and type of writing that can be easily replicated. And raw/messy sweary content can be churned out too – if you want that. Humans are basically unessential to the commerce of life.

But this is an observation, not a warning.

One of my other favorite quotes is from a novel called Vampire Girl – I’m paraphrasing, but it’s something like, “If you want something done right, give it to a lazy person.” I felt seen, because I don’t think laziness is a moral failing: it’s just a refusal to do things the way they’ve always been done, the hard way, in favor of devising a solution or system that gets the same result with less effort.

This has been my driving, lifetime ambition. It’s why I made a novel outline template that makes it so much easier to write good novels… and a LOT of people have been very, very appreciative of my resources. But I guess AI skates too close to what I, as a human, can get away with. (Though I did notice recently that I’ve been accused of claiming a universal “hero’s journey” plot formula is necessary for all commercial books – a criticism that it’s a Western-centric approach to storytelling that even implies violence, since Joseph Campbell basically “stole” all the stories from foreign cultures and reduced them to an essential framework to suit his purpose – which is an interesting and worthwhile note to include in the discussion.)

I can never find the other quote I always refer to; I found it once when doing my PhD research. It was writing advice from some writer a few centuries ago, who said something like: if you want to write a successful story, just take any of the classics and change the names and locations. Just keep the dramatic story events. It’s not bad advice; it works better than most writing advice. But the questions that arise – whether originality is the KEY to successful creative work, or whether it’s entirely inessential because there are no new ideas – were complex before AI. And while maybe it was OK for humans to do this, and while maybe it’s OK for humans to try to make things easier and get more done, perhaps something on the scale of AI, which allows the same thing to be done at a previously impossible level of quality and scale… yes, it warrants a discussion, an investigation, a ban.

We are training or have trained our own replacements.

But we can also, all of us, do more of the things we want to do (or hate doing but have to do anyway) – and outsource or farm out the low-effort, banal maintenance work, like hiring a full-time personal assistant. That means more potential and possibility, not to mention confidence and enthusiasm, as we save more of our energy for the real work we Enjoy.

I don’t have any answers, and even fewer opinions, only a kind of Awe, because these are cataclysmic events, but I do think – I really do – that creatives should be paying attention, and that avoidance or refusal is not a beneficial strategy.

“Only gods come out of the machine. Ghosts stay in them.”

This last line (from the movie) is powerful but enigmatic. It probably refers to deus ex machina, the old Latin phrase for when an impossible story conflict is “resolved” forcibly, in an unsatisfying way, through an unlikely, sudden event. In ancient Greek plays by Euripides and Aeschylus (one of whom probably invented the device), a literal god would float down onto the stage from a crane, end the conflict, and prescribe a moral remedy.

Gods of the machine are integral devices and structures that are intended to be there.
Ghosts are the accidental things, unintended and harder to pin down.

In our context, ghosts are the writers whose books have been included without permission inside large language models – but also all the writers whose books you’ve ever read, who have influenced you as a person and a writer – the people you couldn’t give credit to if you tried, because you may have no recollection or awareness that you were influenced by them.

I think the famous author in the movie probably meant something like: ghosts (ghostwriters) like the tutor can only copy and remember everything; they can learn to write well, but they follow the rules, and they aren’t Great Writers, because skill or talent isn’t the essential thing. And Great Writers (like himself) don’t care about morals or ethics; they break the rules to produce the Greatest Work, and it’s all about the art.

No doubt you will have your own feelings, opinions and beliefs about AI writing tools in general, and they are valid. But I would consider the gods and ghosts in your own writing.