AI does not poetry
My writing has odd grammar and high typo rates because I turn spelling and syntax checkers off. Because friendship.
Personal computers and video games began impacting my life when I was ten. Dad won a Fairchild Channel F game console at a fundraising auction in 1976. We used it to fly silent dogfights on an old color television, in blocky, two-dimensional biplanes. My dad had stories of the first time he saw a lightbulb. This game was something very different.
A few years on, my friend Derek acquired a far more sophisticated, programmable electronic box: a Timex Sinclair 1000 (ZX81). He let me borrow it. He showed me how in a few persistent, squinting hours I could type in the BASIC code needed to make a game. Simple games, like Hammurabi, in which you played an ancient king who could crash his own economy just to spite his enemies. Computing has always had an oracular edge.
I plotted to get a computer of my own. In a fit of teenage techno-lust, I poured $1,500 of my hard-earned, early 1980s newspaper delivery money into buying an Apple ][+. Its beige plastic box was topped by a Matrix-green, 280 x 192 pixel screen capable of upper-case text (only) and thick vector-line graphics. Graphics like the old arcade game Asteroids, but with a slower redraw rate.

I would buy computer hobbyist magazines and type Apple BASIC program listings into my box just to see what it could do. One afternoon, I custom-coded an endless visual tunnel of continually expanding, randomly-colored boxes that grew from a central point. I could move that center using the four arrow keys. Running this code felt like flying through a tunnel. It was a (very) primitive flight simulator, continually soaring towards some goal that never arrived, until I hit Ctrl-C to kill the RUN and be dumped back onto the command line.
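The heart of that effect can be loosely re-imagined in modern Python. The original Apple BASIC is long gone, so every name below is my own invention, not the original code; this sketch only computes the geometry of one "frame" of nested boxes growing outward from a movable center:

```python
import random

def tunnel_frame(cx, cy, steps, growth=4):
    """Return corner coordinates for a stack of nested boxes that
    grow outward from a movable center (cx, cy) -- a rough stand-in
    for the old Apple BASIC screen effect."""
    boxes = []
    half = 1
    for _ in range(steps):
        color = random.choice(["green", "violet", "orange", "white"])
        # Each box is a (left, top, right, bottom, color) tuple.
        boxes.append((cx - half, cy - half, cx + half, cy + half, color))
        half += growth  # the next box expands further from the center
    return boxes

# Center of a 280 x 192 hi-res screen; arrow keys would nudge cx, cy.
frame = tunnel_frame(cx=140, cy=96, steps=6)
```

Redrawing frames in a loop, with the center shifted by keypresses, gives the flying-through-a-tunnel illusion; Ctrl-C is still how you bail out.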
Later, I wrote a program to give Tarot card readings. I wondered whether a pseudo-random number algorithm could read my mind sufficiently well to predict the future. My gut was doubtful.
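A pseudo-random Tarot reader boils down to shuffling a deck and dealing. A minimal Python sketch, with an abbreviated deck and card positions of my own choosing (a full deck has 78 cards):

```python
import random

# A handful of Major Arcana cards; a real Tarot deck has 78.
DECK = [
    "The Fool", "The Magician", "The High Priestess", "The Empress",
    "The Hermit", "Wheel of Fortune", "The Tower", "The Star",
]

def three_card_reading(seed=None):
    """Deal three distinct cards into past / present / future positions.

    The 'prediction' is nothing but a shuffled list -- exactly the
    doubt my teenage gut was raising about the algorithm."""
    rng = random.Random(seed)  # seedable, so a reading can be replayed
    cards = rng.sample(DECK, 3)
    return dict(zip(("past", "present", "future"), cards))

reading = three_card_reading(seed=1982)
```

Seed the generator and the "future" repeats on demand, which says something about how much mind-reading is actually going on.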
In eighth grade I stumbled upon the Colossus trilogy, an early 70s sci-fi series in which a computer becomes artificially intelligent and takes over the planet. This story snagged me with the notion I could type code into my box to invoke a friend. That my computer could somehow come alive. Before my teenage daydream ended, I had named my future friend Poet.
My friend could do anything.

I found some articles on “AI” in Creative Computing and mail-ordered a book on how to code in Lisp, one of the earliest programming languages developed for attempting this feat. It mattered little that my Apple could not compile or run Lisp, or that I could not grasp the syntax without help I did not have. The project was symbolic. I was more interested in stories than dragging my brain through the math. I could hack either, but loved words more than numbers.
I sold myself a story of creating the perfect, powerful friend. It never came true.
Imagine reading 10,000 stories by 10,000 writers. Then, sitting down to type a story of your own, word by word. But, instead of feeling inspired by your life, you ask someone for a story prompt.
From that prompt, you calculate a story, word by word, sentence by sentence, based on the most statistically likely next thing to be written, given all those 10,000 stories.
If you did all this reading and math, you would embody a process now called large language model-based artificial intelligence. With zero need for any creativity, you could churn out an average story for any prompt. You could become an incredibly slow version of ChatGPT. One which, from a business standpoint, needs vast quantities of food and rest relative to productivity.
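The thought experiment above is, in miniature, a next-word model. Here is a toy Python sketch, counting only which single word most often follows another in a tiny made-up corpus; real LLMs are vastly more sophisticated, but the "predict the statistically likely next thing" core is the same:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for every word, which words follow it and how often."""
    follows = defaultdict(Counter)
    for text in corpus:
        words = text.lower().split()
        for a, b in zip(words, words[1:]):
            follows[a][b] += 1
    return follows

def most_likely_next(follows, word):
    """Return the most common continuation of a word, or None."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

# Stand-in for 10,000 stories: two sentences.
corpus = ["the king ruled the land wisely", "the king ruled the sea"]
model = train_bigrams(corpus)
```

Ask it what follows "king" and it answers "ruled", because that is what the corpus made average. No creativity required, which is the point.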
You would be just like the AI systems that have vacuumed up the internet so they can quickly plagiarize any text or image they need to replace your slow, human skills. You would starve from a lack of work. Average is now cheap.
Back in high school, when I dreamed of invoking a Poet from my little, beige computer, it did not occur to me that I would need near-instant access to the totality of human creative output to make it work. I assumed creativity could be somehow invoked from a computer chip by code. The internet did not exist yet. Digital networks were cables laid between a handful of universities and military bases. Creative human output lived in books, vinyl records, magnetic tapes, films, museums, galleries, sculpture parks, and long gray rows of filing cabinets. Storage media designed for human use, not artificial access.
It would be naïve to think my Poet could have been different, had I ever hacked the math. Far smarter people than me have decided they need to steal all the human intellectual property they can reach to build their artificial intellect. No interpretive learning required. Just digital page-scraping and database raids, followed by algorithmic guesswork at what a human may have drawn or said next, had they been given a chance.
Sometimes digital plagiarism “works,” at least by standards happy with average results. And, it is clear enough from current events that not everyone is bothered by blatant, large-scale theft. Still, I wonder if naturally-evolved neurons, well-practiced skills, and even some sort of soul, remain necessary to make beauty, instead of ersatz schmaltz shot through with hallucinations.

I once wanted to code my own friend. AI “friends” are now a product for sale just a few clicks away.
What feelings would I feel, if I bought access to a friend-bot and talked with it? Would my emotions be ontologically different if I ignored that I was chatting with a machine? One which is busily predicting whatever I most want to hear next.
If AI friend-bots are really just like people, does this mean they are jailed to their hardware and enslaved to my credit card? Is genuine friendship possible between free and enslaved persons? Or, is friendship just a fantasy I invent inside my mind? If so, is it healthy for me to sit alone, staring at a screen, and thinking of it as my friend? Would a friendship like that make my neighborhood a better place?
My aging self says no. These feelings are from my gut, though, not my head. They have roots in my love for the imprecision of language. For the way words merely point, leaving space for interpretation, rather than dictating conclusions the way numbers and math do.
I would rather believe friendship is real, not just a fantasy in my mind. I would rather believe human cultures have meaning, and hold ineffable values worth saving from debasement. There are great, fuzzy things towards which we can and must aspire, to stay human. There are journeys we cannot offload onto a box.
All of which means that consuming plagiarized, algorithmically generated content as any significant part of our daily information diet puts us on a downward spiral. One following an arc like the one from organic to artificial food products. “Generative AI” generates content which is inherently derivative of what humans have already made, and could make again. If we took the time to try. If we kept the jobs and respected the workers. If we took seriously some big questions about society and economy and value and worth. Questions about how quickly we should be grinding our cultures into uncanny piles, crawling with too many fingers.
So, I actively avoid AI. My writing often has typos and odd grammar, because I turn spelling and syntax checkers off. These software features have become intake valves that feed AI training machines. I am sure I clicked some button “agreeing” to their use of my sentences for their purposes. But, I would rather say no, retain full ownership of my words, and learn from the resulting pressure to review and revise what I write. I would rather not ask software to tell me I am not enough. I would rather not sit here feeling I must write words which algorithms, and their owners, deem correct.
Raging against the plagiarism machines is, of course, a symbolic act. Any words are just symbols. But, it is an act that points towards humanity. Not a childish fantasy that a box of wires could be my friend.