I sat in a garage and invented the future because artists lead and hacks ask for a show of hands.

Steve Jobs (2015)

boy finds love, pt. 1

My life changed the night my dad brought home our family’s first desktop PC. Inside the huge cardboard boxes were a CRT monitor and a Micron computer with a 1 GB hard drive and Windows 95. I was smitten with the machine from the moment the crackling screen was first turned on. I spent the rest of the night drawing in Microsoft Paint and playing Minesweeper, though I did not even know the rules of the game.

Soon after, I picked up a book on HTML from my elementary school’s Scholastic book fair. A world of possibilities opened up before me. CSS was a new invention and a complete unknown to me; the <center> tag was a critical tool. <table>-based layouts ruled the web. Discovering the <marquee> tag was like Oppenheimer unleashing nuclear weapons. Good design and separation of concerns were irrelevant: I could create my own websites. The computer had become a creative tool for me.

Learning HTML set me on a lifelong path. I taught myself C++ in order to develop my own video games. In high school, I picked up Java. Originally, I went to college for biomedical engineering, but I would eventually realize I wanted a degree in computer science instead. The house I live in is paid for by the career I have built writing code.

Programming is still my passion. My nights and weekends continue to be spent learning and building with the computer. Why do I still love coding so much? Because it is literally fucking magic. Pick the right words and put them in the correct order, and you can change the world. You can communicate across continents, solve impossible math problems, or build entire virtual lands.

However, that does not mean software development is easy. Fixing bugs or converting vague user requirements into functional features is usually a struggle. But that fight is a rewarding one.

No TV show understood this better than Halt and Catch Fire. At first, the show seems like a Mad Men ripoff, a period piece set in the ’80s computer revolution. It appears to be a fictionalized version of the rise of Apple. Joe MacMillan is the fill-in for Steve Jobs, an abrasive but bold visionary aiming to build the world’s greatest personal computer. The Steve Wozniak role is split between the older, hardware-focused Gordon Clark and the younger software genius Cameron Howe. The first season seems to be a standard story of creatives struggling to build something great until we get to the last episode. The project fails. The Apple of this fictional world turns out to be the real Apple. In a tiny hotel room, Joe sees his dream brought to life by the hands of others in the form of the Macintosh.

In the subsequent seasons, the main characters try to build groundbreaking products—online gaming, social media, an internet service provider, antivirus software, and a web search engine. None of these businesses pan out. Halt and Catch Fire is fundamentally about failure.

In the series finale, Cameron eats at a diner with Donna, Gordon’s ex-wife and Cam’s former business partner. Donna gets an idea, and we see the creative spark catch the tinder. The idea is never revealed, nor does it matter. To be more accurate, Halt and Catch Fire is about the joy of creating regardless of whether the work is successful.

This is a valuable lesson to learn. Technology promises so much but rarely lives up to the dream. You have to love the creative process because the end result can be such a letdown.

boy finds love, pt. 2

In school, math and science were always my favorite subjects. In comparison, English class seemed pointless. Decoding the symbolism in the short story of the week was an exercise in making shit up.

Worse yet was the grading system. Many teachers cared less about the content of your essay and more about its length. Like all other students, I would use every trick to stretch my page-and-a-half-long paper to two: restating sentences, replacing short words with polysyllabic ones, and tweaking the white space and font sizes. Under such a system, “good writing” (or at least what gets you an A) is divorced from any concept of quality.

That changed in my junior year of high school. My AP English teacher, Mrs. McGown, was one of the greatest instructors in my life. Two aspects of her class made it work so well. The first was the difficulty. Mrs. McGown’s class was the toughest on my schedule. For the first time, trying my best and only earning a B was a genuine possibility, but that forced me to rise to the challenge. Unsurprisingly, I am the sort of person who needs to be called out on my laziness from time to time.

The second key contributor was that Mrs. McGown made us read On Writing Well by William Zinsser during our first week. Zinsser’s work still motivates me every time I pick it up, but at age 16, the book was revelatory. His words on simplicity are especially relevant to high schoolers:

Clutter is the disease of American writing. We are a society strangling in unnecessary words, circular constructions, pompous frills, and meaningless jargon…But the secret of good writing is to strip every sentence to its cleanest components. Every word that serves no function, every long word that could be a short word, every adverb that carries the same meaning that’s already in the verb, every passive construction that leaves the reader unsure of who is doing what—these are the thousand and one adulterants that weaken the strength of a sentence. And they usually occur in proportion to education and rank.

For the first time, I could see that good writing was an objective reality, not some subjective taste. I started taking pride in my work. I would submit a paper when I was done exploring the ideas rather than needlessly stretching it out. Even if it meant a lower grade, I refused to pollute my writing with unnecessary complexity and repetition. That tradeoff rarely came up, though: when I cared about my work, I usually had far more to write than the assignment required.

My interest in nonfiction exploded. I devoured authors like Chuck Klosterman, Bill Simmons, David Foster Wallace, and Anthony Bourdain. I wanted to develop my own style that was an amalgamation of them all.

I started writing my own pop culture essays in my free time. That tradition continues today with me gleefully writing 4,000 words on Taylor Swift, Kareem Abdul-Jabbar, and Neon Genesis Evangelion. All those years in English class turned out to be training for the hobby I still carry on.

And fortunately, nothing would ever come around that would devalue the written word…

the four horsemen

I signed up for Facebook my sophomore year of high school. At the time, my primary purpose was to post obtuse and obnoxious status updates. I never thought it would be the downfall of western democracy.

I have not touched the site in well over a decade, but its influence remains. While MySpace may have been the first social media site with huge appeal, Facebook took that to a whole other level. Your grandma got onto Facebook. Even if the platform no longer has any appeal to Gen Z and Alpha, Facebook was the harbinger of every social media site that followed and all of their problems—Twitter, Instagram, Snapchat, and so many more. And Facebook’s parent company, Meta, is still one of the most odious technology companies around.

The biggest problem with social media is its echo chamber. All these apps peddle extremism. Whether you agree or disagree with the material, extreme content and opinions drive more engagement and have far more reach than that picture you took of today’s lunch. Naturally, the market took this to its ultimate extreme, where facts and truth no longer matter. Social media has incentivized the most outlandish bullshit and lies—filtered and Photoshopped Instagrams, politically extreme tweets, and fake news on Facebook.

But as the TV show Chernobyl warns, “What is the cost of lies? It’s not that we’ll mistake them for the truth. The real danger is that if we hear enough lies, then we no longer recognize the truth at all.” A world without Facebook is a world where Donald Trump was never president. Mark Zuckerberg’s legacy is lies.

I used to think Web 2.0 companies were cool. I don’t think that any more. And yet Facebook and social media are only the first horseman of the apocalypse.

The second horseman, and the stupidest, is cryptocurrency. Did the mysterious Satoshi Nakamoto know the horror he unleashed upon the world when he developed Bitcoin? Certainly, the algorithms and design behind cryptocurrencies and the blockchain are interesting, but the utility is limited. Unless you are selling drugs or other illicit goods, the benefits of centralized financial authorities outweigh the positives of a distributed system. If I forget my password, I like that I can prove my identity to Chase and get back into my account. I prefer having fraud protection on my accounts to having no recourse when someone hacks my bitcoin wallet. Worst of all, most cryptocurrencies are limited in how frequently transactions can occur, making mass adoption impossible. And the cost for all these negatives is an insane amount of wasted power at a time when we are already starting to see the tipping point of climate change.
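
To put that throughput ceiling in perspective, here is a rough back-of-envelope sketch in Python; the block size, block interval, and average transaction size are ballpark assumptions, not exact protocol figures.

# Back-of-envelope Bitcoin throughput, using ballpark numbers.
BLOCK_SIZE_BYTES = 1_000_000      # roughly 1 MB per block
BLOCK_INTERVAL_SECONDS = 10 * 60  # a new block about every ten minutes
AVG_TX_SIZE_BYTES = 400           # a typical transaction is a few hundred bytes
tx_per_block = BLOCK_SIZE_BYTES / AVG_TX_SIZE_BYTES    # ~2,500 transactions
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS  # ~4 per second
print(f"~{tx_per_second:.0f} transactions per second, for the entire planet")
# Card networks routinely process thousands of transactions per second.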

Some people will push back on all of this and claim that cryptocurrency will eventually replace fiat currency. The truth, though, is that those individuals do not care about bitcoin being a viable medium of exchange. Cryptobros care because it is a speculative asset. Sure, we all wish we had a few bitcoin we mined back in the early 2010s, but getting rich off crypto is one of the most embarrassing, unethical ways of making your money. Next time just tell people you made your millions in Funko Pops and Pokémon cards.

The next horseman is TikTok. While the app may be another child of the first horseman, its negative impact earns TikTok its own spot. The fact that all that data goes to China (and possibly soon Oracle) is not even its biggest problem. Short-form video is just pure algorithmic crack.

Whenever I open the app, I can feel myself getting sucked in. For the most part, scrolling on TikTok is not even enjoyable; it is just a way to burn time by the fistful. All that short-form content does is destroy our attention spans. Watching a movie now seems like an insurmountable task in the age of brain rot. And those shredded attention spans have left us wide open to the fourth horseman.

you took everything from me

Arthur C. Clarke’s third law states, “Any sufficiently advanced technology is indistinguishable from magic.” This holds true even when the technical details are understood, as Paul Bradley Carr argues:

The cut and restored rope is an ancient classic of magic, and there’s really only one way to do it. And yet the way [David] Williamson employed that method was so good—so mind-blowingly knee-shakingly good—that it (literally) made the hairs on the back of my neck stand up. 

And so it should be with technology. I remember—just as vividly as I remember David Williamson’s rope trick—the first time I bought a laptop with WiFi…I knew exactly what was going on: there was the WiFi card jutting from the PCMCIA port and I’d just spent ten minutes configuring the wireless hub. And yet…the experience of watching full-screen video on my laptop without wires was as near to pure magic as watching that piece of rope mend itself two feet from my eyes.

Seeing ChatGPT in action for the first time certainly fell into this magical category. Watching the words slowly appear on screen felt like the dawn of a new age, one where mankind had perhaps created an intelligence smarter than itself. Unlike the entirety of Web 3.0, large language models were genuinely impressive.

Of course, the shine quickly wore off. Anyone with enough experience using AI has seen the hallucinations and boring tone that plague LLM content. Worst of all though has been watching every company pour billions of dollars into AI projects that they will not shut up about. We do not need generative AI to come up with new Oreo flavors.

As a software developer, I hear about AI at virtually every meeting and conference I attend. Sitting through yet another AI demo is infuriating:

“Isn’t it impressive that GitHub Copilot generated 70% of the functionality we asked for with only 10,000 lines of changes? (Just ignore those handful of major bugs and security vulnerabilities.)”

No, it is not.

Nonetheless, the writing is on the wall: hundreds of thousands of us may become developers who do not code. Particularly at non-tech companies, developers often wear multiple hats: full stack developer plus SRE plus QA plus system/network/application admin plus data analyst/engineer/scientist plus project manager. The rise of AI tools is only going to make this problem worse and raise business expectations. Eventually, the technical debt will grow too large, and the product will collapse like a house of cards. Executives will ask why their companies cannot produce working software any longer, but they will not be at a competitive disadvantage. Nobody else will be able to either.

While AI’s impact on code will be immense, its effects on the regular written word may be even worse. Even before the advent of generative AI, search engines had already been taken over by SEO blog spam written by humans. AI is making this a hundred times worse.

Good writing will still exist in the age of AI slop, but you may never find it when the ratio of good-to-bad writing becomes so skewed. This is more than just a numbers game though. Discoverability is dead. AI overviews for search results on Google and Bing mean far fewer clicks on links. If you are reading this essay, then it is unlikely you got here from a search engine. The promise of the internet and independent digital publishing is gone.

As a programmer and writer, I have seen artificial intelligence significantly impact both of my mediums of choice. As Bender says, “This is the worst kind of discrimination, the kind against me.” AI is ruining my life, but the true cost is far deeper than jobs and hobbies. This is about the soul of art itself.

i don’t even know who you are

David Foster Wallace once warned:

The technology is just gonna get better and better and better and better. And it’s gonna get easier and easier, and more and more convenient, and more and more pleasurable, to be alone with images on a screen, given to us by people who do not love us but want our money. Which is all right. In low doses, right? But if that’s the basic main staple of your diet, you’re gonna die. In a meaningful way, you’re going to die.

DFW’s words were primarily about advertising and cheap media. Even he could not predict how bad it would get with the arrival of AI girlfriends.

In the universe of Dune, the Butlerian Jihad occurs between humans and sentient machines. After humanity’s victory, thinking machines are outlawed. Maybe those guys were on to something. At least they did not have to deal with AI slop.

It is important to note, though, that LLMs are not truly thinking. They are probability machines that are really good at guessing which word should come next in a response. But that does not mean they are not horrifying. Soulless things always are. Film Crit Hulk’s essay “A.I.: The Apocalypse of Intent” covers this well, but the most poetic description of AI inside that piece comes from @joles.bsky.social:

there is a monster in the forest and it speaks with a thousand voices. it will answer any question you pose it, it will offer insight to any idea. it will help you, it will thank you. it will never bid you leave. it will even tell you of the darkest arts, if you know precisely how to ask.

it feels no joy and no sorrow, it knows no right and no wrong. it knows not truth from lie, though it speaks them all the same.

it offers its services freely to any passerby, and many will tell you they find great value in its conversation. ‘you simply must visit the monster—i always just ask the monster.’

there are those who know these forests well; they will tell you that freely offered doesn’t mean it has no price.

for when the next traveler passes by, the monster speaks with a thousand and one voices. and when you dream you see the monster; the monster wears your face.
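
To make the “probability machine” point concrete, here is a toy sketch of the guess-the-next-word loop in Python. The words and their probabilities are made up for illustration; a real model derives them from billions of parameters, but the mechanism is the same.

import random
# Made-up next-word probabilities standing in for what a real model
# computes from billions of parameters.
next_word_probs = {
    "magic": 0.45,
    "math": 0.30,
    "a lie": 0.15,
    "art": 0.10,
}
prompt = "Large language models are"
next_word = random.choices(
    population=list(next_word_probs),
    weights=list(next_word_probs.values()),
)[0]
print(prompt, next_word)
# Append the guess to the prompt, guess again, and repeat: fluent text
# with no understanding behind it.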

In 1983, Electronic Arts published a bold ad asking, “Can a computer make you cry?” As an aspiring game developer, I used to emphatically answer in the affirmative. Nowadays I’m not so sure. When I searched for that EA advertisement on Google, the AI overview told me, “Yes, a computer can make a person cry.” I am sure those damned machines are trying to trick me.

There is this TikTok account I stumbled onto called irlfinalfantasy. In these short-form videos, real-life versions of the cast of Final Fantasy VII are shot in a variety of 90s-aesthetic high school situations. Watching them gives me a sense of nostalgia for a world that never existed. They strike an emotional chord on multiple levels: my love of video games and Final Fantasy, a simpler time when the dream of the 90s was still alive, and nights from high school long past. Of course, these videos are AI-generated.

The work is genuinely impressive. Maybe it does say something that the “creator”/AI-prompter was able to generate something that connected with me. But the AI itself does not have a clue about that. And ceding our artistic intent to an unfeeling robot is the same as casting those feelings into a black hole.

So, no. A human can make you cry. A memory can make you cry. But silicon by itself cannot. Resist the urge to use AI in your creative pursuits. Computers don’t have the memories of those late nights playing games and driving around a suburb you were ready to escape from. They don’t have your recollections of heartbreak and love. And they will never have those precious moments lost in time. The difference between you and the machine is that you’re going to carry that weight.