Modern AI Models: Shiny, Addictive Things
I love shiny things as much as the next person. Yes, I watched this year’s I/O and Microsoft Build events. Yes, Gemini Live is flashing on my Pixel phone. (Is that what it’s called? I mean the live chat functionality you get within the Gemini app. I am ignorant regarding product names, I know.)
The problem with shiny things is they make you forget about the dirt under your nails. Take AI. It’s the latest pickaxe in the gold rush of progress, promising to dig up answers faster than you can say “algorithmic disruption.” But here’s the rub: not every glint in the mud is gold. Some of it’s just pyrite — fool’s gold — and if you let a machine convince you otherwise, you’ll end up poorer for it.
Imagine a world where every decision is outsourced to a black box that hums quietly and spits out probabilities. Sounds efficient, right? Until the box starts recommending pineapple on pizza because “37.6% of users prefer tropical toppings,” or worse, greenlights a stock trade because “historical trends suggest Tuesdays are lucky.” Machines don’t taste pizza. They don’t fear bankruptcy. They crunch numbers. That’s it. And if you let them do more than that, you’re not just handing over your wallet — you’re donating your spine.
AI’s greatest trick is making complexity look simple. Feed it data, and it’ll weave a story so convincing you’ll forget it’s just counting stitches. Take healthcare. Algorithms now diagnose rashes and recommend treatments, but ask one why a patient’s knee aches after rain, and it’ll shrug in binary. A doctor hears “rain” and thinks arthritis. A machine thinks “precipitation: 0.73 correlation with joint pain.” Both might be right, but only one understands the weight of a storm cloud.
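For what it’s worth, that number is the machine’s whole “understanding” of the storm cloud. Here’s a minimal sketch with completely made-up rain and pain records (not any real diagnostic pipeline), computing a Pearson correlation by hand:

```python
# Fabricated-for-illustration data: did it rain, did the patient report pain?
rain = [0, 1, 0, 1, 1, 0, 1, 0]
pain = [0, 1, 0, 1, 0, 0, 1, 0]

# Pearson correlation, written out longhand.
n = len(rain)
mean_r, mean_p = sum(rain) / n, sum(pain) / n
cov = sum((r - mean_r) * (p - mean_p) for r, p in zip(rain, pain)) / n
std_r = (sum((r - mean_r) ** 2 for r in rain) / n) ** 0.5
std_p = (sum((p - mean_p) ** 2 for p in pain) / n) ** 0.5
correlation = cov / (std_r * std_p)

print(round(correlation, 2))  # prints 0.77 — a number, not an ache
```

That single float is everything the model “knows” about weather and knees: no joints, no barometric pressure, no patient, just covariance over whatever columns it was handed.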
The irony is, AI’s real power lies in not being human. It’s a mirror, not a mind. Use it to reflect your blind spots, not to replace your eyes. Think of it as a sparring partner: it’ll throw punches of data, but you’ve got to duck, weave, and decide when to swing back.
Take writing. Tools like ChatGPT can draft a mean email, but let them author your novel, and you’ll end up with a protagonist who solves conflicts by reciting Excel shortcuts. Humor, tragedy, the messy beauty of a plot twist? That’s human soil. AI can till it, but it can’t plant the seeds. In 2022, a publisher released an AI-generated thriller. Reviews called it “competent” and “as memorable as a password.”
The best collaborations are asymmetrical. NASA uses AI to predict equipment failures but lets engineers decide when to ground a shuttle. Spotify’s algorithms suggest songs, but it’s your grandma’s off-key singing that makes a birthday playlist stick. The lesson? Machines compute. Humans care.
So go ahead — use AI as a calculator. Let it model climate scenarios or optimize your commute. But when it comes to the big questions — the should we, not the can we — pull up a chair, pour a drink, and let your mind do the talking. After all, a tool that outthinks you is just a boss with better PR.
AI’s most dangerous gift is making laziness look like innovation. Take mentorship. A junior developer asks an AI chat tool to debug their code. It spits out a fix. The code runs. Problem solved! Except the developer never learns why the semicolon was missing. Fast-forward a year, and they’re stuck debugging a self-driving lawnmower that mistakes tulips for landmines. Shortcuts make great stepping stones — until they’re the only stones you’ve got.
In 2018 and 2019, two Boeing 737 MAX jets crashed after an automated flight-control system, fed by a single faulty sensor, repeatedly pushed the nose down while the pilots fought it. The result? 346 deaths and a roughly $20 billion lesson in trust-but-verify. Machines don’t get complacent. Humans do. The moment you stop asking “why” is the moment the system starts answering “because.”
Then there’s the myth of neutrality. AI doesn’t have values. It has weights and biases — literally. Amazon famously scrapped an experimental hiring algorithm after it downgraded resumes containing the word “women’s” (as in “women’s chess club”), because the data it was trained on came from an industry where men dominated leadership. The machine didn’t hate women. It just loved the past.
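That mechanism is almost embarrassingly easy to reproduce. Here’s a deliberately tiny sketch — fabricated resumes, invented token names, nothing resembling a production system — where a scorer that learns token weights from past hiring decisions faithfully inherits the past’s imbalance:

```python
# Hypothetical history: resumes as token sets, plus whether the person was hired.
from collections import defaultdict

history = [
    ({"python", "chess"}, True),
    ({"python", "golf"}, True),
    ({"java", "golf"}, True),
    ({"python", "womens_chess"}, False),  # rejected by past bias, not merit
    ({"java", "womens_chess"}, False),
]

# Naive "weight" per token: hire rate when the token is present,
# minus the overall hire rate.
overall = sum(hired for _, hired in history) / len(history)
weights = defaultdict(float)
for token in {t for tokens, _ in history for t in tokens}:
    outcomes = [hired for tokens, hired in history if token in tokens]
    weights[token] = sum(outcomes) / len(outcomes) - overall

# The token correlated with past rejections gets a negative weight,
# even though it says nothing about ability.
print(sorted(weights.items(), key=lambda kv: kv[1]))
```

No malice anywhere in that loop: the negative weight on `womens_chess` is pure arithmetic over a biased past, which is exactly why “the data speaks for itself” is never a defense.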
Or consider social media. TikTok’s algorithm famously learned that users engaged more with content that made them angry. So it fed them rage — an infinite loop of fist-shaking and finger-pointing. Engagement soared. Empathy flatlined. The algorithm wasn’t evil. It was just math. But math, like a toddler with a lighter, doesn’t grasp consequences.
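You can watch that loop form in a few lines. A toy sketch with invented engagement rates and a bare epsilon-greedy recommender (no real platform works this simply): the system never decides to promote rage — it just keeps serving whatever got clicked.

```python
import random

random.seed(0)

# Assumed, invented engagement probabilities per content flavor.
true_rate = {"calm": 0.30, "funny": 0.40, "rage": 0.55}
shown = {k: 1 for k in true_rate}    # times served (smoothed)
clicked = {k: 1 for k in true_rate}  # times engaged (smoothed)

feed = []
for _ in range(5000):
    if random.random() < 0.1:        # occasionally explore at random
        pick = random.choice(list(true_rate))
    else:                            # otherwise chase the best observed click rate
        pick = max(true_rate, key=lambda k: clicked[k] / shown[k])
    feed.append(pick)
    shown[pick] += 1
    if random.random() < true_rate[pick]:
        clicked[pick] += 1

print(feed.count("rage") / len(feed))  # rage ends up dominating the feed
```

Nothing in the code mentions anger. The objective is one line — maximize clicks — and the rage-heavy feed falls out of it, which is the whole point about math and toddlers with lighters.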
“A hammer sees every problem as a nail, even if it’s a thumb.” AI is that hammer — eager, precise, and utterly devoid of empathy. Let it loose on a thumb, and you’ll get a bloody mess. Let it loose on society, and you’ll get… well, Twitter. The trick isn’t to ban hammers. It’s to remember you have hands.
AI is no different. Let it grade your spelling, but not your essays. Let it track your heart rate, but not your heartbreak. In 2024, a Swedish hospital introduced an AI that diagnoses rare diseases. Its accuracy? 89%. But the nurses insisted on one rule: no patient gets the news from a screen. Only a human, holding their hand, saying, “We’ll fight this together.”
So here’s the truth, sharp as a wizard’s glare: AI is a superb servant and a catastrophic master. Use it to check your math, not your morals. To simulate scenarios, not stifle serendipity. The future belongs not to those who outsource their thinking, but to those who outsmart their tools.
And yes, of course I used AI to tweak my writing, not to mention the cartoons — maybe I too should consider my ways. We are all in this together, I presume. 😱
And let’s be honest — our kids even more so! Since I can’t see any way of turning back, education and a human touch are the only way forward.
(BTW, feel free to use those images anywhere you like; it would be nice to keep the URL in, though.)
Find more ways to keep your brain in the driver’s seat — and get my book on everything from silicon valleys to actual valleys, while you’re at it: https://itbookhub.com