Look, I’m Not a Luddite
Honestly, I love tech. I’ve been in this industry since the dial-up days, back when we all thought “You’ve got mail” was the pinnacle of human achievement. I’ve seen the birth of social media, the rise of smartphones, and now, the AI revolution. But this time, something feels different.
About three months ago, I was at a conference in Austin. A colleague named Dave leaned over during a particularly dry presentation and said, “You realize they’re gonna automate our jobs, right?” I laughed it off then. I mean, come on, we’re writers. But the more I think about it…
It’s Not Just About Jobs
It’s not even the job thing that scares me the most. It’s the commitment to accuracy, to truth. You see, AI doesn’t just get things wrong—it gets things wrong with confidence.
Last Tuesday, I was testing out a new AI writing tool. I asked it to write a piece on the benefits of eco-tourism and sustainable travel. It spewed out this beautifully written article. The problem? It cited a study that never happened, quoted experts who don’t exist, and referenced a law that was completely made up. It was perfectly wrong.
I showed it to my friend Marcus—let’s call him Marcus because his real name is too complicated to explain. He read it and said, “Wow, this is amazing!” Which… yeah. Fair enough. But then I told him it was all fake. His face? Priceless.
AI’s Hallucinations Are Getting Worse
You might think I’m exaggerating. I wish I were. But AI’s “hallucinations”—that’s the industry term for when AI makes stuff up—are getting worse. Not better. Worse.
I talked to a friend who works at a big tech company. She told me they’re seeing this a lot. The AI will confidentley—see what I did there?—spout complete nonsense. And the more you use it, the more it seems to believe its own lies. It’s like that time your drunk uncle started arguing with a lamp post. Except the lamp post is now writing your news articles.
And don’t even get me started on the ethics. We’re talking about tools that can seamlessly alter images, voices, and even videos. Deepfakes aren’t just a problem for the future—they’re here, and they’re getting harder to spot.
But Here’s the Thing…
I’m not saying we should all go live in a cabin and reject technology. That’s not gonna happen. And honestly, I don’t want it to. I love my gadgets. I love the convenience. I love that I can call my friend in Tokyo when it’s 4:30am her time and she’s gonna be grumpy but she’ll answer anyway because that’s what friends do.
But we need to be smart about this. We need to be critical. We need to question everything. And we need to stop assuming that just because a computer says it, it must be true.
I’m not saying we should all become cybersecurity experts. But we should at least know the basics. Like, you know, not clicking on random links. Or believing everything you read on the internet. Or trusting an AI to write your eco-tourism and sustainable travel article without checking its facts.
So What Do We Do?
I’m not sure. Honestly, I’m not. But I think the first step is admitting we have a problem. And the second step is probably not panicking. But definitely being aware.
We need to push for more transparency in how these models are trained. We need to demand better safeguards. And we need to start teaching people—especially kids—how to think critically about the information they consume.
Because at the end of the day, AI is just a tool. It’s not good or bad. It’s how we use it that matters. And right now, we’re using it like a drunk uncle at a family reunion. And that’s not gonna end well.
So let’s be smarter. Let’s be better. And for the love of all that is holy, let’s stop believing everything we read on the internet.
About the Author
Sarah “Salty” Jenkins has been a tech journalist for over 20 years. She’s seen the industry evolve from dial-up to AI, and she’s not always happy about it. When she’s not writing, she’s probably arguing with a gadget or trying to explain to her cat why the laser pointer isn’t real. You can find her on Twitter @SaltyTechLady, where she’s always happy to share her opinion—whether you want it or not.