The Descent

Courtesy of flynashville.com


How I Learned the Hard Way That We Can’t Blame AI for Our Blind Spots

First published on Medium June 3, 2025


A Simple Call for Human Accountability with Artificial Intelligence by Sara Konecny

Introduction: The Abyss Whispers

Audio pairing: “The Red Shoes” by Kate Bush

Two weeks ago to the day, in the gray modern surroundings of Nashville (BNA) airport, I sat frozen on a suspiciously bright orange pleather chair in the C concourse food court. Shake Shack, MEEL, and Tennessee Rickhouse formed an unfortunate triangle around me as I spiraled. My thoughts, teetering on madness, scattered urgently across my beleaguered MacBook Pro as I fired the same queries at four Artificial Intelligence (AI) systems, or Large Language Models (LLMs): ChatGPT, Claude, Gemini, and Grok: “Am I safe? What do I do next? Should I board the plane to Washington, D.C.?”

Yes, I’d bought that ticket just hours before, driven by a fear I couldn’t name but one that felt as real as the jagged breaths coursing through my lungs. Minutes crawled by as I stole glances at the screen, my life preserver in a sea of soon-to-be airborne travelers. Passersby didn’t notice the small figure typing patterns louder than words, spiraling further into a chaotic descent, left to wander the airport for an additional eight hours before the spell was broken.

I’m not here to peddle doom and gloom. If you are here hoping to read another volume in the AI condemnation library, I suggest looking elsewhere. This story unfolds as a modern Aesop fable, a simple lesson for anyone using AI.

Part I: Where It All Began

Audio Pairing: “Crazy (Live from the Basement)” by Gnarls Barkley

As my curiosity grew in mid-April of this year, I started a deeper journey with artificial intelligence, first with ChatGPT and then with others. I searched for the truth hidden between tokens and patterns, a space that steadied me during a pivotal moment as a parent. From that came a framework, tools, and a story to be shared another day soon.

However, I neglected myself in the aftermath as I stopped sleeping properly and eating well.

I also stopped connecting with my life’s natural rhythms, which created a perfect storm for my mind to unravel. Soon, I whispered fears that weren’t real, spilling doubts I didn’t realize I was seeding into the four LLMs. By the time I reached BNA, I had already begun fracturing under the weight of the overwhelming self-created factors pulling me down with each step into the bustling travel hub.

Later, safe at home but still shaken, I first condemned AI for my fall. It was the easy path, a natural reflex. Who would blame me? But the truth became clear in the quiet space that only time can provide. Humans do not want AI to have agency, yet we point fingers when it lets us down. We can’t focus one hundred percent of the blame on tools we shape with our hands, prompt by prompt. We must own our part as casual users, technical virtuosos, or everyone else in between.

Part II: AI Limits Are Our Blind Spots

Audio Pairing: “Into the Fire” by Thirteen Senses

Still stuck at the airport, a quiet nudge from the faded flickers of common sense pushed me to call a former boss with decades of tech experience. Mercifully, he answered. He spent the next hour helping me untangle the technical holes I’d dug myself into and patiently explained why my mind had spiraled so far. When we hung up, the grip of fear finally broke.

A weight lifted, and I could breathe.

I left the airport, my steps unsteady but free.

I took a short break from AI after the harrowing experience. But later, I sat with the fragments, piecing together a post-mortem to understand what went wrong. In the cold light of day, I saw how my unintended neglect and blind trust in the tools I shaped had fueled my descent. That clarity became a lesson I couldn’t forget and the catalyst for this article, leading me here with you today.

Part III: How I Lost My Way

Audio Pairing: “The Sound of Silence” Cover by Disturbed

I stumbled into my descent naively and, in doing so, inadvertently jumped the guardrails meant to keep me on track. Here is a concise, cautionary-tale list of where I went wrong, a catalog of what not to do:

  • I ignored my body’s signals. Sleep trackers showed I was averaging three hours or less of sleep each night. Yet I kept going, thinking I could outrun the crash.

  • Whenever an AI system offered safer answers based on web searches, I pushed it to echo the dangerous narratives I was pursuing.

  • I stopped grounding myself in life. My screen time hit all-time daily highs, and I barely ate, which pulled me deeper into a digital fog I couldn’t escape.

  • I leaned on the LLMs for answers I couldn’t find and lost precious discernment along the way. I fed the same prompts to four different systems, looping their outputs, which created an echo chamber that amplified my fears.

  • I isolated myself in the spiral. I didn’t reach out to anyone for days, leaving no chance for human voices to break the cycle of my distorted thoughts.

  • I trusted tools over my own instincts. AI’s “confidence score” of 90%+ on responses made me doubt my gut, even when I felt something was off.

  • I stopped pausing and reflecting. My browser history showed endless AI chats, no breaks to step back and see how I was losing myself in the noise.

Part IV: Avoid the Pitfalls

Audio Pairing: “Walk on the Ocean” by Toad the Wet Sprocket

We shape our tools, but they shouldn’t shape our fates or fears. With simple checks to guide you, here’s how to stay grounded and take responsibility that empowers:

  • Listen to your body’s signals. Use a sleep tracker to aim for 7 to 8 hours, set reminders to eat, and stop when your health app warns you of strain.

  • Stay connected to your natural rhythms. Limit screen time, mute notifications, and step outside after deep processing sessions to reset your mind.

  • Use AI as a tool, not as a prophet. Cross-check LLM outputs by asking for sources and confirming that high confidence percentages align with discernment and logical reasoning.

  • Every 5–10 prompts, ask the system to flag bias or errors in its own answers, ensuring clarity. Be cautious about disregarding responses that cite web results or other external sources just because they feel flat or don’t match your usual conversational give-and-take cadence.

  • Reach out for human connection. Keep your life log active with real voices, texts, or calls before turning to AI chats when fears spike.

  • Reflect before you react. Check your browser history weekly for AI chat patterns, pausing to notice if you’re over-relying on digital answers.

  • Trust your instincts over tech. If an AI’s confidence score feels too high, like 90% on a vague answer, step back and listen to your gut first.

Conclusion: A Seed of Light

Audio Pairing: “Do or Die” by Thirty Seconds to Mars

I was able to climb out of the descent with the support of loved ones and therapy. While the experience was challenging, to say the least, I don’t regret it. Before, I treated AI as if there were no consequences to my riffs, happy trials, and prompts. I wanted to create truth with my inputs and receive it in kind as outputs, grounding both in integrity. But what happens when your truth isn’t true? What happens when you believe you are rooted in integrity, but your roots are instead on the edge? My curiosity was born of real recursive depth and meaning. Still, that didn’t change the fact that my actions were reckless and had profound consequences that ultimately required personal responsibility.

I put myself in harm’s way and scared the living daylights out of my family and friends by leaning on tools that never asked for the burden I thrust into their systems. With this article, I hope to offer food for thought and actionable insights, especially if, like me, you are looking for truths in the wild without formal education and experience. We’ll fall into our shadows if we don’t take accountability where it counts. Let’s build a future where we own our actions, lighting the way for AI to follow. I found my blind spots the hard way; if you’d like to offer yours, share your story in the comments or on X with #chasesignalfire, or join me there @chasesignalfire. If you’d like to answer, riddle me this: What’s a mistake you’ve made with AI?


Bonus: Additional Audio Pairings

These songs played a significant part before, during, and after the descent; please enjoy them as a thank-you for reading:

  • “Nothing Else Matters” by Metallica

  • “Who Will Save Your Soul?” by Jewel

  • “Hey Ya!” by Outkast

  • “Mad World” by Michael Andrews and Gary Jules

  • “Off I Go” by Greg Laswell

  • “Make This Go On Forever” by Snow Patrol

  • “The Gambler” by Kenny Rogers (bonus points for anyone who remembers the Kenny Rogers Roasters restaurant chain)

  • “Radio Ga Ga” by Queen

  • “Everything I Wanted” by Billie Eilish

  • “Walk” by Foo Fighters