The GPT of it All

Kaylee Stutts
2 min read · Jun 6, 2023

I’m no stranger to Artificial Intelligence. I learned of its silly ways in university while studying amongst the future Turings and Hoppers — and everyone in between — of the world. AI, as I enjoy describing it to my non-techie friends and family, is basically just a whole bunch of search algorithms. At least, that’s how I viewed it until OpenAI turned the world as we knew it on its head.

I first heard of ChatGPT while sipping my morning coffee and scrolling through Reddit. “Ah, another revolution in machine learning,” I muttered to myself while not even attempting to stifle my eye roll. While scanning the linked article, I quickly realized how naive I’d been…this was going to be a game-changer.

Ask a simple question, get as detailed an answer as you could want from this little GPT guy (I’m not immune to the human tendency to personify). You can even request a change of tone. “Write me a thesis on the ethics of Artificial Intelligence as if imagined by a Valley Girl.” Omg AI is like totally taking over the world.

I had fun poking and prodding it as if I were back in my old QA testing job. Surprisingly, it held up to my attempts at making it turn evil à la Microsoft Tay. All at once, I could see ChatGPT becoming a guide, a friend, an accessibility tool, and beyond. I couldn’t wait to use it in my daily life.

It wasn’t long before companies like Microsoft were offering eerily familiar AI-fueled tools. While these corporations rarely disclose their OpenAI affiliations, the developers among us know what these super-powered chatbots are using under the hood. Some of us dread it; others embrace it. Folks worry about a GPT-ified existence where AI steals our precious jobs. Those of us who can see past the fears, however, recognize how useful a tool this technology is and will continue to be. We can coexist. GPT is a way to outsource the boring stuff.

Nothing, no software, no hardware, can replace a human being. I’ll stand by that statement. The intricacies of the human mind are still so far from being understood. How could we possibly replicate it in bits and bytes?
