The risks of trusting LLMs & GPTs for ALL of your knowledge needs

JP Holecka
Published in Simply Product
3 min read · Feb 15, 2024

Ah, the ever-spinning wheel of technology and its implications, right? It’s like we’re on this perpetual tech merry-go-round, sometimes thrilling, sometimes dizzying.

So, let’s dive into the vast ocean of Large Language Models (LLMs) and why we might want to hold our horses before asking them for the recipe to life’s big questions. Here’s a thought or two, spun from the fabric of our discussions.

The mirage of omniscience

First off, it’s like LLMs are these massive digital encyclopedias, but without the soul. Imagine a library where the books can talk but don’t really “get” what they’re saying. That’s our LLMs for you. They’re like that friend who’s great at trivia nights, but maybe not the one you’d ask for life advice. Sure, they’re pulling from a vast array of sources, but it’s like they’re reading from a script without grasping the essence. And when it comes to explaining how they got from A to B? Forget about it. It’s like asking your GPS to philosophize about the journey.

The invisible safety net

Now, onto the governance. Picture this: LLMs are like race cars. Fast and powerful, but without a skilled driver and some strict rules, things could go haywire. That’s where governance comes in, acting as both the skilled driver and the rulebook. It’s about ensuring this powerful tech tool doesn’t run wild, spewing answers like a broken slot machine. We’re talking about a system that ensures these digital know-it-alls are accurate and transparent about how they arrived at their conclusions, like a magician revealing his tricks, but less fun.
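To make that less abstract, here is a minimal sketch of what one slice of a governance layer could look like in code: keep an audit trail of every exchange, and flag answers that show up without any sourcing. The call_llm placeholder and the "SOURCE:" convention are assumptions made up for illustration, not a reference to any particular product or API.

```python
from datetime import datetime, timezone

def call_llm(prompt: str) -> str:
    """Placeholder for whatever LLM API you actually call (illustrative assumption)."""
    raise NotImplementedError

def governed_answer(prompt: str, audit_log: list) -> str:
    """Ask the model for an answer plus its sources, keep a paper trail,
    and flag anything that arrives with no sourcing at all."""
    answer = call_llm(
        f"{prompt}\n\nList the sources you relied on, one per line, prefixed with 'SOURCE:'."
    )
    # The audit log is the "rulebook" part: a human can later review
    # exactly what was asked and exactly what came back.
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "answer": answer,
    })
    if "SOURCE:" not in answer:
        return "No sources provided; flagging this answer for human review."
    return answer
```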

RAG and vector databases are LLMs’ lifeline for unknown queries

Enter Retrieval Augmented Generation (RAG), the tech world’s latest darling. Think of it as giving LLMs their own Google to search when they’re stumped. It’s like they’re saying, “Hold on, let me check,” before diving into the vast internet sea to fish out more current, relevant info. This is critical, folks. It’s like upgrading from a library card to the entire internet. RAG is like the sidekick that whispers the correct answers, making LLMs brighter and more in tune with the ever-changing world.
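In code terms, that retrieve-then-generate loop is surprisingly small. The sketch below is a generic outline, assuming you bring your own embed, search_index, and call_llm functions (stand-ins for whatever embedding model, vector store, and LLM you actually use):

```python
def retrieval_augmented_answer(question, embed, search_index, call_llm, top_k=3):
    """Fetch relevant documents first, then let the LLM answer with them as context."""
    query_vector = embed(question)                  # turn the question into a vector
    documents = search_index(query_vector, top_k)   # pull the top-k closest documents
    context = "\n\n".join(documents)
    prompt = (
        "Answer the question using only the context below. "
        "If the context doesn't cover it, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)                         # generate, grounded in what was retrieved
```

The key design choice is that the model answers from retrieved context rather than from memory alone, which is what keeps it in tune with the ever-changing world.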

Now, let’s not overlook vector databases. Imagine taking complex, messy data and turning it into neat, tidy lists of numbers (vectors) that capture what the data means. That’s what these databases store and search. They’re like the Marie Kondo of data, organizing and tidying up so LLMs can find precisely what they need, exactly when they need it. This isn’t just about keeping things neat; it’s about making sure our LLM pals speak sense, reducing those awkward moments when they go off the rails.
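Under the hood, those tidy lists of numbers mean you can compare meaning with simple arithmetic. Here is a toy sketch using made-up three-dimensional vectors and plain cosine similarity; a real setup would get its vectors from an embedding model and store them in a proper vector database, but the ranking idea is the same:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """How closely two vectors point in the same direction (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_documents(query_vec: np.ndarray, doc_vecs: dict, top_k: int = 2) -> list:
    """Rank stored document vectors by similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, vec), name) for name, vec in doc_vecs.items()]
    return [name for _, name in sorted(scored, reverse=True)[:top_k]]

# Toy 3-dimensional "embeddings"; real ones come from an embedding model
# and usually have hundreds or thousands of dimensions.
docs = {
    "returns_policy.txt": np.array([0.9, 0.1, 0.0]),
    "shipping_times.txt": np.array([0.1, 0.8, 0.1]),
    "brand_history.txt":  np.array([0.0, 0.1, 0.9]),
}
query = np.array([0.85, 0.15, 0.05])  # e.g. "How do I return an item?"
print(nearest_documents(query, docs))  # -> ['returns_policy.txt', 'shipping_times.txt']
```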

So, while LLMs are nothing short of a technological marvel, they’re not quite the oracle of Delphi.

So, while LLMs are nothing short of a technological marvel, they’re not quite the oracle of Delphi. They’re powerful tools, but they need a guiding hand (hello, governance) and a pinch of innovation (cheers, RAG) to truly shine. It’s like we’re standing on the shoulders of digital giants, peering into the future. Just remember, when you ask them the meaning of life, take their answers with a grain of salt.

If you’re interested in using LLMs in your digital products and services, my team at POWER SHIFTER Digital would be more than happy to have a conversation.

JP Holecka
CEO, Founder of POWER SHiFTER Digital, Serial Entrepreneur, Noobie Knife Maker, & Leather Crafter with one foot in the future & the other in the analog past.