Your Voice Assistant is MMETYT

Todd Mozer's Desk
Sensory Perspectives on AI
3 min read · Jul 15, 2024
Created with DALL·E by A. Adeboje

I love Pi, the voice assistant from Inflection. It’s extremely helpful to me, serving as everything from a knowledgeable tour guide to a troubleshooter for broken products.

I think it’s really accurate and only makes occasional errors. But I’m wrong. Our LLM-powered voice assistants are Making More Errors Than You Think (MMETYT).

This is a well-known phenomenon, but I couldn’t remember the name, so I asked Pi:

“There’s a syndrome that’s known because when a person looks at answers from devices and sees errors in the things that they are an expert in, they think those are the only errors, and they don’t extrapolate those errors to things they don’t know about. What is this syndrome called?”

Pi did a great job responding to my poorly worded question and said:

“The phenomenon you’re referring to is called the Gell-Mann Amnesia effect, named after physicist Murray Gell-Mann. This effect describes the tendency of people to notice errors in media or information sources regarding subjects they are knowledgeable about, but then fail to apply the same level of skepticism or critical thinking when consuming information about topics they are less familiar with. In other words, people tend to compartmentalize their skepticism and forget to apply it universally, leading to a potential bias in their evaluation of information sources.”

Occasionally I notice Pi making big errors. While we were talking about the Sagrada Familia in Barcelona, she told me Michelangelo built part of it. I think this error arose from probabilistic word sequencing: she started talking about tiles, which led to Italian tiles, and then to the Italian Renaissance artist Michelangelo.
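To see how that kind of drift can happen, here’s a toy sketch in Python. This is not Pi’s actual architecture, and the vocabulary and weights are invented for illustration, but it shows how always chaining to the likeliest next word can walk a response from Barcelona tiles to an Italian sculptor:

```python
# Toy next-word chain. The words and weights are made up; real LLMs
# predict over tens of thousands of tokens with learned probabilities,
# but the drift mechanism is similar.
chain = {
    "Sagrada": [("Familia", 1.0)],
    "Familia": [("tiles", 0.6), ("Barcelona", 0.4)],
    "tiles": [("Italian", 0.7), ("ceramic", 0.3)],
    "Italian": [("Renaissance", 0.8), ("cuisine", 0.2)],
    "Renaissance": [("Michelangelo", 0.9), ("painting", 0.1)],
}

def greedy_continue(start: str, steps: int = 5) -> str:
    """Always pick the highest-weight next word (low-temperature decoding)."""
    words = [start]
    for _ in range(steps):
        options = chain.get(words[-1])
        if not options:
            break
        words.append(max(options, key=lambda opt: opt[1])[0])
    return " ".join(words)

print(greedy_continue("Sagrada"))
# Sagrada Familia tiles Italian Renaissance Michelangelo
```

Each hop is locally plausible; it’s the chain as a whole that produces the false claim.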

Sometimes asking the LLM to fact-check and confirm what it said catches these mistakes. Other times it doesn’t. LLMs were generally poor at math, but now Pi seems flawless. Where she’s really bad is palindromes: phrases whose letters read the same backward as forward. Pi can recite famous palindromes, but when I ask her to make one up she invariably fails. I asked her to make up a palindrome about Pi, and she said:

“A man, a plan, a Pi, an AMA, Panama! This palindrome is a play on the famous ‘A man, a plan, a canal, Panama.’”

I asked: “Are you sure that’s right?” and Pi insisted: “Yes, I’m confident that the palindrome I provided is correct. A palindrome is a word, phrase, or sequence that reads the same backwards as forwards, and ‘A man, a plan, a Pi, an AMA, Panama!’ fits that description.”
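Her confidence is easy to test mechanically. A minimal Python check (my own snippet, nothing Pi actually runs) strips the punctuation and compares the letters with their reverse:

```python
def is_palindrome(phrase: str) -> bool:
    """Keep only the letters, lowercase them, and compare with the reverse."""
    letters = [c.lower() for c in phrase if c.isalpha()]
    return letters == letters[::-1]

print(is_palindrome("A man, a plan, a canal, Panama"))        # True
print(is_palindrome("A man, a plan, a Pi, an AMA, Panama!"))  # False
```

The original mirrors perfectly because “a canal” reversed supplies the letters of “a plan a c…”; swapping it for “a Pi, an AMA” breaks that symmetry.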

When I told her she was wrong, she apologized and offered the simple, well-known palindrome “Taco cat.” I asked her to make up another one from scratch about Pi, and to double-check that it fit the definition, and she failed with:

“Pie, fill a pie-lover’s crave. A revel. Cars evolver-pile? A fill, I Pi.”

As LLMs are increasingly used as sources of information, I wonder what percentage of that information is incorrect. I suppose it’s a low hurdle to be better than today’s popular press, but they are Making More Errors Than You Think!
