Things To Avoid Asking AI Chatbots.

Charles Luguda
The Modern Scientist
3 min read · Apr 29, 2023


AI chatbots are fun but still have some flaws.


AI chatbots have become hugely popular recently; even if you live under a rock, you have probably heard of them or even used one. Using AI chatbots is fun, but it can sometimes have serious consequences, so it is good to know what we should not ask them.

1. Medical Diagnosis.

AI chatbots are good at many things but providing accurate medical diagnoses isn’t one of them.

A chatbot just summarizes information it has pulled from the internet, which may or may not be accurate. More importantly, it doesn’t know your medical history or your tolerance to certain medicines, so it can’t judge what kind of treatment would be right for you.

2. Product Review.

Another thing to avoid asking a chatbot for is a product review. What it actually gives you is the specs and price of the product, not a review of it, because a product review by definition includes the personal opinions, experience, and judgment of the reviewer.

So it can’t give you a clear sense of the in-hand feel of the product you want to buy, or the clarity of its speaker.

3. News.

You should also not use chatbots as your source of news, because doing so poses major problems with accountability and context.

Accountability. When you read a piece of news, you know the publisher and the journalist behind it. This means there is someone accountable for the information given to you, and a clear incentive to be accurate in order to maintain one’s reputation and credibility.

A chatbot, however, just summarizes news from different websites and serves it up to you, so there is no primary source and no one is responsible if the news turns out to be false.

Context. News you read from a chatbot is just a summary of information from various sources on the internet, so it can quickly give you a false narrative because you lose the deeper context of each story. The bot doesn’t know whether what it’s claiming is true; it only blends things together in a way that seems true at the surface level.

4. Legal Advice.

You might have read headlines that ChatGPT successfully cleared US law school exams, and that AI chatbots are on their way into our courtrooms in the coming years. This may sound shocking, but a quick search will tell you that people have been trying to bring AI into the legal system since 2015. Some companies have already created specialized AI legal assistants, even marketing them as an affordable alternative to human lawyers. The concept has merit for people who can’t afford the latter, but the problems with these so-called AI lawyers are too real to ignore.

The obvious problem is that AI chatbots lack common sense. Things that are instinctively obvious to humans have to be explicitly programmed into a chatbot, which makes it less capable of logical reasoning.

More importantly, a human lawyer can search for new evidence and think outside the box to come up with clever solutions and loopholes. But an AI chatbot can only use the data you provide and process it in a pre-determined manner. An AI lawyer would also be the perfect target for hackers to steal the sensitive data of many people at once.

5. Commercial Content.

The biggest selling point of AI chatbots is that they can produce content instantly. What would take a human a couple of hours to write, ChatGPT can do in seconds.

Even if you overlook the questionable ethics of using AI chatbots to write content, you can’t ignore that they are simply not good enough to replace human writers. Not yet, at least. ChatGPT, for example, often delivers inaccurate, outdated, and repetitive content.

Since these chatbots don’t have their own opinions, judgments, imagination, or prior experience, they can’t draw intelligent conclusions, think of examples, use metaphors, or create narratives the way a professional human writer can.

In other words, they’re not adding any new value to the internet. Using chatbots for personal purposes is perfectly fine, but using them to write your articles, news stories, social media captions, web copy, and the like is probably a very bad idea.
