GENERATIVE AI and ChatGPT TECHNOLOGY ARTICLES
ChatGPT and Generative AI Hallucinations
One of the risks of Generative AI is a phenomenon known as hallucination.
One of the biggest concerns with Generative AI systems is that when they do not understand a question, they misinterpret it, and because they cannot generate a correct answer, they invent one instead. This process is called "Artificial Intelligence Hallucination" (an AI hallucination state).
Hallucination is the term used for the phenomenon in which AI algorithms and deep learning neural networks produce outputs that are not real, do not match any data the algorithm was trained on, and do not follow any other identifiable pattern.
It cannot always be explained by the model's programming or its input data alone; contributing factors include incorrect data classification, inadequate training, failure to interpret questions asked in different languages, and failure to contextualize questions.
Hallucinations can occur in every kind of generated output, including text, images, audio, video, and computer code.
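One practical signal of a text hallucination, not described in this article but widely used in practice, is self-consistency: when a model actually "knows" an answer, repeated samples of the same question tend to agree, while invented answers tend to vary between samples. The sketch below is a minimal, hypothetical illustration of that idea; the function names and the 0.7 threshold are assumptions for the example, not part of any real model's API.

```python
from collections import Counter

def consistency_score(answers):
    """Fraction of sampled answers that agree with the most common one.

    `answers` is a list of strings produced by asking a generative model
    the same question several times. A score near 1.0 means the samples
    agree; a low score suggests the model may be inventing the answer.
    """
    if not answers:
        return 0.0
    most_common_count = Counter(answers).most_common(1)[0][1]
    return most_common_count / len(answers)

def flag_possible_hallucination(answers, threshold=0.7):
    """Flag the question if the sampled answers disagree too much.

    The 0.7 threshold is an illustrative assumption, not a standard value.
    """
    return consistency_score(answers) < threshold

# Example: hand-written sample sets standing in for real model outputs.
stable = ["Paris", "Paris", "Paris", "Paris", "Paris"]
unstable = ["1923", "1957", "1889", "1923", "2001"]
print(flag_possible_hallucination(stable))    # False: samples agree
print(flag_possible_hallucination(unstable))  # True: samples diverge
```

This check does not prove an answer is correct (a model can be consistently wrong); it only flags answers that are unstable across samples, which is one of the cheaper ways to catch likely inventions before showing them to a user.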