Pinned · Michael Wood · Apr 26
Myth: GPT 4 Hallucination Rate is Only 3%
A Google search for "GPT 4 Hallucination Rate" reveals many sites regurgitating the false notion that GPT 4's hallucination rate is only…
Pinned · Michael Wood · Mar 19
Eliminate Hallucinations — Yes, Eliminate Hallucinations
What if AI hallucinations only appear to be unfixable because the industry has been looking in the wrong place? Below are public excerpts…
Michael Wood · 3d ago
Excellent article on how to implement standard chunking techniques.
The basic methodology is to first assign each sentence an index number. Then process the sentences to transform them into independent…
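The excerpt above describes numbering each sentence before transforming it into an independent chunk. A minimal sketch of that first indexing step, assuming a naive regex-based sentence splitter rather than whatever the linked article actually uses:

```python
import re

def index_sentences(text: str) -> list[tuple[int, str]]:
    # Naive splitter on sentence-ending punctuation; production chunkers
    # typically use an NLP library (e.g. spaCy or nltk) instead.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    # Assign each sentence an index number, as the excerpt describes.
    return list(enumerate(sentences))

indexed = index_sentences("Cats are mammals. They purr. Dogs bark!")
# indexed -> [(0, 'Cats are mammals.'), (1, 'They purr.'), (2, 'Dogs bark!')]
```

The index numbers let later passes refer back to each sentence's original position while rewriting it into a standalone chunk.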
Michael Wood · 5d ago
Another great, useful article.
We've found a third search/ranking method to also be essential—a method based on synonyms. Although it is true that "cat" and "feline" are…
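The comment above mentions a synonym-based search/ranking method alongside the usual lexical and semantic ones. A toy sketch of the idea, using a hand-rolled synonym table (the table and scoring function are illustrative assumptions, not the author's implementation):

```python
# Toy synonym table; a real system would use a thesaurus resource
# such as WordNet rather than a hard-coded dict.
SYNONYMS: dict[str, set[str]] = {"cat": {"feline"}, "feline": {"cat"}}

def synonym_score(query: str, passage: str) -> int:
    # Count query terms that match the passage directly OR via a synonym,
    # so "cat" still scores against a passage that only says "feline".
    passage_words = set(passage.lower().split())
    total = 0
    for term in query.lower().split():
        expanded = {term} | SYNONYMS.get(term, set())
        total += len(expanded & passage_words)
    return total
```

Here `synonym_score("cat", "the feline slept")` returns 1 even though the literal word "cat" never appears, which is exactly the gap that pure keyword matching leaves open.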
Michael Wood · Aug 15
I used calcium and magnesium as an example because GPT-3.5
This demonstration reveals a remarkable discovery: LLMs often mistake semantic similarity for synonymy. However, each LLM makes this…
Michael Wood · Aug 14
Great article.
For example, if the user asks about magnesium and calcium in a single query, we send two queries to the LLM and two sets of passages (one…
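The excerpt above describes splitting a query that mentions two substances into one sub-query per substance, each with its own passage set. A minimal sketch of that decomposition step, assuming a toy keyword-based entity detector (the entity list and sub-query phrasing are illustrative, not the author's code):

```python
# Assumed entity list for illustration only.
ENTITIES = ["magnesium", "calcium"]

def decompose(query: str) -> list[str]:
    # Detect which known entities the query mentions.
    found = [e for e in ENTITIES if e in query.lower()]
    if len(found) <= 1:
        return [query]
    # One sub-query per entity; each would be retrieved and answered
    # separately, then the answers combined downstream.
    return [f"{query} (focus: {e})" for e in found]

subqueries = decompose("Compare magnesium and calcium absorption")
# subqueries has two entries, one focused on each substance
```

Sending each entity its own query and passage set keeps the model from conflating evidence about one substance with claims about the other.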
Michael Wood · Aug 13
Why is ChatGPT Getting Worse? — Secret Revealed
Discover the secret reason why ChatGPT is getting worse over time. This article discloses:
Michael Wood · Aug 12
I think you greatly diminish your excellent arguments by downplaying the significance and degree of…
That's a tremendous hallucination rate on a relatively simple task. However, what Marcus & Chomsky somehow overlook is that LLMs can…
Michael Wood · Aug 7
Eliminate RAG Hallucinations with a Single API
Developers can eliminate RAG hallucinations by making a single API call at the end of their pipeline. This article explains how.
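The teaser above places a single verification call at the end of a RAG pipeline. The article's actual API is not shown here, so the sketch below only illustrates the pipeline shape, with the retriever, generator, and verifier all passed in as hypothetical callables:

```python
from typing import Callable

def rag_with_verification(
    question: str,
    retrieve: Callable[[str], list[str]],
    generate: Callable[[str, list[str]], str],
    verify: Callable[[str, list[str]], bool],
) -> str:
    # Standard RAG: retrieve passages, then draft an answer from them.
    passages = retrieve(question)
    draft = generate(question, passages)
    # Single post-pipeline check: keep the draft only if the verifier
    # judges it grounded in the retrieved passages.
    return draft if verify(draft, passages) else "Unable to verify an answer."
```

The design point is that verification happens once, after generation, so it can be bolted onto an existing pipeline without changing the retrieval or generation stages.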