Pinned · Michael Wood in Cubed
100% Accurate AI Claimed by Acurai — OpenAI and Anthropic Confirm Acurai’s Discoveries
Acurai’s audacious claims to have discovered how LLMs operate are now confirmed by studies conducted by OpenAI and Anthropic.
Aug 26
Pinned · Michael Wood in Cubed
Eliminating Hallucinations Lesson 1: Named Entity Filtering (NEF)
Named Entity Filtering
Sep 2
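The Lesson 1 teaser above names the technique but not its mechanics. As a rough illustration only: one plausible reading of Named Entity Filtering is flagging named entities in a model response that never appear in the source text. The sketch below assumes that reading; spaCy, the `en_core_web_sm` model, and the function name `filter_unsupported_entities` are illustrative choices, not the article’s actual implementation (Lesson 1a publishes that separately).

```python
# Minimal sketch of one plausible reading of Named Entity Filtering (NEF):
# flag named entities in a model response that never appear in the source.
# NOT the article's implementation; spaCy is an assumed dependency here.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline with NER

def filter_unsupported_entities(source: str, response: str) -> list[str]:
    """Return entities mentioned in `response` but absent from `source`."""
    source_ents = {ent.text.lower() for ent in nlp(source).ents}
    response_ents = {ent.text.lower() for ent in nlp(response).ents}
    return sorted(response_ents - source_ents)

source = "Acurai was founded by Michael Wood."
response = "Acurai was founded by Michael Wood and John Smith."
print(filter_unsupported_entities(source, response))  # expected: ['john smith']
```

Under this assumed design, any entity the function returns would be treated as unsupported by the source and filtered out or sent back for regeneration.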
Pinned · Michael Wood
Creating Accurate AI: Coreference Resolution with FastCoref
Introduction
Oct 15, 2023
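For readers who want to try the library this post names: below is a minimal usage sketch of the open-source fastcoref package (pip install fastcoref). How the post wires coreference resolution into an accuracy pipeline is not shown here; the example sentence and the choice of the distilled FCoref model are assumptions.

```python
# Minimal usage sketch for the open-source `fastcoref` package.
# This only demonstrates resolving coreference clusters; the article's
# surrounding pipeline is not reproduced here.
from fastcoref import FCoref

model = FCoref()  # downloads the distilled F-Coref model on first use

preds = model.predict(
    texts=["Acurai published a study. It claims the study eliminates hallucinations."]
)

# Each cluster groups mentions that refer to the same entity,
# e.g. [['Acurai', 'It'], ['a study', 'the study']].
print(preds[0].get_clusters())
```

A common use of the resulting clusters, and presumably the accuracy angle here, is rewriting pronouns as explicit entity names so that downstream chunking and retrieval see unambiguous references.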
Pinned · Michael Wood
GPT 4 Hallucination Rate is 28.6% on a Simple Task: Citing Title, Author, and Year of Publication
The all-too-common myth of GPT 4 having only a 3% hallucination rate is shattered by a recent study that found GPT 4 has a 28.6%…
Jun 26
Pinned · Michael Wood
Stop Saying RAG Solves Hallucinations — You’re Hurting The AI Industry
Too many companies (and data scientists) are claiming that RAG eliminates hallucinations. Consider the leading providers of legal research…
Jun 29
Michael Wood
Thank you for your insightful response.
You are correct that I am referring to 100% hallucination elimination within a specific context. However, I wouldn’t define the context as…
3d ago
Michael Wood in Cubed
Beware Microsoft’s “New” AI Correction
Microsoft’s new AI Correction service can potentially increase hallucination rates rather than decrease them, independent research shows.
5d ago
Michael Wood in Cubed
Eliminating Hallucinations Lesson 1a: Source Code for Named Entity Filtering (NEF)
Here is the code needed to implement production-ready Named Entity Filtering (NEF) discussed in Hallucination Elimination Lesson One.
Sep 17
Michael Wood in Cubed
OpenAI’s o1 Model is a Disaster
Before you buy the inevitable hype regarding the brand new o1 model, read OpenAI’s stunning admission in the o1 System Card (page 5)…
Sep 12
Michael Wood
Respectfully, no.
In fact, you can empirically demonstrate this yourself by replicating the demonstrations regarding magnesium and calcium in the video…
Sep 5