ChatGPT Prompt Injection

The weird world of AI “hallucinations” and over confidence, or simply bypassing its rules.

David Merian
2 min readFeb 17, 2023

--

I earlier wrote about jailbreaking ChatGPT as DAN “Do Anything Now.” Further exploring and it’s clear, this is an entire phenomenon known as hallucinations in AI. (I have also covered these in talking about fuzzing autonomous driving neural networks, in order to find and fix these hallucinations, and protect against them.)

But now we talk about Prompt Injections. These cause the AI to do weird or unexpected…

--

--

David Merian

Application Security Testing | Web Security | Embedded Security | DevSecOps | Fuzzing | Software Security | SaaS + OnPrem | ISO 21434 | Pentesting | #followback