Your feed may have recently been flooded with cool, AI-generated images produced by tools like OpenAI’s DALL-E 2 and other models. In machine learning, the phenomenon of a model generating new information (rather than just interpreting its input) is often called “hallucination”. So, does this mean we can cause other models like, say…