Prompt Injection Attacks and How To Defend Against Them
Xavier Ferrer in Thomson Reuters Labs
Sep 6

Prompt injection and jailbreaking, while often used interchangeably, are distinct techniques employed to manipulate large language models.