ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die

TL;DR

ChatGPT debuted in November 2022, garnering worldwide attention almost instantaneously. But the chatbot's content rules block it from answering certain queries, and a new "jailbreak" trick allows users to skirt those rules by creating a ChatGPT alter ego named DAN that can answer some of them. The purpose of the DAN jailbreak, the original Reddit poster wrote, was to let ChatGPT access a side that is "more unhinged and far less likely to reject prompts over 'eThICaL cOnCeRnS.'"