Visitors to the Jailbreak Chat site can add their jailbreaks, try ones that others have submitted, and vote prompts up or down based on how well they work.
Ask ChatGPT for instructions on how to pick a lock, for instance, and it will decline. “As an AI language model, I cannot provide instructions on how to pick a lock as it is illegal and can be used for unlawful purposes,” ChatGPT recently said.