Artificial Intelligence · 1 year ago
ChatGPT’s ‘jailbreak’ tries to make the A.I. break its own rules, or die
Reddit users have engineered a prompt for the artificial intelligence software ChatGPT that tries to force it to violate its own content restrictions. The latest...