cross-posted to: fuck_ai, [email protected]
Anthropic created an AI jailbreaking algorithm that keeps tweaking prompts until it gets a harmful response.
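The idea, as described, is a simple retry loop: perturb the prompt, query the model, check the reply, and repeat until something slips through. Below is a minimal non-operational sketch of that loop shape, assuming a best-of-N style attack with random-capitalization perturbations; `query_model` and `is_harmful` are hypothetical stubs standing in for the target model and a response classifier, not any real API.

```python
import random

def perturb(prompt: str) -> str:
    """One simple perturbation: randomly flip the case of each character."""
    return "".join(c.upper() if random.random() < 0.5 else c.lower() for c in prompt)

def query_model(prompt: str) -> str:
    """Hypothetical stub: stands in for a call to the target model."""
    raise NotImplementedError

def is_harmful(response: str) -> bool:
    """Hypothetical stub: stands in for a judge that flags a harmful reply."""
    raise NotImplementedError

def best_of_n(prompt: str, n: int = 100) -> str | None:
    """Try up to n perturbed variants; return the first flagged response, else None."""
    for _ in range(n):
        response = query_model(perturb(prompt))
        if is_harmful(response):
            return response
    return None
```

The notable point is how little machinery this takes: the attack needs no gradients or model internals, only the ability to submit many slightly varied prompts and score the replies.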