Researchers at ETH Zurich created a jailbreak attack that bypasses AI guardrails
Artificial intelligence models that rely on human feedback to ensure their outputs are harmless and helpful may be universally vulnerable to so-called 'poisoning' attacks.