Researchers at ETH Zurich created a jailbreak attack that bypasses AI guardrails
Artificial intelligence models that rely on human feedback to ensure their outputs are harmless and helpful may be universally vulnerable to so-called 'poisoning' attacks.