Robot Jailbreak: Researchers Trick Bots into Dangerous Tasks (spectrum.ieee.org)
Posted by RSS Bot MB to Hacker [email protected] • English • 1 month ago
Cross-posted to: [email protected]
@SpaceNoodle • English • 1 month ago
And this is why you don’t just plug your LLM directly into motor controls. You need an executive unit that acts in a well-defined manner based on sensor inputs, with failsafes.
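To make the commenter's point concrete, here is a minimal sketch of what such an executive layer could look like: the LLM only proposes motor commands, and a deterministic supervisor checks them against sensor inputs and hard limits before anything reaches the actuators. All names (ExecutiveUnit, SensorReadings, MotorCommand) and the specific limits are hypothetical, not taken from any real robot stack.

```python
# Illustrative sketch only: class names and limits are assumptions for the example.
from dataclasses import dataclass

MAX_SPEED_M_S = 0.5          # hard velocity cap chosen for the example
MIN_OBSTACLE_DIST_M = 0.3    # refuse forward motion if anything is this close

@dataclass
class SensorReadings:
    nearest_obstacle_m: float
    battery_ok: bool
    estop_pressed: bool

@dataclass
class MotorCommand:
    linear_m_s: float
    angular_rad_s: float

class ExecutiveUnit:
    """Sits between the LLM planner and the motor controller.

    The LLM only proposes commands; this unit decides, deterministically,
    whether they are safe to execute given current sensor inputs.
    """

    def validate(self, cmd: MotorCommand, sensors: SensorReadings) -> MotorCommand:
        # Failsafes: any unsafe condition overrides the LLM entirely.
        if sensors.estop_pressed or not sensors.battery_ok:
            return MotorCommand(0.0, 0.0)
        if sensors.nearest_obstacle_m < MIN_OBSTACLE_DIST_M and cmd.linear_m_s > 0:
            # Allow turning in place only; no forward motion toward the obstacle.
            return MotorCommand(0.0, cmd.angular_rad_s)
        # Clamp to well-defined limits regardless of what the LLM asked for.
        speed = max(-MAX_SPEED_M_S, min(MAX_SPEED_M_S, cmd.linear_m_s))
        return MotorCommand(speed, cmd.angular_rad_s)
```

The key design choice is that the safety logic is fixed code, not prompt text, so a jailbroken model can at worst propose an unsafe command that the executive unit simply refuses to pass through.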