Do you think this is a lesson they learned the hard way?
Yes, I’m sure the PhDs and senior SWEs/computer scientists working on LLMs never considered the possibility that arbitrary code execution could be a security risk. It wasn’t the very first fucking thing anybody involved thought about, because everyone but you is stupid. 😑
they may be dumb but they’re not stupid
It runs in a sandboxed environment anyway - every new chat is its own instance, and its default current working directory is even ‘/home/sandbox’. I’d bet this scenario was one of the very first things they thought about when they added the ability to have it execute actual code.
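If you want to see it for yourself, here’s a minimal probe you could paste into the chat and ask it to run - assuming the interpreter is a standard Python environment (nothing here is specific to any one vendor’s setup):

```python
import os
import platform

# Quick look at the execution environment. If the sandbox claim above holds,
# this should report a throwaway working directory like '/home/sandbox'
# rather than anything that looks like a real user's machine.
print("cwd:     ", os.getcwd())
print("home:    ", os.path.expanduser("~"))
print("platform:", platform.platform())
print("env user:", os.environ.get("USER", "<unset>"))
```

Run it in two separate chats and you’ll also notice the instances don’t share state - files written in one aren’t visible in the other.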