• @andrewth09
    6 points · 20 hours ago

    I don’t understand why the researcher needed to contact the FBI to report this; just drop it in BugCrowd and call it a day. It’s a ChatGPT jailbreak, not a Debian zero-day.

  • @[email protected]
    5 points · 22 hours ago

    I don’t understand why we keep caring about this. AI “guard rails” will never be 100% effective, and beyond that, everything is a Google search away. Why does this matter?

  • @[email protected]
    7 points · 1 day ago

    So it’ll tell you how to make weapons and stuff like that? The instructions to make a pipe bomb and a shotgun are freely available, and even if they weren’t, it’s not hard to figure out.

    I don’t understand his anxiety.

    • @ChicoSuave
      1 point · 1 day ago

      It’s for the people who can’t figure it out on their own; the only thing standing between them and misusing that knowledge is a set of clear instructions.