• @[email protected]
    3
    6 months ago

    Yea, let’s just slap the missile equivalent of ChatGPT on a bunch of drone missiles, what could go wrong? /s

    Seriously though, what happens if the AI driving the drone hallucinates? I wouldn’t want to be anywhere near these things when they’re testing them.

    • mozzOP
      14
      6 months ago

      I highly doubt they are putting LLMs on their little throwaway drones. The US military has actually been working on “let’s figure out what that thing is and blow it up automatically” technology since at least as far back as the 90s; e.g. modern warship defense systems use it to be able to react faster than a human can to blow up an incoming missile.

      Personally I am much more worried about it working exactly as intended.

      • @disguy_ovahea
        3
        6 months ago

        You are correct. Large language models like ChatGPT are a subset of deep learning, which is a subset of machine learning. Common examples of simple machine learning software are facial recognition, social media algorithms, speech-to-text, and predictive text.

        There is no reason to include software as complex, resource-intensive, or experimental as an LLM when dedicated ML will suffice.