• @[email protected]
    15 · 8 months ago

    But AI has no actual intelligence of its own. It’s not going to just magically figure things out. All it can do is spit back what it has been fed.

    • @[email protected]
      1 · edited · 8 months ago

      That’s 100% not how AIs work. Not even LLMs.

      The whole point of AI models is that they generalize beyond their training data. Otherwise they couldn’t do anything.

    • @cm0002
      -3 · 8 months ago

      “All it can do is spit back what it has been fed.”

      Those who say these things severely underestimate what AI is capable of, or will be capable of in short order, or they just don’t understand how these models work and why.

      But setting that aside, I’m not saying we’ll be able to feed an AI raw decompiled firmware and have it spit out a fully functional emulator in an hour.

      But in the near future we might be able to feed it raw decompiled firmware and have it map proprietary, undocumented syscalls in a few minutes. That would knock out a big chunk of work that can otherwise take months, if not years. (Rough sketch of the idea at the end of this comment.)

      A decent AI model could significantly lower the barrier to entry for emulator development from “a handful of elite hackers and programmers” to a much wider pool of developers.
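
      Just to make the idea concrete, here’s a rough sketch of the kind of workflow I mean. It’s purely hypothetical: the endpoint, model name, and file layout are made up, and the point is only “loop over decompiled stubs and ask a model to label each one,” not a real tool.

      ```python
      # Hypothetical sketch only: the endpoint, model name, and JSON layouts
      # below are placeholders, not a real service or tool.
      import json

      import requests

      API_URL = "https://example.invalid/v1/complete"  # placeholder model endpoint


      def guess_syscall(decompiled_stub: str) -> str:
          """Ask the model for a likely name/description of one decompiled syscall stub."""
          prompt = (
              "Here is a decompiled syscall stub from proprietary console firmware.\n"
              "Suggest a likely name and a one-line description:\n\n" + decompiled_stub
          )
          resp = requests.post(API_URL, json={"model": "placeholder", "prompt": prompt})
          resp.raise_for_status()
          return resp.json()["text"]  # assumed response shape


      # Walk every stub the decompiler exported and build a rough syscall map.
      with open("decompiled_stubs.json") as f:
          stubs = json.load(f)  # e.g. {"0x2001": "int sub_2001(int fd, void *buf) { ... }"}

      syscall_map = {num: guess_syscall(code) for num, code in stubs.items()}
      print(json.dumps(syscall_map, indent=2))
      ```

      The model’s guesses would obviously still need a human to verify them, but that’s a very different job from documenting every syscall from scratch.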

      • @Cypher
        3 · 8 months ago

        I see you don’t understand what an LLM is or how they operate, and don’t appreciate the kind and volume of training data that is required.

        • @[email protected]
          -2 · 8 months ago

          So you do not believe AI will eventually be able to near-instantly code anything you desire, including an emulator?

          • @Cypher
            3 · 8 months ago

            With current models? No. See my points above, especially the one about the volume of data required.

            Reverse engineering firmware is extremely niche, even more so for emulation. There are so few examples that current AI models wouldn’t have enough training data to work off of.