• @[email protected]
    38 · 1 month ago

    Any processor can run LLMs. The only questions are how fast it runs and how much RAM it has access to. And you can trade the latter for disk space if you’re willing to sacrifice even more speed.

    If it can add, it can run any model
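    The RAM-for-disk trade described above can be sketched with memory-mapped weights: instead of loading a weight matrix into RAM, leave it on disk and let the OS page it in on demand. This is a minimal illustration (file name and sizes are made up for the demo), not any particular runtime’s actual implementation:

    ```python
    import numpy as np

    # Hypothetical weight matrix; in a real model this could be far
    # larger than available RAM. Stored as a flat float32 dump on disk.
    rows, cols = 1024, 512
    weights_path = "weights.bin"  # assumed path for this demo

    # Create a dummy weight file so the example is self-contained.
    rng = np.random.default_rng(0)
    rng.standard_normal((rows, cols), dtype=np.float32).tofile(weights_path)

    # Memory-map instead of loading: pages are read from disk only
    # when touched, trading RAM for (much slower) disk access.
    W = np.memmap(weights_path, dtype=np.float32, mode="r",
                  shape=(rows, cols))

    x = np.ones(cols, dtype=np.float32)
    y = W @ x  # each matmul streams weight pages in from disk
    print(y.shape)
    ```

    Real inference runtimes do essentially this at scale, which is why a model can run (slowly) even when its weights don’t fit in memory.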

    • @surph_ninja
      6 · 1 month ago

      Yes, and a big part of AI advancement is running models on leaner hardware, using less power, with more efficient architectures.

      Not every team is working on building bigger with more resources. Showing off how much they can squeeze out of minimal hardware is an important piece of this.

    • @Warl0k3
      1 · edited · 1 month ago

      Yeah, the Church–Turing thesis holds that you could run an LLM on a Casio wrist watch (if for some reason you wanted to do that). I can’t imagine the result is anything you’d call ‘good’, though…