• @[email protected]
      39 months ago

      tbf you would need a pretty beefy gpu to do both rendering and ai locally.

as much as i hate to say it (because this idea sounds awesome), the tech is not there yet, and depending on the cloud for this always goes wrong.

      • @cynar
        29 months ago

        A limited LLM would run on a lot of newer gfx cards. It could also be done as a semi-online thing: if you have the grunt, you can run it locally; otherwise, you can farm it out to an online server.
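
        The "semi-online" fallback described above could be sketched roughly like this. All names here (`make_dispatcher`, `run_local`, `run_remote`, `has_local_grunt`) are hypothetical placeholders for illustration, not any real game or LLM API:

        ```python
        # Sketch of a local-first LLM dispatcher: use the player's GPU
        # when it has the grunt, otherwise fall back to a remote server.
        # All names are hypothetical; the backends are toy stand-ins.
        from typing import Callable

        def make_dispatcher(has_local_grunt: bool,
                            run_local: Callable[[str], str],
                            run_remote: Callable[[str], str]) -> Callable[[str], str]:
            """Return a generate() function that prefers local inference."""
            def generate(prompt: str) -> str:
                if has_local_grunt:
                    try:
                        return run_local(prompt)
                    except MemoryError:
                        # e.g. the model turned out not to fit in VRAM
                        pass
                return run_remote(prompt)
            return generate

        # Toy backends, just to show the dispatch behaviour:
        local = lambda p: f"[local] {p}"
        remote = lambda p: f"[remote] {p}"

        gen = make_dispatcher(has_local_grunt=True, run_local=local, run_remote=remote)
        print(gen("hello"))  # → [local] hello
        ```

        The point of routing through one `generate()` function is that the game code never has to care which backend answered.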