• @Sanctus
    4 points · 10 months ago

    Offline AI models that could facilitate NPC interaction do exist. We know studios won't use them and will go with OpenAI instead, but it's possible. I see indie games nailing it while triple-A studios fumble trying to monetize it.
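
    As a toy illustration of what "offline" could look like, here's a minimal sketch using llama-cpp-python against a locally stored model. The model path, persona, and prompt format are all made-up placeholders, not from any shipping game:

    ```python
    # Minimal sketch: offline NPC dialogue via a locally hosted model.
    # "models/npc-7b-q4.gguf" and the prompt template are hypothetical.
    from llama_cpp import Llama

    llm = Llama(model_path="models/npc-7b-q4.gguf", n_ctx=2048)

    def npc_reply(persona: str, player_line: str) -> str:
        prompt = (
            f"You are {persona}. Stay in character and answer briefly.\n"
            f"Player: {player_line}\n"
            f"NPC:"
        )
        # Stop generating when the model starts writing the player's next turn.
        out = llm(prompt, max_tokens=64, stop=["Player:", "\n\n"])
        return out["choices"][0]["text"].strip()

    print(npc_reply("a grumpy blacksmith", "Can you repair my sword?"))
    ```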

    • @[email protected]
      1 point · 10 months ago

      I hope offline LLMs see some improvement soon, especially with regard to VRAM usage.

      My 1080 Ti (11 GB VRAM) struggles with anything over about 6B parameters, and in my opinion models that small are unusable. Honestly, even 13B is borderline for most models I've used.

      Of course, the other solution would be to increase VRAM on cards, but the GPU manufacturers don’t seem to be on board with that idea.
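
      For what it's worth, a back-of-the-envelope calculation shows why 11 GB is so tight. Assuming weight memory is roughly parameters × bytes per weight (my simplification: KV cache, activations, and framework overhead are ignored), quantization is what makes 13B plausible at all on that card:

      ```python
      # Rough VRAM needed for model weights alone (assumption: weights
      # dominate; KV cache and activations are not counted).
      def weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
          return params_billions * 1e9 * bits_per_weight / 8 / 1024**3

      for params in (6, 13):
          for bits in (16, 8, 4):
              print(f"{params}B @ {bits}-bit: ~{weight_vram_gb(params, bits):.1f} GB")

      # 6B @ 16-bit is ~11.2 GB: already over an 11 GB card.
      # 13B @ 4-bit is ~6.1 GB: fits, with headroom for context.
      ```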

      • @Sanctus
        3 points · 10 months ago

        Maybe with some more maturity we'll see this become viable, because you're right: if we hypothetically did it right, most machines would be unable to run it due to VRAM consumption alone. Always-online is one way to offload that, but it breaks my code of ethics; I refuse to make online-only games unless it's an MMO.

        • conciselyverbose
          2 points · 10 months ago

          As a dev, you could ship a game with these features behind a subscription, while also making the code available to self-host and to host for others.

          You could also configure it so that much of the generation isn't done in real time: periodic connectivity is needed for the feature to work fully, but persistent access is not, and reasonable fallbacks keep the game playable without it. You could even open the protocols, even if the source isn't open, to make it easy for others to replace your server as the tech develops. A sketch of that shape follows.
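
          To make that concrete, here's a sketch of the periodic-connectivity idea: batch-refresh dialogue whenever a server is reachable, and fall back to canned lines when it isn't. The server URL and JSON response shape are hypothetical:

          ```python
          # Sketch: periodic batch generation with an offline fallback.
          # SERVER_URL and the response format are hypothetical.
          import json, random, urllib.request

          SERVER_URL = "https://dialogue.example.com/generate"
          FALLBACK_LINES = ["Hmm.", "Nothing more to say.", "Safe travels."]
          cache: dict[str, list[str]] = {}

          def refresh_dialogue(npc_id: str, context: str) -> None:
              """Call whenever connectivity is available; failure is non-fatal."""
              req = urllib.request.Request(
                  SERVER_URL,
                  data=json.dumps({"npc": npc_id, "context": context}).encode(),
                  headers={"Content-Type": "application/json"},
              )
              try:
                  with urllib.request.urlopen(req, timeout=5) as resp:
                      cache[npc_id] = json.load(resp)["lines"]
              except OSError:
                  pass  # offline: keep whatever was cached last time

          def next_line(npc_id: str) -> str:
              # Serve cached generated lines first, canned lines otherwise.
              lines = cache.get(npc_id)
              return lines.pop(0) if lines else random.choice(FALLBACK_LINES)
          ```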

          Most of the things driving always-online connections to publisher servers are deliberate choices to exert control, not things that couldn't be done another way.