• @BlameTheAntifa
    3 hours ago

    There’s no need for huge, expensive datacenters when we can run everything on our own devices. SLMs (small language models) and local AI are the future.

    • @[email protected]
      3 hours ago

      This feels kinda far-fetched. It’s like saying “well, we won’t need cars, because we’ll all just have jetpacks to get around.” I totally agree that eventually a useful model will run on a phone. I disagree that it will happen soon enough to matter to this discussion. To give you some idea of the scale: DeepSeek’s recent model has 671B parameters, while phones are running models in the 7–14B range. So eventually what you say will be feasible, but we have a ways to go.
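
      For a rough sense of why that gap matters, here’s a back-of-envelope sketch of the memory needed just to hold the weights (ignoring KV cache and runtime overhead; the quantization levels are illustrative, not anyone’s actual deployment):

      ```python
      # Rough memory footprint of the weights alone, ignoring KV cache,
      # activations, and runtime overhead.
      def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
          return params_billions * 1e9 * bits_per_param / 8 / 1e9

      for name, params in [("DeepSeek (671B)", 671.0), ("phone-class model (7B)", 7.0)]:
          for bits in (16, 4):  # fp16 vs. aggressive 4-bit quantization
              print(f"{name} at {bits}-bit: ~{weight_memory_gb(params, bits):,.0f} GB")
      ```

      Even 4-bit quantized, the 671B model needs on the order of 300+ GB for weights alone, versus roughly 4 GB for a 7B model, which is why one lives in a datacenter and the other fits on a phone.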

      • @BlameTheAntifa
        3 hours ago

        The difference is that we’ll just be running small, specialized, on-demand models instead of huge, resource-heavy, all-purpose ones. It’s already being done: look at how Google and Apple are approaching AI on mobile devices. You don’t need a lot of power for that, just plenty of storage.
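
        As a minimal sketch of that pattern, assuming llama-cpp-python and a locally downloaded quantized model (the file path and model choice are placeholders, not anything Google or Apple actually ship):

        ```python
        from llama_cpp import Llama

        # Load a small quantized model from local storage; path is a placeholder.
        llm = Llama(model_path="./models/small-model-q4.gguf", n_ctx=2048)

        # One on-demand completion, run entirely on-device.
        out = llm(
            "Summarize in one sentence: small local models trade raw capability "
            "for privacy, latency, and cost.",
            max_tokens=64,
        )
        print(out["choices"][0]["text"])
        ```

        A few gigabytes of storage per specialized model, loaded only when needed, stands in for one giant general-purpose model.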