• @chrash0
    15 · 7 months ago

    there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Copilot, since those models are enormous. “AI” generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

    not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame

    • @UnderpantsWeevil
      -1 · 7 months ago

      AI generally means machine learned neural networks these days

      Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural-net optimization. I can’t imagine needing that kind of configuration for my internet browser.

      not sure how they’re going to handle low-resource machines

      One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn’t any better. Do I really want Firefox chewing hundreds of MB of memory so it can… what? Simulate a 600-processor cluster doing weird finger art?
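      (For scale, a quick back-of-envelope sketch of what a small on-device model’s weights alone would cost in memory. The 250M parameter count is a hypothetical, just to put numbers on the “hundreds of MB” claim; actual footprints vary with runtime overhead and precision.)

      ```python
      # Rough memory estimate for the weights of a small on-device language model.
      # num_params is a hypothetical figure chosen for illustration.

      def model_memory_mb(num_params: int, bytes_per_param: float) -> float:
          """Approximate size of the weights alone, in MiB."""
          return num_params * bytes_per_param / (1024 ** 2)

      small_model = 250_000_000  # e.g. a ~250M-parameter model (assumption)

      for label, bytes_pp in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
          print(f"{label}: ~{model_memory_mb(small_model, bytes_pp):.0f} MiB")
      ```

      So even an aggressively quantized small model sits in the low hundreds of MiB at 8-bit, which is exactly the order of magnitude being argued about here.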

      • @chrash0
        10 · 7 months ago

        i mean, i’ve worked in neural networks for embedded systems, and it’s definitely possible. i share your skepticism about overhead, but i’ll eat my shoes if it isn’t opt-in

        • @UnderpantsWeevil
          -1 · 7 months ago

          I don’t doubt it’s possible. I’m just not sure how it would be useful.

      • @iopq
        7 · 7 months ago

        I use my local machine for neural networks just fine