I’ve been playing with the largest models I can get running, and I’ve been using LibreWolf or Firefox, but these use several gigabytes of system memory. What options exist with less overhead? I’m mostly looking to maximize the memory available for model training as I’m learning. The obvious solution is Python in a terminal, but I need a hiking trail, not free solo rock climbing.

    • @[email protected]

      Absolutely right. I just tried it on the browsers installed on my system, loading this page:

      Firefox: 560MiB
      Epiphany (GNOME Web): 226MiB
      elinks: 16MiB
      lynx: 14MiB

      Looks like lynx is the winner

      (Sidenote: This isn’t really a fair fight for Firefox since it’s my daily driver, with extensions installed and a bunch of stuff cached. I’m guessing even a fresh install wouldn’t get below 300MiB, though)
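For anyone who wants to reproduce the comparison above, a rough sketch of how to sum a browser’s resident memory on Linux (the process name and the `ps`/`awk` pipeline are just one way to do it; this assumes procps `ps` is available):

```shell
# Sum the resident set size (RSS, in KiB) of every process matching a name.
# Multi-process browsers like Firefox spawn many content processes, so
# summing across all of them matters. Swap "firefox" for lynx, elinks, etc.
ps -C firefox -o rss= | awk '{sum += $1} END {printf "%.0f MiB\n", sum/1024}'
```

Note the RSS numbers overstate things slightly when processes share memory pages, but they are good enough for a rough comparison like this one.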

      • @j4k3OP

        deleted by creator

  • Bloody Harry

    Not what you’re asking for, but how about putting the web browser and the page rendering on a different machine? This way your main machine can focus on calculating.

    Edit: If the pages are super simple, there’s “web browsers” which do work on the command line which can render simple pages in a very crude way.
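As a sketch of the remote-machine idea, assuming plain SSH with X11 forwarding is available on both ends (the hostname and user here are hypothetical):

```shell
# Run the browser on another box and forward its window over SSH.
# "render-box" is a hypothetical host; the remote side needs X11
# forwarding enabled (X11Forwarding yes in sshd_config) and the browser
# installed. The heavy rendering then happens on render-box, not locally.
ssh -X user@render-box firefox
```

The window appears on the local display, but the browser’s memory and CPU use stay on the remote machine, which is the point of the suggestion above.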

  • AggressivelyPassive

There’s a reason these browsers use that much memory. Something is living there, and it’s not just overhead. You can’t realistically reduce it by a meaningful amount just by switching to another browser while retaining the same functionality.