I got an additional 32 GB of RAM at a low, low cost from someone. What can I actually do with it?

        • @grue

          In my case, it’s less about being able to open more Firefox tabs and more about Firefox being able to go longer between crashes due to a memory leak. (I know, I know… Firefox doesn’t have memory leaks anymore. It’s probably due to an extension or some bad JavaScript in one of my perpetually-open sites or something. One of these days I’ll get around to troubleshooting it…)

  • @[email protected]

    Here’s what you can do with your impressive 64 GB of RAM:

    Store approximately 8.1 quintillion (that’s 8,100,000,000,000,000) zeros! Yes, that’s right, an endless ocean of nothingness that will surely bring balance to the universe.

    • @yoevli

      Unless something’s gone over my head here, this is off by around 6 orders of magnitude.
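
      Back-of-the-envelope check (treating one stored zero as one byte, which is my own assumption):

      ```python
      # How many one-byte zeros fit in 64 GiB, versus the figure quoted above.
      bytes_available = 64 * 2**30        # 64 GiB in bytes, ~6.87e10
      claimed = 8_100_000_000_000_000     # the written-out figure, 8.1e15

      print(f"{bytes_available:.2e} zeros actually fit")
      print(f"overstatement: ~{claimed / bytes_available:.1e}x")  # ~1.2e5, i.e. about 5 orders
      ```

      Taking “quintillion” literally (8.1e18) instead of the written-out number pushes the gap to roughly eight orders of magnitude, so the real gap is somewhere in that five-to-eight range depending on which figure you take.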

  • grandel

    Sell it at a medium, medium cost to somebody who needs it

  • @[email protected]
    • Compressed swap (zram) - see the quick sketch after this list

    • Compiling large C++ programs with many threads

    • Virtual machines

    • Video encoding

    • Many Firefox tabs

    • Games
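
    On the zram point - not a setup guide, just a minimal sketch of how to check what an already-configured zram device is buying you (assumes /dev/zram0 exists, e.g. set up via zram-tools or systemd’s zram-generator):

    ```python
    # Read zram compression stats from sysfs (fields documented in the kernel's
    # zram docs): orig_data_size, compr_data_size, mem_used_total, ...
    from pathlib import Path

    fields = Path("/sys/block/zram0/mm_stat").read_text().split()
    orig, compressed, used = (int(x) for x in fields[:3])

    print(f"original data : {orig / 2**20:9.1f} MiB")
    print(f"compressed to : {compressed / 2**20:9.1f} MiB")
    print(f"actual RAM use: {used / 2**20:9.1f} MiB")
    if compressed:
        print(f"ratio         : {orig / compressed:.2f}x")
    ```

    The first three numbers are what matter: how much data was swapped out, what it compressed to, and the RAM actually spent holding it.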

  • @BradleyUffner

    Keep (checks math) 3 more tabs open in Chrome.

    • slazer2au

      One Docker container per VM, just to maximise the RAM usage.

      • Onno (VK6FLAB)

        I realise that you are making a joke, but here’s what I used it for:

        • Debian VM as my main desktop
        • Debian VM as my main Docker host
        • Windows VM for a historical application
        • Debian VM for signal processing
        • Debian VM for a CNC

        At times only the first two or three were running. I had dozens of purpose built VM directories for clients, different hardware emulation, version testing, video conferencing, immutable testing, data analysis, etc.

        My hardware failed in June last year. I didn’t lose any data, but the hardware has proven hard to replace. Mind you, it worked great for a decade, so, swings and roundabouts.

        I’m currently investigating, evaluating and costing running all of this in AWS. Whilst it’s technically feasible, I’m not yet convinced of actual suitability.

        • @[email protected]

          “costing running all of this in AWS”

          The cost will be oh, so much more than you’re expecting. I have not been at a shop where they didn’t later go “oh shit. Repatriate that stuff so it doesn’t cost us a mint.”

      • @[email protected]

        I unironically do this in Proxmox. Keeps things nice and separate, and I still have plenty of RAM left.

        • slazer2au

          Any reason for not using LXC, since Proxmox has native support for it?

          • Onno (VK6FLAB)

            In my case, I’m not a fan of running unknown code on the host. Docker and LXC are ways of running a process in a virtual security sandbox. If the process escapes the sandbox, it’s on your host.

            If it escapes inside a VM, that’s another layer it has to penetrate to get to the host.

            It’s not perfect by any stretch of the imagination, but it’s better than a hole in the head.

  • @[email protected]

    700 Chrome tabs, a very bloated IDE, an Android emulator, a VM, another Android emulator, a bunch of Node.js processes (and their accompanying Chrome processes)

  • linuxgator

    You could use it to finally level off that wobbly table in the kitchen.

  • @Jesus_666

    Run a fairly large LLM on your CPU so you can get the finest of questionable problem solving at a speed fast enough to be workable but slow enough to be highly annoying.

    This has the added benefit of filling dozens of gigabytes of storage that you probably didn’t know what to do with anyway.
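
    If anyone wants to try it, here’s a minimal CPU-only sketch using llama-cpp-python (the model path and thread count are placeholders - any GGUF model that fits in RAM will do):

    ```python
    # CPU-only local inference via llama-cpp-python.
    # Assumes: pip install llama-cpp-python, plus a GGUF model file on disk.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/some-model.Q4_K_M.gguf",  # placeholder path
        n_ctx=4096,       # context window; larger contexts use more RAM
        n_threads=8,      # match your physical core count
        n_gpu_layers=0,   # keep everything on the CPU/RAM side
    )

    out = llm("Explain, briefly, why extra RAM helps with CPU inference.",
              max_tokens=200)
    print(out["choices"][0]["text"])
    ```

    Speed will be exactly as advertised above: workable, and annoying.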

  • spicy pancake

    Folding@home!

    https://foldingathome.org/

    You can essentially donate your processing power to various science projects that need it to compute protein folding simulations. I used to run it whenever I wasn’t actively using my PC. This does cost electricity and increases the rate of wear and tear on the device, as with any sustained high computational load. But it’s cool! :]

    • Rikudou_Sage

      Does an additional 32 GB of RAM actually help there? I’d assume this is mostly CPU-intensive work.

      • spicy pancake

        Looking into it, it seems like you’re actually right. It looks like it runs best with a solid GPU. There may be other distributed computing projects better suited to abundant RAM.

  • zkfcfbzr

    I have 16 GB of RAM and recently tried running local LLM models. Turns out my RAM is a bigger limiting factor than my GPU.

    And, yeah, Docker’s always taking up 3-4 GB.

    • Mubelotix

      Either you use your CPU and RAM, or your GPU and VRAM.

      • zkfcfbzr

        Fair, I didn’t realize that. My GPU is a 1060 6 GB so I won’t be running any significant LLMs on it. This PC is pretty old at this point.

        • @[email protected]

          You could potentially run some smaller MoE models, as they don’t take up too much memory while running. I’d suspect the DeepSeek R1 8B distill with some quantization would work well.
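
          Rough back-of-the-envelope for why quantization is the deciding factor (my own approximation - it ignores the KV cache and runtime overhead):

          ```python
          # Approximate weight memory: parameter count x bytes per weight.
          def approx_weight_gib(params_billion: float, bits_per_weight: int) -> float:
              return params_billion * 1e9 * (bits_per_weight / 8) / 2**30

          for bits in (16, 8, 4):
              print(f"8B model @ {bits:>2}-bit ~= {approx_weight_gib(8, bits):.1f} GiB of weights")
          # prints roughly 14.9, 7.5 and 3.7 GiB respectively
          ```

          So a 4-bit 8B model squeezes under a 6 GB card (or comfortably into 16 GB of system RAM), while the unquantized version doesn’t come close.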

          • zkfcfbzr

            I tried out the 8B DeepSeek and found it pretty underwhelming - the responses were borderline unrelated to the prompts at times. The smallest I had any respectable output with was the 12B model - which I was able to run, at a somewhat usable speed even.

  • @[email protected]

    The best thing about having a lot of RAM is that you can keep a ton of apps open, with a ton of windows, without having to close anything or slow the machine down. I have an unreasonable number of browser windows and tabs open because that’s my equivalent to bookmarking something to come back and read later. It’s similar to being the kind of person for whom stuff accumulates on flat surfaces because you just set things down intending to deal with them later. My desk is similarly cluttered with books, bills, accessories, etc.

    • @scarilog

      Yeah, this is exactly me. Also, a quick tip: if you’re on Windows, there are some registry tweaks you can do to help keep the GUI from slowing down when lots of programs are open at once.
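
      One commonly cited example (not necessarily the exact tweaks meant here) is the desktop heap - the SharedSection values - which can run out when a huge number of windows are open. Here’s a read-only peek at the current setting; actually changing it is at-your-own-risk territory, so back up the registry first:

      ```python
      # Read (not modify) the desktop heap configuration. The SharedSection
      # triple lives inside the "Windows" value of this key; the second number
      # is the per-interactive-desktop heap size in KB.
      import winreg

      key_path = r"SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems"
      with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
          value, _ = winreg.QueryValueEx(key, "Windows")

      shared = [part for part in value.split() if part.startswith("SharedSection=")]
      print(shared[0] if shared else "SharedSection not found")
      ```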