I’m looking for a machine to run OpenGPT, Stable Diffusion, and Blender. I’m on the precipice of buying an Alienware with a Ryzen 9 and a Radeon RX 6850M. I’ve never needed anything near this level on Linux and I’m scared, TBH. I’d much rather get a System76, but the equivalent hardware has Nvidia and costs more than twice as much. While skimming for issues with current hardware, I saw something about a Legion laptop that could only use Intel RAID for the file system, and that this was a nightmare with generic distro kernels. What other stuff like this is happening with current laptop hardware?
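
From what I’ve read, that Intel RAID issue is the Intel VMD/RST controller hiding the NVMe drive from kernels that don’t carry the vmd driver (or until the firmware is switched to plain AHCI/NVMe mode). If you can boot a live USB on a machine, a rough way to check for it, just a sketch that shells out to lspci, would be something like:

```python
# Rough check for Intel VMD/RST ("Intel RAID") from a live session: if lspci lists a
# "Volume Management Device", the NVMe drive can be hidden behind it unless the kernel
# ships the vmd driver or the firmware is switched to plain AHCI/NVMe mode.
import subprocess

lspci = subprocess.run(["lspci"], capture_output=True, text=True).stdout
vmd = [line for line in lspci.splitlines() if "Volume Management Device" in line]

if vmd:
    print("Intel VMD controller found; generic installer kernels may not see the NVMe drive:")
    print("\n".join(vmd))
else:
    print("No Intel VMD controller visible; storage should appear as a normal NVMe device.")
```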

I can barely manage a Gentoo install by following the handbook, understanding a third of it, and taking a few weeks to get sorted.

I spent all of yesterday afternoon sorting through all of the Linux hardware data in this Stable Diffusion benchmark telemetry: https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html

The full dataset has just over 5k entries, 699 of which are valid Linux entries (not counting WSL). It contains no entries for a Radeon RX 6850M. I’m super nervous about buying a laptop that costs as much as my first car. I never want to run Windows again. What resources can I check to boost my confidence that this is going to work on Fedora WS?

If anyone is interested, the SD GitHub dataset has the following number of entries per AMD card model (a rough sketch of how I tallied these is after the lists):

- _3 RX 5700 XT /8GB
- _2 RX _580 __ /4GB
- _3 RX _580 __ /8GB
- 15 RX 6600 XT /8GB
- _1 RX 6650 XT /8GB
- 31 RX 6700 XT /12GB
- 10 RX 6750 XT /12GB
- 10 RX 6800 __ /16GB
- 19 RX 6800 XT /16GB
- 15 RX 6900 XT /16GB
- _9 RX 6950 XT /16GB
- _7 RX 7900 XT /20GB
- 39 RX 7900XTX /24GB
- _6 RX VEGA __ /8GB

Other common cards used in Linux and in this dataset are:

NVIDIA
- 39 A100-SXM4 /79GB
- 20 GTX-1070 /8GB
- 11 GTX-1080Ti /11GB
- 13 H100-PCIe /79GB
- 12 RTX-2070 /8GB
- 12 RTX-2080 Ti /22GB
- 31 RTX-3060 /12GB
- 16 RTX-3070 /8GB
- 10 RTX-3080 /10GB
- 39 RTX-3090 /24GB
- 11 RTX-3090 Ti /24GB
- 10 RTX-4070 Ti /12GB
- 87 RTX-4090 /24GB
- 27 RTX-A4000 /16GB
- 15 RTX-A5000 /24GB
TESLA
- 26 T4 /15GB
- 11 V100S-PCIE /32GB
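
In case anyone wants to repeat the tally, here is roughly what I did, assuming the benchmark table has been exported to a local CSV (benchmark.csv is just a placeholder name) and that the platform and GPU columns are called something like "system" and "device" (adjust to whatever the real export uses):

```python
# Count benchmark entries per GPU among native Linux rows (dropping WSL results).
# Assumes a local export of the benchmark table; the file name and the "system" /
# "device" column names are placeholders -- adjust them to match the real export.
import csv
from collections import Counter

counts = Counter()
with open("benchmark.csv", newline="") as f:
    for row in csv.DictReader(f):
        system = row.get("system", "").lower()
        if "linux" in system and "wsl" not in system:
            counts[row.get("device", "unknown")] += 1

for device, n in counts.most_common():
    print(f"{n:3d}  {device}")
```
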
  • @simple · 1 year ago

    If you’re going to do AI stuff you have to go with Nvidia. AMD is quite bad at it, and in some cases doesn’t support technologies like Stable Diffusion at all.

    I’d recommend a 3070 at least. You’ll need the VRAM.
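
    Whatever card you end up with, a quick way to sanity-check that PyTorch actually sees it and how much VRAM it reports (just a sketch; it should work for both the CUDA and ROCm builds, since ROCm is exposed through the same torch.cuda API):

    ```python
    # Sanity check: does PyTorch see the GPU, and how much VRAM does it report?
    # Works for CUDA builds and, as far as I know, ROCm builds, which reuse torch.cuda.
    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        backend = "ROCm" if torch.version.hip else "CUDA"
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM ({backend})")
    else:
        print("No GPU visible to PyTorch; check the driver / ROCm or CUDA install.")
    ```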

    • @j4k3 (OP) · 1 year ago

      So a 3080 Ti at 16GB in a laptop?

      • @simple · 1 year ago

        If you’re going high end, that does sound pretty good.