Hey all! This is my first post, so I’m sorry if anything is formatted incorrectly or if this is the wrong place to ask. Recently I’ve saved up enough to upgrade my graphics card ($350 budget). I’ve heard great things about AMD on Linux and appreciate open source drivers, so as not to be at the mercy of Nvidia. My first choice was a 6700 XT, but then I heard that Nvidia has significantly higher performance in workstation tasks (not to mention the benefits of CUDA and NVENC), so I’ve been looking into a 3060 or 3060 Ti. I do a bit of gaming in my free time, but it’s not my top priority, and I can almost guarantee that any option in this price range will be more than enough for the games I play. Ultimately my questions come down to:

  1. Would Nvidia or AMD provide more raw performance on Linux in my price range?
  2. Which would be better for productivity (CUDA, encoding, etc.)? I mainly use Blender, FreeCAD, and SolidWorks, but I appreciate having extra features for any software I may use in the future.
  3. Which option would hold up best after a few years? (I’ve seen AMD improve performance with driver updates before, but the NVK driver also looks promising. I also host some servers and tend to cycle components from my main system into my Proxmox cluster.)

Also, a few more details to hopefully fill in any missing info: my current system is a Ryzen 7 3700X, GTX 1050 Ti, 32 GB RAM, an 850 W PSU, and an NVMe SSD. I’ve only ever used Nvidia cards, but AMD looks like a great alternative. As another side note, I’d be curious whether there’s any way to run CUDA apps on AMD. I plan on running my new GPU alongside my old one, so NVENC is not too much of a concern.
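
From what I’ve read so far, the usual answer for running CUDA apps on AMD is ROCm/HIP, and some software hides the difference entirely; the ROCm builds of PyTorch, for instance, keep the torch.cuda API. I haven’t tried any of this myself (PyTorch is just an example here), but my understanding is that a sanity check would look something like this:

```python
import torch

# On a ROCm build of PyTorch the torch.cuda API is backed by HIP,
# so CUDA-targeting code can pick up an AMD GPU without changes.
print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # torch.version.hip is None on a CUDA build and a version string on ROCm.
    print("HIP version:", torch.version.hip)
```

From what I can tell, Blender’s Cycles also has a HIP backend on AMD, so how well that covers my workloads is part of what I still need to research.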

Thanks in advance for any thoughts or ideas!

Edit 1: Thanks so much for all of the feedback! I’m not going to purchase a GPU quite yet, probably in a few weeks. First I’ll be testing Wayland with my 1050 Ti and researching how much I need each feature of each GPU. Thanks again for all of your feedback; I’ll update the post when I do order said GPU.

Edit 2: I made an interesting decision and actually got the Arc A770. I’d be happy to discuss exactly why, along with some of the pros and cons so far, but I do plan on eventually compiling a more in-depth review somewhere at some point.

  • eshep
    5 · 1 year ago

    @neogeo I think you may be on the right track with grabbing a newer AMD card and keeping your old Nvidia one just for the encoding stuff, if you absolutely need it. I only do fairly small drawings (mostly technical) in both Blender and FreeCAD, as well as some occasional video editing in Blender. I’ve had an RX 5600 XT since before we had proper drivers for it, and I’ve had no issues with it since they were in testing.

    • @[email protected]
      3 · 1 year ago

      I’ll preface this with: I don’t do any of the workstation tasks being mentioned here, and can only speak as a regular desktop/light gaming user, but…

      I’d agree with this take. I have an Nvidia 2080 right now, and at the start of the new month I’m looking to try to pick up a 6700 XT (I have a low budget as well, so it’s about the best I can shoot for) because I’ve hit my limit with Nvidia’s shitty Linux support. An X11 session feels like crap because the desktop itself doesn’t even seem to render at 60 FPS (or rather, not consistently, with a ton of frame drops) - and I only have two 1080p 60 Hz displays… that should be easy for this card.

      A Wayland session feels “smooth”, but is glitchy as hell for a multitude of reasons. It is only just now (as of the 17th, IIRC), with the release of their 545 beta driver, that Night Light/Night Color finally works in a Wayland session, because the driver lacked GAMMA_LUT support before that… But XWayland apps now feel even worse because of the sync situation. This is not going to be fixed until either Nvidia moves their driver to implicit sync, which won’t happen, or they actually manage to convince everyone to support explicit sync, which requires the proposal being accepted into the standard (something that will take a while) and all compositors being updated to support it.

      I’m in the opposite position from OP: I don’t do any sort of rendering/encoding, but I spend a fair amount of time gaming. The XWayland issue in particular is basically the deal breaker, since most things still run through XWayland.

      While I do hear that Nvidia is the choice for anything that needs NVENC or CUDA, the desktop side of things will feel horrible if you go with an Nvidia card as your primary, and you’ll constantly be chasing workarounds that only make it slightly better.

      I’d really rather not spend money on a new GPU right now - the 2080 I’m using is one a friend gave me at the beginning of the year, specifically because money has been really tight for me - but when you try to use your PC (and I work from home, so that’s a major factor) and you feel like you’re constantly having to fight it every. single. day just to do the basics, well… enough is enough. I’ve heard some Nvidia users say that it works perfectly fine for them, and that’s fantastic - but that has not been remotely close to my experience. It’s just compromise after compromise after compromise. I hope that the NVK driver will change things for non-workstation workflows (since I don’t imagine you’d be able to use NVENC/CUDA with NVK), but the driver isn’t ready for production use as far as I understand.

      At the very least, if you’re able to keep your AMD card as your primary and just add in the Nvidia GPU, then you can use Nvidia’s PRIME offloading feature to run applications specifically on the Nvidia GPU. This has… its own fair share of problems from what I’ve heard, but the issues I’ve seen in passing have generally been on the gaming side; I’m not 100% sure how it does for things like NVENC/CUDA. Sadly for me, I don’t believe my case/board actually has enough space for both GPUs to be in there, and even if it did, it certainly wouldn’t have enough room with my extra PCI-E WiFi adapter in there - but that’s a bridge to cross when I get there, I suppose.
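
      For what it’s worth, my understanding is that PRIME render offload is driven by a couple of environment variables that Nvidia documents (__NV_PRIME_RENDER_OFFLOAD and __GLX_VENDOR_LIBRARY_NAME). The little Python sketch below is the kind of thing I mean; it assumes glxinfo (from mesa-utils) is installed, and I haven’t verified any of this for NVENC/CUDA workloads myself:

      ```python
      import os
      import subprocess

      # Copy the current environment and add the PRIME render offload variables
      # that Nvidia documents for GLX applications.
      env = dict(os.environ)
      env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
      env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"

      # Ask glxinfo (brief mode) which GPU is doing the rendering; with offload
      # working, it should report the Nvidia card instead of the primary one.
      result = subprocess.run(["glxinfo", "-B"], env=env,
                              capture_output=True, text=True, check=True)
      for line in result.stdout.splitlines():
          if "renderer" in line.lower() or "vendor" in line.lower():
              print(line.strip())
      ```

      In theory, setting those same two variables in front of any GL/Vulkan program should push just that one process onto the Nvidia card while the desktop stays on the AMD one.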

      I guess my conclusion for the OP is: how is your current desktop experience with your 1050 Ti? If it hasn’t been a hindrance for you, then perhaps you’re fine with your current plan - but as the Linux ecosystem moves towards Wayland being the only realistic option, I do fear that Nvidia users are going to suffer a whole lot until Nvidia can finally get their act together… and I suspect there will be a massive lag time between the two.

      • @[email protected]OP
        2 · 1 year ago

        With most of my Nvidia cards, X11 works great most of the time, but Wayland is sketchy in most scenarios and sometimes just won’t boot at all on my GTX 670. I haven’t used Wayland as much as X11 (I use Wayland on most of my systems with iGPUs), and while I don’t do a ton of gaming, I do use, and love experimenting with, Linux. It sounds like AMD may provide a smoother Linux desktop, so I’ll need to take that into account for a GPU upgrade as well.

    • @[email protected]OP
      1 · 1 year ago

      At the moment I’m torn between getting an Nvidia card and waiting for NVK to be developed, or getting an AMD card and waiting for ROCm to be developed. As a side note, I realized that while I will still hold onto my 1050 Ti, I may not have enough PCIe lanes to run the new GPU at a full x16, so I may instead put the 1050 Ti in one of my Proxmox nodes (maybe use it for a Blender render cluster, idk). How have FreeCAD and Blender been with the 5600 XT? I’m just wondering if AMD may be the better long-term option because of its raw power and already existing open source drivers.

      • eshep
        1 · edited · 1 year ago

        @neogeo It’s been excellent, but again, I’m not doing very heavy work with it. Although, when I do play around with large models, it has no problem rendering them. And games such as Star Citizen, Starfield, and Cyberpunk 2077 all run fantastic when turned up to 11.