• 𝒍𝒆𝒎𝒂𝒏𝒏
      33 points
      7 months ago

      There are also DLL mods that convert nvidia’s DLSS API to AMD FSR, in which case games usually need to be fooled into thinking the GPU is made by nvidia and not AMD
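      For example, on Linux this spoofing is often done through DXVK's config file — a sketch, assuming the game/mod checks the reported PCI vendor ID (10de is Nvidia's vendor ID):

```ini
# dxvk.conf — report an Nvidia PCI vendor ID to the game
# so the DLSS-to-FSR translation layer is accepted
dxgi.customVendorId = 10de
```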

      • DacoTaco
        3 points
        7 months ago

        How well does that work? I know FSR performs worse, so I'm wondering if that also affects the quality the way FSR does (I'd assume so)

        • 𝒍𝒆𝒎𝒂𝒏𝒏
          10 points
          7 months ago

          I use it on the Deck - it works really well, however you can definitely see the artifacting when fast motion is occurring. There are also some odd bugs when using SMAA with FSR turned on, where the frame gen model gets confused and starts moving the game UI/HUD with the camera.

          Apparently it works much better at framerates above 60FPS since the model has more data to predict future frames…

          If you have genuine DLSS available it’s probably better to stick to that IMO

          • DacoTaco
            4 points
            7 months ago

            Thanks, so it kinda works how I expected :) Still cool to see!

      • Dran
        19 points
        edited
        7 months ago

        That is usually more incompetence than malice. They write a game that requires different behaviour on AMD vs Nvidia devices and basically write:

        If Nvidia: Do X; Else if AMD: Do Y; Else: Crash;

        The idea being that if the check for AMD/Nvidia fails, there must be an issue with the check function itself. The developers didn't consider the possibility of a non-AMD/Nvidia card. This was especially true of old games: a lot of 1990s-2000s titles won't run on modern cards or modern Windows because the developers didn't program a failure mode of "just try it".
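        That pattern can be sketched in code — a hypothetical vendor check (not from any real game), shown next to the "just try it" alternative:

```python
# Hypothetical sketch of the brittle vendor check described above,
# compared with a lenient version that falls back instead of crashing.

def brittle_pick(vendor: str) -> str:
    if vendor == "NVIDIA":
        return "nvidia_path"
    elif vendor == "AMD":
        return "amd_path"
    else:
        # Intel (or anything unexpected) lands here and the game dies.
        raise RuntimeError("unknown GPU vendor")

def lenient_pick(vendor: str) -> str:
    if vendor == "NVIDIA":
        return "nvidia_path"
    elif vendor == "AMD":
        return "amd_path"
    # "Just try it": fall back to a generic path instead of crashing.
    return "generic_path"
```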

        • @[email protected]
          13 points
          7 months ago

          This is actually more stupid because it’s literally Intel’s fault

          Their own fucking XeSS crashes on their own fucking GPUs under Linux, so you have to fake the GPU and beg for it to not actually recognize that it's Intel.

          • I Cast Fist
            3 points
            7 months ago

            “Powerful graphics cards? Psshhh, who’ll ever need those?” - Intel, from 1990 to 2015

    • @[email protected]
      20 points
      7 months ago

      I was also confused about this. I didn't find an article talking about it yet either. My current assumption is that it's just this game, and that it's still a closed pre-release, so maybe they just wanted to remove noise from people who don't have the required specs during the testing phase. It's also not a lock to a specific GPU vendor, as I originally expected from the title.