Hi everyone, I hope this community can help me. I recently got a new PC and was excited to try a few games on it. My setup is an MSI Katana 17 B13VGK with an RTX 4070, an i7-13620H, and 32 GB of RAM. By my understanding, it should run games with maxed-out settings at 1080p without any issue, but when I tried RDR2 I was barely averaging 10 fps in the performance test. The game is set to use my dedicated GPU rather than the integrated one, and none of the tweaks I tried (such as deleting the pipeline cache files) changed anything.

If you have any ideas, I’m open to anything that could help me enjoy this game! Thanks in advance.

  • Majorllama
    6 days ago

    It’s been a long time since I did any laptop gaming, but I think you need to be plugged into power for the dedicated GPU to run at full speed. Aside from that, I would check all the power settings for the system. Make sure it isn’t in some eco mode or anything else silly.
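    One quick way to confirm the 4070 is actually doing the work is to watch its utilization while the game runs. Here is a rough sketch in Python; it just shells out to nvidia-smi, which ships with the Nvidia driver, so it assumes that tool is on your PATH.

    ```python
    # Poll the NVIDIA dGPU's utilization and power draw once a second.
    # Assumes nvidia-smi is on PATH (it installs with the NVIDIA driver).
    import subprocess
    import time

    for _ in range(30):  # watch for ~30 seconds while the game runs
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=name,utilization.gpu,power.draw",
             "--format=csv,noheader"],
            capture_output=True, text=True,
        )
        print(result.stdout.strip())
        time.sleep(1)
    ```

    If utilization sits near 0% while RDR2 is rendering, the game is running on the integrated GPU no matter what the selector says.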

    I have a desktop with roughly equivalent hardware that I played RDR2 on at 1440p, and I was able to crank everything and still hold 60+ fps consistently. It’s been a while so I forget my exact numbers, but I think in most areas I was able to stay over 100. It’s really only the cities and other densely populated areas where my fps drops a bit.

    Edit: I forgot to ask whether you have run any benchmarks in other games or if you have only tried RDR2. That way you can at least determine whether it’s a Red Dead 2 issue or a system-wide issue.

    • @[email protected] (OP)
      6 days ago

      Thanks for answering! I kept my laptop plugged in, and as for other games, I was able to run both Helldivers 2 and Sea of Thieves absolutely maxed out at 1080p with more than 100 fps.

      This is why I asked here: there really seems to be a specific issue with RDR2 that I can’t figure out.

      • Majorllama
        6 days ago

        My only other guesses would be switching it from running on Vulkan to DirectX.

        Or: Settings --> Graphics --> Video --> Output Adapter (toggle).

        If it’s on 0, switch to 1, or vice versa.
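        If the in-game toggle won’t stick, or the game crashes before you can change it, the API can also be flipped in RDR2’s config file. Below is a rough sketch; I’m going from memory here, so treat the path (Documents\Rockstar Games\Red Dead Redemption 2\Settings\system.xml) and the kSettingAPI_* values as things to verify against your own file first.

        ```python
        # Flip RDR2's graphics API from Vulkan to DX12 in system.xml.
        # NOTE: the path and the kSettingAPI_* tag values are from memory --
        # open the file and confirm them before running. Back it up first!
        from pathlib import Path
        import shutil

        settings = (Path.home() / "Documents" / "Rockstar Games"
                    / "Red Dead Redemption 2" / "Settings" / "system.xml")

        # Keep a backup copy next to the original.
        shutil.copy(settings, settings.with_name(settings.name + ".bak"))

        text = settings.read_text(encoding="utf-8")
        if "kSettingAPI_Vulkan" in text:
            settings.write_text(
                text.replace("kSettingAPI_Vulkan", "kSettingAPI_DX12"),
                encoding="utf-8",
            )
            print("Switched to DX12; launch the game to test.")
        else:
            print("Vulkan tag not found -- check the file by hand.")
        ```

        Swap the two strings to go back the other way.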

        • @[email protected] (OP)
          6 days ago

          Thank you again for your time! The output adapter is set to 1, the Nvidia GPU. As for DirectX, the game starts with an error telling me to update my drivers, even though I made sure they were all on the latest version… If you have any other questions or ideas, I’m up for it!
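          In case it helps anyone else, this is how I double-checked the installed driver version; it’s just a quick sketch that shells out to nvidia-smi, which comes with the Nvidia driver.

          ```python
          # Print the NVIDIA driver version that nvidia-smi reports.
          import subprocess

          result = subprocess.run(
              ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
              capture_output=True, text=True,
          )
          print(result.stdout.strip())
          ```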