Does anyone have benchmarks on how much better the M2 is at gaming?

Specifically, I’d like to see benchmarks for native games and for Rosetta games like Frostpunk, which seem to be quite heavily CPU-bound.

I have found lots of individual benchmarks, but no direct comparison.

  • @damipereiraOP · 1 year ago

    Nice, looks like there’s no artificial limitation and it’s just a matter of horsepower. I’d love to know the difference between the M2 Max and the M2 Ultra on this, since if it’s CPU-bound they should perform about the same (a game won’t use that many extra cores).
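
    As a rough, purely illustrative Amdahl’s-law check on that intuition (the parallel fraction p below is an assumption, not a measurement):

        S(N) = 1 / ((1 - p) + p/N)

    With, say, p ≈ 0.3 of the frame work parallelizable, S(12) ≈ 1.38 vs S(24) ≈ 1.40, i.e. under a 2% gain from doubling the core count.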

    • [email protected]M · 1 year ago

      Yeah, I wouldn’t expect the difference between 12 and 24 CPU cores (or whatever it is) to be significant here. But Apple also doesn’t advertise clock speeds and such anymore, so it’s hard to reason about the single-core performance difference, if there is any at all. Maybe a browse through Geekbench results could shed some light. Unfortunately I don’t have an M2 Max machine around to test, but I could give it a spin on my M1 Max MBP if that’s helpful to you.
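
      If you want a quick local sanity check rather than Geekbench numbers, one crude option is to time the same single-threaded loop on both machines. Only the relative timings mean anything, and the loop below is an arbitrary stand-in workload I’m assuming for illustration, not a real benchmark:

      ```swift
      import Foundation

      // Crude single-core comparison: run a fixed, dependent floating-point
      // workload on one thread and time it. Comparing elapsed times across
      // machines gives a rough single-core ratio; Geekbench is far more rigorous.
      func singleCoreWorkload(iterations: Int) -> Double {
          var x = 1.0
          for _ in 0..<iterations {
              x = x * 1.0000001 + 1e-7   // dependent ops so the loop can't be skipped
          }
          return x
      }

      let start = Date()
      let checksum = singleCoreWorkload(iterations: 300_000_000)
      let elapsed = Date().timeIntervalSince(start)
      print("checksum \(checksum), elapsed \(String(format: "%.2f", elapsed))s")
      ```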

      • @damipereiraOP · 1 year ago

        That’s the one I have! And it only runs at around 30fps, with marked slowdowns when the temperature drops; it’s barely playable. I’m not going to upgrade to an M2 Max MacBook, but I was hoping a possible M3 Max might be good enough for the kind of games I’m interested in, so that I wouldn’t need a gaming PC; that’s why I wanted to see the improvement from M1 to M2.

        • [email protected]M · 1 year ago

          Wow, I’m surprised it’s that much of a difference. What resolution are you running at?

          • @damipereiraOP · 1 year ago

            I think I was running at native resolution, which is much less than what you’re running at.

            Edit: OK, I actually tried running it again at 1080p, and it’s faster than I remember; maybe there was some optimization along the way, or something else changed. I still get really bad frame drops when the temperature drops: it runs at 45fps until it drops to 10fps for 10–15 seconds while the temperature drops.

            I also checked CPU and GPU usage: the CPU is not maxed at all, but the GPU is at 100% the whole time, which might mean a faster GPU would indeed be useful (see the timing sketch at the end of this comment). Maybe at around 45fps it starts to become CPU-bound?

            Edit 2: At native resolution it runs at 30fps and drops to 4fps when the temperature drops.

            Edit 3: Running at 960x600 keeps it at 50fps, even when the temperature drops. GPU usage is still at maximum.

            Edit 4: Found the culprit. Global illumination is the setting that makes the fps drop so hard when the temperature drops. Can you check if it happens for you as well on the M2 Ultra?
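
            For reference, here’s the sketch I mentioned above: roughly how you can tell CPU-bound from GPU-bound in a Metal app you control (I can’t instrument Frostpunk itself like this; the render-loop placement is assumed). MTLCommandBuffer reports when the GPU actually executed the work, so you can compare that against how long the CPU spent encoding the frame:

            ```swift
            import Foundation
            import Metal
            import QuartzCore

            // Illustrative per-frame timing: compare CPU encode time with GPU execution time.
            let device = MTLCreateSystemDefaultDevice()!
            let queue = device.makeCommandQueue()!

            let cpuStart = CACurrentMediaTime()
            let commandBuffer = queue.makeCommandBuffer()!
            // ... encode the frame's render passes here (omitted in this sketch) ...
            commandBuffer.addCompletedHandler { cb in
                // gpuStartTime/gpuEndTime are the GPU-side execution timestamps.
                let gpuMs = (cb.gpuEndTime - cb.gpuStartTime) * 1000
                print(String(format: "GPU time: %.2f ms", gpuMs))
            }
            commandBuffer.commit()
            let cpuMs = (CACurrentMediaTime() - cpuStart) * 1000
            print(String(format: "CPU encode time: %.2f ms", cpuMs))
            commandBuffer.waitUntilCompleted()

            // If GPU time sits near the frame budget while CPU encode time is small,
            // the frame is GPU-bound; if the CPU side dominates, a faster CPU helps more.
            ```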

            • [email protected]M · 1 year ago

              I notice improved performance with that off, but only by using the Metal FPS overlay and observing that the framerate drops from 100+ to ~60 FPS with it on. It’s not really noticeable on a 60 Hz monitor either way.
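
              (In case it helps anyone reading along: the overlay here is Apple’s Metal Performance HUD, which on recent macOS can be forced on by launching a game with the MTL_HUD_ENABLED=1 environment variable. A minimal launcher sketch; the app path is a placeholder, not Frostpunk’s real install location:)

              ```swift
              import Foundation

              // Launch a game binary with the Metal Performance HUD overlay enabled.
              let process = Process()
              // Placeholder path: point this at the game's actual executable.
              process.executableURL = URL(fileURLWithPath: "/Applications/SomeGame.app/Contents/MacOS/SomeGame")

              var env = ProcessInfo.processInfo.environment
              env["MTL_HUD_ENABLED"] = "1"   // enables the Metal Performance HUD overlay
              process.environment = env

              do {
                  try process.run()
                  process.waitUntilExit()
              } catch {
                  print("Failed to launch: \(error)")
              }
              ```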

              • @damipereiraOP · 1 year ago

                So an M2 Ultra gets 100+ fps at 1080p? Nice! Yeah, makes sense that it’s not noticeable. I tried the game in CrossOver, and that setting doesn’t cause slowdowns there, so it looks like a bug in Frostpunk’s Metal code.

                • [email protected]M · 1 year ago

                  Oh, haha. No, it gets 60–80 FPS with occasional peaks above 100 at 5120x2160 :-), so I’m sure it would run far better at 1080p!