• @[email protected]
    2 days ago

    The problem with the B580 is its huge driver overhead, which leads to worse performance when paired with lower-end or even midrange CPUs, which is exactly what you’d usually pair it with.

    • .Donuts
      2 days ago

      What’s an 8700K considered nowadays? And care to elaborate on what driver overhead means? A link would also be fine, just trying to inform myself as much as possible :)

      • @[email protected]
        2 days ago

        Hardware Unboxed, for example, did some benchmarks on the topic a few days ago. The issue wasn’t noticed at launch, when everyone tested with high-end processors to eliminate any bottlenecks, but it has recently been discovered.

        I would say an 8700K is maybe lower midrange, considering it’s been a while since it was released? Not sure if anyone has tested it with older Intel CPUs, since the benchmarks here are mostly with AMD parts, but the problem still applies.

        • .Donuts
          2 days ago

          Thank you so much, will check that as soon as I can

          Edit: that was really useful, it turns out older CPUs are not so feasible with Arc GPUs. Here’s a summary from the comment section that I found quite simple and elegant:

          • @[email protected]
            2 days ago

            Yep, that’s pretty much the gist of it. Driver overhead isn’t something completely new, but on the B580 it’s so high that it becomes a massive problem in exactly the use case where the card would make the most sense.


            Another, albeit smaller, issue is the idle power draw. Here is a chart (taken from this article).

            For an honest value evaluation, that also plays a role, especially for anyone planning to use the card for a long time. Peak power draw doesn’t matter as much imo, since most of us won’t push our systems to their limits most of the time. But idle power draw does add up over time, as sketched below. It also imo kind of kills it as a product for the second niche use besides budget-oriented gaming, which would be in a homelab setting for stuff like video transcoding.
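            To put rough numbers on the “adds up” part, here’s a quick back-of-the-envelope sketch. Every figure in it is an assumption for illustration (idle wattages, idle hours, electricity price), not a measured value; plug in the numbers from the chart and your own power bill instead.

            ```python
            # Back-of-the-envelope yearly cost of idle power draw.
            # All constants below are assumed placeholder values, not measurements.
            IDLE_WATTS_B580 = 35    # assumed B580 idle draw in watts
            IDLE_WATTS_RIVAL = 10   # assumed idle draw of a competing card
            IDLE_HOURS_PER_DAY = 8  # e.g. a homelab box that mostly sits idle
            PRICE_PER_KWH = 0.30    # assumed electricity price in $/kWh

            def yearly_cost(watts: float) -> float:
                """Cost of idling at `watts` for IDLE_HOURS_PER_DAY hours, 365 days a year."""
                kwh = watts / 1000 * IDLE_HOURS_PER_DAY * 365
                return kwh * PRICE_PER_KWH

            extra = yearly_cost(IDLE_WATTS_B580) - yearly_cost(IDLE_WATTS_RIVAL)
            print(f"Extra idle cost per year: ${extra:.2f}")  # ~$21.90 with these numbers
            ```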


            So as much as I am honestly rooting for Intel, and I think they are actually making really good progress in entering such a difficult market, this isn’t it yet. Maybe third time’s the charm.