Thank you so much, will check that as soon as I can
Edit: that was really useful; it turns out older CPUs are not a great fit for Arc GPUs. Here's a summary from the comment section that I found quite simple and elegant:
Yep, that's pretty much the gist of it. Driver overhead isn't something completely new, but with the B580 it's so high that it becomes a massive problem in exactly the use case where the card would make the most sense.
Another, albeit smaller, issue is the idle power draw. Here is a chart (taken from this article).
For an honest value evaluation that also plays a role, especially for anyone planning to use the card for a long time.
Peak power draw doesn't matter as much imo, since most of us won't push our systems to their limit for the majority of the time. But idle power draw does add up over time, as the rough numbers sketched below show.
It also, imo, kind of kills it as a product for its second niche use besides budget-oriented gaming, which would be in a homelab setting for stuff like video transcoding.
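To put rough numbers on the "adds up over time" point, here's a minimal sketch. Every figure in it (the idle draws, idle hours, electricity price) is a placeholder assumption for illustration, not a measurement from the chart; plug in your own:

```python
# Rough extra electricity cost of a higher idle draw over time.
# All numbers below are assumptions for illustration only.

IDLE_WATTS_B580 = 30.0    # assumed B580 idle draw (W)
IDLE_WATTS_OTHER = 10.0   # assumed idle draw of a competing card (W)
IDLE_HOURS_PER_DAY = 8.0  # assumed daily idle time
PRICE_PER_KWH = 0.30      # assumed electricity price ($/kWh)

def idle_cost(watts: float, years: float) -> float:
    """Cost of idling at `watts` for `years` at the assumed rate."""
    kwh = watts / 1000.0 * IDLE_HOURS_PER_DAY * 365 * years
    return kwh * PRICE_PER_KWH

for years in (1, 3, 5):
    extra = idle_cost(IDLE_WATTS_B580, years) - idle_cost(IDLE_WATTS_OTHER, years)
    print(f"{years} yr: ~${extra:.2f} extra just from idle draw")
```

With these made-up numbers the gap works out to roughly $17 a year, so over a typical 5-year ownership it's a real, if modest, chunk of the card's price. For an always-on homelab box (24 idle hours a day instead of 8) it triples, which is exactly why the idle draw hurts that use case the most.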
So as much as I am honestly rooting for Intel and think they are actually making really good progress in entering such a difficult market, this isn't it yet. Maybe third time's the charm.