If 9070 XT raytracing on ultra at 1440p without FSR comes close to the $750 5070 Ti, they should price it at $450 if they want to exclude nVidia from all considerations and conversations. If rasterization is at 5080 level but raytracing is at 5070 level, they have to undercut the 5070, since raytracing and path tracing are the future.
Radeon can’t sell cards by making them only $100 cheaper, because that lets nVidia set the market price. Radeon needs to spend 2 to 3 generations establishing a name. Even though Arc cards don’t sell at $250, that price is the very reason a few nVidia owners are following Intel releases. Radeon selling a $1000 4080-class GPU for $800 does not get people’s attention, given the performance difference at that price. But a 4080-level GPU from Radeon for $500 has the potential to make some people dump nVidia overnight.
What does AMD have to gain by dumping prices to soothe the nVidia-first crowd?
Getting gamers to at least try an RX GPU for the first time. Both Radeon and nVidia have abandoned the $300 to $400 crowd; the 4060 is a $200 card. If AMD doesn’t stop catering to their stock price, they will turn Radeon into a third-tier GPU brand, behind Intel Arc in second place in sales.
I sincerely don’t know how Ryzen can sit on top as king, as overpriced as it is. Ryzen is definitely the best, but Radeon doesn’t matter, nobody cares, it’s not worth anything. The profits from Ryzen should be spent on Radeon R&D engineering.
ATI, sorry, I meant AMD, has tried the low-price approach. It achieved nothing in terms of market share; the green fans didn’t find lower prices and better performance enough to be swayed, and AMD only ended up hurting its bottom line. There is nothing in it for AMD.
I have a 7900 XT (a discounted open-box item) sitting around, and I need to know if it’s the better option instead of waiting for the 9070 XT to release. “Reportedly” attractively priced? Come on, AMD. Drop some info right now.
I’ll believe it when I see it, if I see them, as in available for purchase at MSRP.
lol… these peasants are getting uppity imho
Going to be skeptical until I see actual MSRP and gameplay benchmarks for both raster and RT.
But will it have enough RAM?
It says 16GB of VRAM in the first line of the article. My 8GB kills me. It’s a beast of a card, but as soon as I go over the VRAM limit, it slows to a crawl.
Their top-tier 7900 XTX had 24GB. Most AI models need at least 24GB but preferably 32GB. Guess they don’t need to try when NVIDIA isn’t either, despite it not being very expensive to do so.
Most AI models need at least 24 but preferably 32.
Where are you getting this information from? Most models with fewer than 16B parameters will run just fine with less than 24 GB of VRAM. This GitHub discussion thread for open-webui (a frontend for Ollama) has a decent reference for VRAM requirements.
I should have been more specific: the home models that actually compete with paid ones in both accuracy and speed. Please don’t be one of those people who exaggerate and pretend it works just as well with much less. It simply doesn’t.
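For anyone weighing cards by VRAM: a common rough rule of thumb (my sketch, not a figure from the thread or from Ollama/open-webui docs) is weights ≈ parameter count × bytes per weight, plus some overhead for the KV cache and activations. The function and the 20% overhead figure below are illustrative assumptions; real usage varies with context length and runtime.

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead: float = 0.2) -> float:
    """Rough VRAM estimate in GB: weight memory plus a flat overhead
    fraction for KV cache/activations. Hypothetical rule of thumb only."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 14B model quantized to 4 bits per weight fits well under 24 GB:
print(round(estimate_vram_gb(14, 4), 1))   # roughly 8.4 GB
# The same check shows why full-precision larger models want 24GB+:
print(round(estimate_vram_gb(16, 16), 1))  # roughly 38.4 GB
```

By this estimate, both sides of the argument above can be right: quantized mid-size models run comfortably in 16GB, while 16-bit or larger models are what push you into 24GB-plus territory.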
Maybe with CXL or Infinity Fabric that won’t matter as much.
I don’t know what their support looks like on the consumer side, though.