I still don’t see why this console makes any sense.
It makes sense if you’re a brand wanting to make more money from people who will just buy the latest thing, regardless of whether it’s a big enough upgrade.
It also makes sense if you’re a brand that has been selling an outdated 8-year-old brick that peaked in usefulness years ago, to the point that literally any upgrade at all would be worth it lol.
Is this a tedious “PC Master Race” comment?
I don’t know, do you see a reference to PCs anywhere in it?
I don’t really follow consoles, but I’ll take a guess based on the limited information about the thing in the article.
If you figure that PC and various console hardware have converged to a fair degree over the decades, and that stuff is gonna get ported around anyway, it’s hard to differentiate yourself on game selection or hardware features. Plus you’ve got antitrust regulators going after console vendors buying games to be exclusives, which tamps that down too.
So okay, say what you can compete on is, in significant part, how well you run what is more or less the same set of games. Most games already have rendering code that can scale pretty well with hardware on the PC.
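To illustrate the kind of scaling most PC rendering code already does, here’s a minimal sketch of dynamic resolution scaling; the frame-time budget, step size, and clamp bounds are hypothetical, not from any actual engine:

```python
TARGET_FRAME_MS = 16.7  # hypothetical budget for 60 fps

def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the internal render resolution up or down so the GPU keeps
    hitting the frame-time budget; the result is then upscaled to the
    display resolution. Stronger hardware settles at a higher scale."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget: render fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # headroom: render more pixels
        scale += 0.05
    return min(1.0, max(0.5, scale))              # clamp between 50% and native
```

Run the same game on faster hardware and the loop converges toward native resolution on its own, which is why the same codebase can look better on a beefier box without extra porting work.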
It might make sense to make sure that you have faster rendering hardware so that it’s your version that looks the nicest (well, or at least second nicest, hard to compete with the PC’s hardware iteration time for users willing to buy the latest-and-greatest there).
Let me extrapolate one further. If that’s the direction of things, it might even make sense for console vendors to ship the GPU in some kind of cartridge: a durable, idiot-proof upgrade you don’t have to open the console to perform, letting users take their console to the next gen mid-lifecycle at a lower cost than buying a whole new console. Controllers haven’t changed all that much, and serial compute capabilities aren’t improving much annually. The thing that is improving at a good clip is parallel compute.
Having an APU is part of how they hit the price points they do. A separate GPU would cost more on its own and would need its own memory instead of the shared pool, costing even more, and the end result would be meaningfully more expensive for customers to upgrade than selling their old system and buying a Pro.
Good point. I’m VERY curious to see what Lunar Lake can do with DDR6.
It makes sense if you don’t have a PS4.
(It would never cross my mind that backwards compatibility would be seen as a bad point… or at least that’s what I’m afraid Sony is starting to believe lol.)
Honestly, I would get a Steam Deck first, and that wasn’t my line of thinking back then… But guess which of the two units is sold in Mexico ¯\_(ツ)_/¯
Why not? The PS5 is good at pretending to do 4K, but it’s very much on the anemic side of graphics power for 4K gaming. Why wouldn’t people want a performance bump if it’s available and they can afford to upgrade?
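For context on why native 4K is such a heavy lift, the pixel counts alone tell the story (simple arithmetic, nothing console-specific):

```python
# Pixel counts for common render resolutions; native 4K pushes
# 2.25x the pixels of 1440p and 4x the pixels of 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["4K"] / pixels["1080p"])  # 4.0
print(pixels["4K"] / pixels["1440p"])  # 2.25
```

That 2.25–4x jump in per-frame work is why consoles in this class render internally at a lower resolution and upscale rather than drive 4K natively.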
I feel like social media just loves to jump on the hate bandwagon, fortunately the real world is nothing like this.
Do you understand why new graphics cards make sense? Or why a new CPU makes sense?
That’s what you put in the next console generation, or your PC, not a console that was already released. That’s not how consoles work.
The whole point of a console vs. just building a PC is a homogeneous ecosystem for developers: everybody gets the exact same experience because everyone has the exact same device, with the same CPU, GPU, etc., across the whole generation (which also lets developers hone their skills on that hardware over the years and get more out of it).
If you’re going to take that core benefit away, why not just build a PC at that point…
For what games?
Yeah, but this isn’t that; this is watering down a lineup that historically saw major technology jumps between hardware releases.
Well, we don’t even know the specs yet, so any technological improvements are speculation. Either way, I think it would be odd for the console manufacturers not to release a new model after several years. Anyone buying a PS5 for the first time may be put off by a 3-year-old console and prefer a newer model.
If it were a technological leap it would be a PS6; as it is, the PS5 is an underwhelming console.
While I agree that there could be some unannounced leap, it’s highly unlikely; more likely they will implement some form of DLSS-style upscaling. Besides, graphics and FPS aren’t really the main concern for PS5 owners, more so the range of games.
Honestly I would be down for a $1500+ gaming console if it could perform close to a modern gaming PC. Not sure why they can’t offer levels/tiers of the console for people who have more disposable income.
Probably thermals. Consoles are basically gaming laptops with a desktop heatsink and internal PSU (most of the time)
Xbox tried it; tbh it was nice, but cutting your market in half creates a knock-on effect in your cost of supplies. It also affects game production, with some consoles having issues and others not.
I was always a PC gamer, but now, having a nice TV in the basement, the console is just a more natural choice. I tried Steam Big Picture among other options, but it still doesn’t fit as well as a console does for me. Xbox had different tiers of consoles? Guess I didn’t research Xbox much and haven’t had one for ages.
Yeah, in the most recent gen they had the Series S and Series X, the S being a low-end new-gen option and the X being the flagship PS5 competitor in spec.
Tbh Xbox has terrible naming conventions that make it confusing.
In the 360 era they had multiple options, mostly differing in hard drive size.
I’m guessing the CPU isn’t going to get a big uplift if the PS4 Pro is anything to go by.