Are gaming consoles reaching final form? Former PlayStation boss says no more major hardware leaps

midian182

Posts: 10,265   +138
Staff member
Big quote: What will the consoles of the future be like compared to today's machines from Sony and Microsoft? Former PlayStation Studios boss Shawn Layden doesn't believe there will be significant improvements in future generations, certainly nothing like the hardware leap between the PS1 and PS2. He thinks the biggest changes will be found in the games themselves.

Speaking in an interview with Eurogamer, Layden notes how the technological jump from the original PlayStation to the PS2, still the best-selling console of all time, was a dramatic one. He cites his disbelief at seeing the Gran Turismo demo for the first time, something this writer also remembers experiencing.

Layden notes how the move from the PS2 to PS3 introduced the HD standard and 60 FPS in a lot of games, but the move from PS3 to PS4 "was just, like, getting the network thing done right."

As for the PS5, Layden calls it a fantastic bit of kit, "but the actual difference in performance [compared to the PS4]... we're getting to the realm, frankly, where only dogs can hear the difference now." Speaking about generational console hardware upgrades, he said, "we have sort of maxed out there."

Could we ever see a generational improvement in console performance matching that of the PS1 to PS2? Layden doubts it, and definitely not any time in the next few decades. "I mean, what would that leap look like? It would be perfectly-realised human actors in a game that you completely control. That could happen one day. I don't think it's going to happen in my lifetime."

Layden believes the innovation curve on hardware is starting to plateau. He also notes that Xbox and PlayStation consoles have pretty much the same chipset built by AMD, adding that each company has its own OS and "proprietary secret sauce," but in essence it's the same. "I think we're pretty much close to final spec for what a console could be."

With console hardware becoming less important, Layden says the real competition will be the content they provide. "I think we're at a point where the console becomes irrelevant in the next... if not the next generation then the next next generation definitely," Layden said.

Layden also believes Sony will continue to keep most of its games exclusive to its console rather than build versions for competing, smaller platforms such as Xbox. While Microsoft has released some of its games on PS5, such as Sea of Thieves, Pentiment, and Hi-Fi Rush, Sony is unlikely to return the favor in the future, as the small sales would not be worth the outcry.

Layden says releasing PC versions of PlayStation games years after they first arrive on the console causes plenty of anger among Sony fans, so Xbox releases from PlayStation Studios would result in even more outcry, potentially harming Sony's brand reputation. "I don't know if the juice is worth the squeeze," he said.

In October, Layden said that the games industry had stopped focusing on making fun games and instead spends all its energy on monetization.

 
The lower PS5 sales say it all.

PlayStation's biggest hardware success was the PS4, and since then the PS5 has struggled to keep up even though Xbox has put up no competition whatsoever.
 
the move from PS3 to PS4 "was just, like, getting the network thing done right."

Ehm, PS3 was a mess in terms of programmability (CELL), right?
What PS4 brought to the table, really, was x86 architecture and an easier way of creating games for the console.
 
the move from PS3 to PS4 "was just, like, getting the network thing done right."

Ehm, PS3 was a mess in terms of programmability (CELL), right?
What PS4 brought to the table, really, was x86 architecture and an easier way of creating games for the console.

Easier, yes. But from pretty much all reports out there, even when programmed very suboptimally the Cell was still a good 20-30% faster than the PS4's CPU.

Also, moving to x86 is less of an advantage than you might think; even by the PS3's day it was rare to manually drop in CPU-specific opcodes, instead going through higher-level libraries. CPU architecture is hidden by your compiler. At most you are changing some driver-specific code (which will more often than not be different even within an x86 ecosystem due to differences in the OS, no different than a Windows-Linux port).
 
The lower PS5 sales say it all.

PlayStation's biggest hardware success was the PS4, and since then the PS5 has struggled to keep up even though Xbox has put up no competition whatsoever.
Well, unless you NEED ray tracing, there is little to no reason to upgrade. Even then, ray tracing is more of a "nice to have" type thing than a necessity. And until the cost of GPUs comes down, it's going to continue to be a "nice to have" thing, with developers viewing ray-tracing-only games as something that alienates potential customers.

Then we have the issue that games just aren't fun anymore. They're filled with microtransactions and political agendas. Games used to be synonymous with escapism, but even those two words have become incompatible in recent years. Gaming has literally become a full-time job for many people, and the resulting barrier to entry has nearly destroyed gaming as a hobby.

4K gaming with ray tracing isn't going to be possible for consoles anytime soon unless they want to start putting 4090s or an equivalent in there. That right there will take away console gaming's biggest advantage: gaming on a budget. And with maybe three games that make paying for a 4090 worth it, who actually wants to pay $2,000 for a graphics card? Outside of working professionals or game streamers, the 4090 is one of the dumbest products I've ever seen, regardless of how impressive its performance is.

I think we've gotten to the point where graphics are "good enough," and now we need better games that aren't trying to sell us nonsense at every turn or force some agenda on us. I remember games like Diablo being magic; GoldenEye, Age of Empires, FF7 (and 10, and Kingdom Hearts), Fallout: New Vegas, and even Skyrim to some extent were all magical experiences.

What happened to the magic in video games? I've played games that are "okay" or even good, but I don't think I've had a magical experience in years. The last truly magical experience I had in a video game was Baldur's Gate 3, and it had nothing to do with ray tracing or graphics for me. I played that game on medium at 4K with FSR on my 6700 XT. That's actually one of the games I would spend 4090 money on. Since I'm a Linux guy these days, the best I'm gonna be able to do is the 8800 XT, but as soon as the 8800 XT is released I plan on replaying BG3 and 2077.

But to narrow in on my point: there haven't really been any "magical" experiences in recent years, whereas if you go back 10-15 years, nearly everything was some magical experience. And, oddly enough, when I find a 'new to me' game, the ones that I end up binging for 10-15 hours without realizing it are usually from 2010-2017. We need the marketing and HR departments to get out of games. That's why Space Marine 2 has been such a blast. That whole game is an HR nightmare. I wouldn't call it a "magical experience," but the last time I both laughed and ran in terror that much was on my first New Vegas playthrough.
 
Well, unless you NEED ray tracing, there is little to no reason to upgrade. [...]
There are still good experiences to be had, but they are more likely to appear in indie games. Larger studios are now run by business-focused people instead of game-focused ones. They want consistently timed releases and they want to hit certain sales targets. This means they are less likely to take chances on new ideas, as the games could flop. They are going to stick with their existing IPs and incorporate proven mechanics from other games.

Indie games still have a lot of that magic. They are usually made on the premise of making games the developers themselves want to play, which are fun. They are willing to experiment more, and the sheer number of indie developers means someone is going to come up with a new idea that makes a game enjoyable.
 
OK, then I hope console generational cycles get longer again. The PS5 and Series X should stick around for 7+ years.

Start selling the consoles closer to cost after they've recouped their R&D and turned a profit.

Force devs to have to optimize their engines and games in order to keep bringing the visual improvements rather than just throwing more hardware at marginal visual gains.
 
"In October, Layden said that the games industry had stopped focusing on making fun games and instead spends all its energy on monetization."

That's the problem in a nutshell. The PS5 could have an RTX 4090 in it; it won't matter if there are no good games to play. The OG Star Wars: Battlefront II is way more fun than the newest CoD, for example, and runs on potatoes.
Of course, longer hardware cycles are not a problem either. If all of it slows down, that's great! Let's take the time to actually optimize for what we have. Games like DOOM show what can be done even on slow hardware. That's better for the environment, better for our wallets, and better for the community, since you won't be splitting it up between different consoles.
Easier, yes. But from pretty much all reports out there, even when programmed very suboptimally the Cell was still a good 20-30% faster than the PS4's CPU.

Also, moving to x86 is less of an advantage than you might think; even by the PS3's day it was rare to manually drop in CPU-specific opcodes, instead going through higher-level libraries. CPU architecture is hidden by your compiler. At most you are changing some driver-specific code (which will more often than not be different even within an x86 ecosystem due to differences in the OS, no different than a Windows-Linux port).
True, but because x86 is so widespread, compilers can produce more optimized code, and there are more well-documented tricks. Never once in the PS3's lifespan was anyone able to get the full capability out of the thing. IIRC Naughty Dog claimed they got about 78% with TLOU 1, and that was pushing it.

The CPU cores in the PS4 were weaker but far easier to program for, so actual real-world performance still favored the PS4.
 
The CPU cores in the PS4 were weaker but far easier to program for, so actual real-world performance still favored the PS4.
As someone who did some outsourced work for both: details matter here.

Remember, the on-paper maximum performance of the Cell was bonkers; it was *over* twice as fast as the main CPU in the PS4. Even at 60-70% efficiency (which was fairly standard), the Cell ended up faster than whatever the PS4 could do. The real problem with the PS3 wasn't technically the Cell; it was stalling other parts of the pipeline because of how you had to program the thing.
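For reference, the arithmetic behind that claim sketches out like this (the peak figures are commonly cited numbers I'm assuming here, not anything stated in this thread):

```python
# Back-of-envelope check of the "over twice as fast on paper" claim.
# Assumed figures (commonly cited, not from the thread): Cell's SPEs
# peak around 230 GFLOPS FP32 at 3.2 GHz; the PS4's eight Jaguar
# cores at 1.6 GHz do 8 FP32 FLOPs per cycle each.
cell_peak_gflops = 230.4
ps4_cpu_gflops = 8 * 1.6 * 8          # cores x GHz x FLOPs/cycle = 102.4

print(f"on-paper ratio: {cell_peak_gflops / ps4_cpu_gflops:.2f}x")
# → on-paper ratio: 2.25x

# Even at the 60-70% efficiency mentioned above, Cell stays ahead:
for eff in (0.6, 0.7):
    print(f"{eff:.0%} efficiency: {cell_peak_gflops * eff:.0f} GFLOPS "
          f"vs {ps4_cpu_gflops:.1f} GFLOPS for the PS4 CPU")
```

Under those assumptions the "over twice as fast" line checks out, and Cell keeps a lead even after the efficiency haircut.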

As for compiled x86 code: the advantage from the compiler is less than you think. PowerPC was *very* well understood and optimized by the 2000s, and honestly some of the code put out by early x86-64 compilers that I saw during the early PS4 era was... questionable at best.

CPU architecture really isn't a major limitation now, and it wasn't a major factor even a decade or two ago. My opinion is the driver interface matters *far* more than the underlying architecture does, as the latter doesn't really affect program design to any significant degree.
 
Easier, yes. But from pretty much all reports out there, even when programmed very suboptimally the Cell was still a good 20-30% faster than the PS4's CPU.
Where did you get this idea from? This is complete nonsense.

The only thing Cell was faster than the PS4 CPU on was 32-bit floating-point SIMD operations, because it had accelerators for that (the SPEs in Cell). But by the time the PS4 came out, Cell was already dead, killed by the rise of GPGPU and compute shaders. Cell on the PS3 was capable of about 180 GFLOPS in 32-bit FP SIMD. In comparison, the PS4 GPU was capable of 1840 GFLOPS in 32-bit FP SIMD.
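Taking the two figures quoted above at face value, the gap works out to roughly an order of magnitude:

```python
# Ratio implied by the FP32 SIMD figures quoted in this comment.
cell_fp32_gflops = 180    # PS3 Cell (SPEs), as quoted
ps4_gpu_gflops = 1840     # PS4 GPU, as quoted
print(f"PS4 GPU vs Cell FP32: {ps4_gpu_gflops / cell_fp32_gflops:.1f}x")
# → PS4 GPU vs Cell FP32: 10.2x
```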

For general-purpose computing, on the other hand, Cell had only a single core (its PPE) that was roughly equivalent in performance to a Pentium 4 CPU. Each individual Jaguar core in the PS4 was also in that same ballpark of performance; the PS4 just had eight of them.

There was no aspect in which the PS3 was faster than the PS4; the PS4 completely outperforms it across the board. The only thing is that time moved on: Cell was killed by GPGPU, and accelerating SIMD operations moved from specialized CPU units to GPU compute.
 
Easier, yes. But from pretty much all reports out there, even when programmed very suboptimally the Cell was still a good 20-30% faster than the PS4's CPU.

Also, moving to x86 is less of an advantage than you might think; even by the PS3's day it was rare to manually drop in CPU-specific opcodes, instead going through higher-level libraries. CPU architecture is hidden by your compiler. At most you are changing some driver-specific code (which will more often than not be different even within an x86 ecosystem due to differences in the OS, no different than a Windows-Linux port).

That's a flat-out lie.

The CPU in the PS4 is leagues faster than the older PowerPC chip in the Cell. It is not even close; this is why even the Xbox One was able to do call translation from the 360's tri-core PowerPC chip to the XB1's x86 chip, with performance gains at times. The PS3 and 360 used the same PowerPC core; the 360 just had more of them, while the Cell had the SPEs.

The Cell's SPEs are pointless in a modern setting; everything they were good at, a modern GPU will handle leagues faster. They are compute cores, and really not too far off from what you'd find in a GPU. Sony intended to use the SPEs for graphics, but their ability to serve as a primary graphics card did not pan out. The Nvidia GPU was added later in the PS3's development.
 
Well, unless you NEED ray tracing, there is little to no reason to upgrade. [...]
Agree, but many would say that it's all fake news.
 
I have a better question: where are the games at?

The current games are boring. The entire "variety" of the Xbox 360/PS3/Wii era is over - Guitar Hero, Rock Band, flight simulation... more than just shooting at people.
I had WAY more fun on the Xbox 360 than the Xbox One. The Xbox Series X is a joke to me.

Everyone's waiting on pins and needles for GTA6. Other than that: what else is there?

I'm finding myself completely bored with these new games.

 
I have a better question: where are the games at? [...]
I have been posting the same thing time and again. We get maybe two or three exciting games every year, and only one of them is a topper. Every publisher wants to put out a Fortnite variant.

I have yet to upgrade from my PS4 Pro. Games look 80% as good, and I don't have a reason to upgrade yet. For the same reason, I haven't upgraded my PC in a while either. GTA6 will convince me to get something later next year, but at this point my backlog of older games seems more appealing than the games of the last 2-3 years.
 
This is because, as Moore's law has slowed, the performance of computer hardware in general has plateaued, especially the hardware the vast majority of games are capable of running on. The hardware consoles use is basically lower-end x86 gaming PC hardware that must be priced relatively low to remain an attractive platform. Consoles will always be beaten by PCs as long as people choose to spend more money on their PC than they would on a console.
 
Soooooo

Old timer here. My first games were on a Spectrum 48K, then an Amiga. My first console was a SNES, then a Mega Drive, PS1, Xbox, Xbox 360, PS3, PS4, Xbox One/One X/Series X, recently a Switch, and most recently a PS5 I'm really enjoying. All the while with a PC, from a Pentium 200 onwards. Had a Voodoo 1 card in the first one; always had a 3D card for gaming.

Monetisation is bad, but you can see why people do it. Games generally don't ship on plastic cartridges anymore, but Street Fighter II on the SNES cost £69.99 MSRP in the 90s, and digital games don't cost near that a few months after release. The level of realism people want costs too much to build and then sell to a niche audience at £30 a copy, and people buy a lot of games on sale. I don't know how that gets better. Elden Ring is amazing with a single DLC, but that level of quality and success is rare. CoD sells millions. EAFC. Huge money generators.

I’m currently playing Ghost of Tsushima on the PS5. It looks amazing and I’ve got access to it through the premium sub. That cost £80 on Black Friday. Less than £10 a month gives me access to loads of games - I bought a 2TB nvme and installed 88 of them. That’s what publishers and developers are up against. A risky unknown un-monitised title looks even less appealing to the accountants.

I don't agree with him about the hardware not moving now, though. Maybe a slower cycle, with 4-5 years between devices, and maybe the next ones have a longer wait after. But the PS5 and Xbox could both benefit from better CPUs (3D cache could be good). I would like to see 4080 performance on a console in five years' time. It'll still be behind the 70-series or whatever Nvidia has at that time, but it'll play today's games with frame generation and better upscaling, and that will be just lovely, thank you.



 
And he's right! Wow, reading the whole article, I've never seen even a former executive be so free of bullsh*t and call it the way it is, several times! Admirable!!! 👏🏻
 
I have a better question: where are the games at? [...]
I have a Series X and a PS5, and both of them are by far the poorest consoles in terms of exclusive games. Even hardware-wise they are the worst, with their SSDs tied to the chipset (which by their nature will die out at some point).
One of the greatest eras was indeed the PS360 era.
 
I've lamented here before about not seeing a reason to upgrade to the PS5 (far less Pro) so I won't rehash it. I'll add the only console I'm considering right now for next generation is the Switch 2... and that's only because of the age of my kids... the Switch being the console we can enjoy together as a family.
 
This guy obviously hasn't jumped on the AI bandwagon. AI. Will. Change. Everything. But he's right, cause the PS6 will definitely have some AI NPU feature designed by AMD.

Photorealistic video games will be on the PS6. And that is the topping out he's referring to.
 
I don't agree with him about the hardware not moving now though. [...]

The PS6 is looking at a 2027/2028 release. By then it'll be more like 6080-level cards, with AI features out the yin-yang.

And he's right. Hardware upgrades will pretty much plateau because video games will all rely on AI software, which requires fast networks, and Wi-Fi 7 should be standard by then. Wi-Fi 7 has 100x lower latency than Wi-Fi 6 and supports 46 Gbps, so the rendering that games do over the network won't require hardware upgrades anymore; a 6080-level graphics card will be plenty. I'm still shocked how well Nvidia's GeForce Now works on my laptop with integrated graphics. I play any game with GeForce 4080 Super-level detail on my Wi-Fi 6 network. I'm sure Sony will request something similar from AMD for their own gaming service.
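For scale, here's a quick sanity check of the bandwidth side of that argument (the ~75 Mbps stream bitrate is my assumption, roughly GeForce Now's top tier; the 46 Gbps figure is the theoretical Wi-Fi 7 peak cited above):

```python
# Rough headroom check for cloud game streaming over Wi-Fi 7.
# Assumed figures: ~75 Mbps for a 4K60 game stream (roughly GeForce
# Now's top-tier bitrate) and the 46 Gbps theoretical Wi-Fi 7 peak.
stream_mbps = 75
wifi7_peak_mbps = 46_000
print(f"4K60 streams that fit in the peak: {wifi7_peak_mbps // stream_mbps}")
# → 4K60 streams that fit in the peak: 613
# Bandwidth is rarely the bottleneck, though; end-to-end latency to the
# datacenter matters more, and the local Wi-Fi link is only one hop of it.
```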
 