I think you could say graphics have plateaued, and there's not much reason to differentiate between console and PC anymore.
I mean, sure, they've gotten better recently, but only in extremely small increments. The latest advances have mostly been in frame generation, which isn't really about fidelity or higher resolution; it's about boosting frame rate.
Years ago, it was a GPU for playing at 640x480 vs 1024x768 vs ... all the way up to 4K, and that was the game, right up until maybe the last two generations. You chased resolution, that increased fidelity, and you found some middle ground with an acceptable frame rate. There were some toys that played into that along the way - anti-aliasing was probably the biggest one - but by and large, the playable screen resolution was the biggest difference between a budget GPU and a top-tier GPU.
Today there are pretty much just two resolutions that matter. Others exist - 1440p, ultrawides, 8K, etc. - but they're definitely in the minority. Pretty much everyone assumes it's either going to be ultra-high-frame-rate 1080p or 4K, and it's just a matter of how well we can cheese the AI upscaling/frame generation to get frame rates to something playable.
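To put some very rough numbers behind the "cheese the AI" point, here's a quick back-of-the-envelope sketch in Python. The 60/120 fps figures and the "pixels x frames" cost model are purely illustrative assumptions, not measurements from any real GPU or game:

```python
# Back-of-the-envelope pixel math: why upscaling/frame generation pays off.
# All numbers are illustrative assumptions, not benchmarks.

RESOLUTIONS = {
    "640x480": (640, 480),
    "1024x768": (1024, 768),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def pixels(res):
    w, h = res
    return w * h

baseline = pixels(RESOLUTIONS["640x480"])
for name, res in RESOLUTIONS.items():
    print(f"{name:>9}: {pixels(res):>10,} pixels "
          f"({pixels(res) / baseline:.1f}x of 640x480)")

# Crude cost model: shading work per second ~ pixels rendered * frames rendered.
native_4k_60fps = pixels(RESOLUTIONS["4K"]) * 60

# Hypothetical "cheesed" path: render 1080p internally at 60 fps, upscale to 4K,
# then generate every other frame to present 120 fps (generated frames assumed ~free).
upscaled_1080p_path = pixels(RESOLUTIONS["1080p"]) * 60

print(f"\nNative 4K @ 60 fps shades ~{native_4k_60fps / upscaled_1080p_path:.0f}x "
      "more pixels than the 1080p-internal, upscaled + frame-generated path.")
```

Under those assumptions the native 4K path shades roughly 4x the pixels, which is the whole appeal: the card doesn't have to get 4x faster, the image just has to look close enough.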
The game shifted from actual fidelity to cheating fidelity.
So yeah, I think they are right.