Why Video Game Graphics Seem Like They Stopped Improving

Ten years ago, The Witcher 3, Arkham Knight, and Metal Gear Solid V set a new bar for visual fidelity. Fast forward to today, and while games like Cyberpunk 2077 and Horizon Forbidden West are undeniably more detailed, the leap doesn't feel as monumental as, say, PS2 to PS3. So why does it seem like graphics have stalled?

1. Diminishing Returns

The biggest factor is simple: we hit a wall. Early 3D games improved drastically year after year because low-poly models and flat textures left so much room to grow. But once characters stopped looking like blocky mannequins, further refinements (higher-res textures, slightly smoother animations) became subtler. A 50,000-polygon model doesn't look twice as good as a 25,000-polygon one; at some point, the human eye just stops noticing (there's a back-of-envelope sketch of this at the end of the piece).

2. The Cost of Realism

Modern AAA games chase photorealism, but that's an exponentially harder target.

- Realistic skin requires subsurface scattering.
- Believable lighting demands ray tracing.
- Dynamic foliage needs physics simulations.

Each step forward demands far more processing power for smaller visible gains. Red Dead Redemption 2 still looks incredible, but it took roughly eight years and an estimated half-billion dollars.

3. Art Direction vs. Raw Power

Some of the best-looking games today, such as Elden Ring and Hades, aren't pushing technical boundaries. They succeed through strong art direction, proving that style ages better than raw fidelity. Meanwhile, many "realistic" games from that 2015 era (The Order: 1886, Driveclub) still hold up because they prioritized cohesive aesthetics over pure horsepower.

4. The Optimization Crisis

Hardware has advanced, but so has developer reliance on brute force. Some games now ship with broken performance, relying on upscalers like DLSS and FSR to hit playable framerates.

5. The Future: More Than Pixels

Graphics have improved... just not in ways that always wow us. Real-time global illumination, path tracing, and AI upscaling are impressive tech, but they don't make a game feel next-gen the way Crysis once did. Perhaps the next big leap won't be in visuals at all, but in AI-driven interactivity, physics, or dynamic worlds. Or maybe we've just reached the point where better graphics don't mean better games. And if that's the case, maybe that's okay.
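Here's the promised back-of-envelope sketch of the polygon point from section 1. It's an illustrative toy model, not a rendering-engine measurement: if N triangles tile a fixed surface, average facet size shrinks only with the square root of N, so doubling the polygon budget buys far less than double the visible detail.

```python
import math

# Illustrative only: if N triangles tile a fixed surface area, the average
# triangle edge length scales roughly as 1/sqrt(N), so visible detail grows
# much more slowly than the polygon budget.
def facet_scale(n_polys: int, baseline: int = 25_000) -> float:
    """Relative facet size versus a baseline polycount (1.0 = baseline)."""
    return math.sqrt(baseline / n_polys)

print(facet_scale(50_000))   # ~0.71 -> 2x the polygons, only ~29% finer facets
print(facet_scale(100_000))  # ~0.50 -> 4x the polygons just to halve facet size
```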
It takes time to optimize next-gen toolchains; it boils down to where we are in the hardware lifecycle versus the AAA development cycle (plus external market forces), as incumbent studios prolong revenue lifecycles via DLC/season passes, cross-platform releases, and cloud-GaaS/micro-subscriptions, while freemium/F2P tiers gain mindshare against AAA business models.

In retrospect, I still recall 100% CPU-optimized code hitting 30 FPS natively on DOS. Hat tip to the gamedevs who maxed out legacy hardware via innovative software-rendering techniques, not long before first-gen GPUs began to surface at local computer shows.

There are virtually no processing/bandwidth constraints now, yet we're still stuck in the uncanny valley. Latency remains critical for persistence and interaction design (real-time IO), especially given cloud GaaS's untapped significance for next-gen gamedev (and/or ML over graphics): potentially unlimited memory caps for high-poly assets and scenes with ultra high-res textures (8K@120/HDR with ~5s load times and ~15ms latency today), plus MMO gaming and real-time DCC/UGC verticals.

Compounding economies of scale via cloud streaming/GaaS and next-gen hardware abstractions such as thin-client apps could also offset the traditionally lossy economics of consoles, e.g. an XBL app or SoC dongle optimized to run XBE/x86, and/or an integrated xCloud hypervisor on a smart TV against an Azure backend.
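To put that latency point in perspective, here's a quick frame-budget calculation using the figures cited above (~15 ms round-trip, 120 Hz target); it's illustrative arithmetic, not a measurement of any particular streaming service.

```python
# Frame-budget arithmetic for cloud streaming (figures from the comment above):
refresh_hz = 120
frame_budget_ms = 1000 / refresh_hz       # ~8.33 ms per frame at 120 Hz
network_rtt_ms = 15                       # cited round-trip latency

# Input must cross the network before the server can even start the frame,
# so the wire alone eats roughly two frames of budget.
print(f"{network_rtt_ms / frame_budget_ms:.1f} frames of latency on the wire")
```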
The last true monumental leap in visuals was SD to HD. After that it's just refinement, and graphical style always wins over pushing pixels, as Nintendo proves time and time again. If you've got a game at 1080p 60Hz then everything is fine. Beyond that very, very, very few people care, and with the younger generation, whether a game ray traces every object in every frame is utterly irrelevant. There will likely always be a market for "wow" factor with the hardcore; games like Black Myth: Wukong got a lot of attention for its early UE5 demos. But it does make you question what a PS6 can actually achieve that makes people want to upgrade. 4K or 8K isn't the answer; most people basically can't see it. Hardware innovation needs to be in form factor and the marriage of hardware and software. If a PS6 were a PS5 in Switch form with the haptic qualities of the DualSense, I think people would be very happy.
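For what it's worth, the "most people can't see it" claim checks out with simple viewing-geometry arithmetic. The screen size and distance below are assumptions for a typical living room; ~60 pixels per degree is the usual 20/20 acuity figure.

```python
import math

# Angular resolution of a display at a given viewing distance (illustrative).
def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    degrees = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / degrees

# Assumed setup: 65" 16:9 TV (~1.43 m wide) viewed from 2.5 m.
print(pixels_per_degree(3840, 1.43, 2.5))  # ~120 ppd at 4K -- already past ~60 ppd acuity
print(pixels_per_degree(7680, 1.43, 2.5))  # ~240 ppd at 8K -- detail the eye can't resolve
```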
Design matters. Everything else above a certain point (1080p 60, as Richard Browne says) is mostly just specs. Dishonored 1 & 2, Hades, World of Warcraft, every modern Mario: when people ignore the realism trap and start with style, the game can look good forever.
It's quite a simple explanation really. The quality-trajectory curve starts to level off once you reach a certain point, and the time and cost to continue that upward path towards realism get harder and harder to justify. The last 20% of photorealism in visual effects can account for the majority of the cost. What you're seeing are those diminishing returns, plus a lack of will or budget to develop game engines that drive competition and innovation.
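One way to picture that leveling-off curve is a saturating model (a toy model assumed purely for illustration, not an industry cost formula): if perceived realism follows q(c) = 1 - exp(-c/k) for cumulative spend c, the last stretch toward photorealism dominates the budget.

```python
import math

# Toy saturating-quality model (an assumption for illustration only):
# perceived realism q(c) = 1 - exp(-c/k) for cumulative spend c.
def cost_for_quality(q: float, k: float = 1.0) -> float:
    """Spend required to reach quality level q under the toy model."""
    return -k * math.log(1 - q)

first_80 = cost_for_quality(0.80)
last_19 = cost_for_quality(0.99) - first_80
print(first_80)  # ~1.61 units of spend for the first 80% of quality
print(last_19)   # ~3.00 more units for the next 19% -- nearly twice as much
```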
I think I agree, BUT stylized games are harder to pull off, because everything has to be designed… same for distant-future sci-fi games. I think the only way to stand out in the field these days is to have a signature aesthetic style and to deliver characters and a story that are memorable and that facilitate emotionally engaging gameplay moments for the player.
Better graphics never meant better games; the two aren't related in that sense. If your game is purely about aesthetics, then you wouldn't care about realism anyway. Diminishing returns is basically the only factor worth bringing up: it has been shouted from the rooftops for decades that chasing realistic graphics will eventually hit diminishing returns so badly that most improvements past that point are borderline pointless to chase.
And yet, I don't mind firing up an emulator from time to time and replaying SNES classics... Some things are simply not negotiable.
I do think we've hit a point in graphical fidelity where we don't need to keep pushing the level of realism for quite some time. Let's invest in optimization techniques rather than brute-force our way to further and further diminishing returns. "Perfect" is the enemy of "good enough", after all.
Meanwhile indie games with cartoony graphics do just fine.
Skill levels are dropping quickly across game dev, music, film, and science. All human fields are seeing a dramatic drop in skill and talent. So it's more likely something systemic, not specific to video games: something in our food, or the air we breathe, likely a govt-engineered toxin that's designed to make us stupid and weak.