It's interesting how much technology has slowed down. Back in the '80s and '90s, a 5-year-old game looked horribly outdated. Now we're getting to the point where some 20-year-old games still look pretty decent.
Technology has slowed down, but there are also diminishing returns on what you can do with a game's graphics, etc.
Think of audio sampling: if I have a bit depth of 1 and I upgrade that to 16, it's going to sound like a hell of a lot more of an improvement than if I were to upgrade from 48 to 64.
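To put rough numbers on that analogy (my own back-of-the-envelope sketch, using the standard ~6 dB-per-bit rule of thumb for linear PCM, not anything from the original comment):

```python
# Rough numbers for the bit-depth analogy (hypothetical, using the
# standard ~6.02 dB-per-bit rule of thumb for linear PCM dynamic range).
for bits in (1, 16, 48, 64):
    levels = 2 ** bits          # distinct amplitude steps available
    dyn_db = 6.02 * bits        # approximate dynamic range in decibels
    print(f"{bits:>2}-bit: {levels:,} levels, ~{dyn_db:.0f} dB of dynamic range")
```

Going from 1 to 16 bits adds roughly 90 dB of usable range, the difference between a buzzer and actual music; going from 48 to 64 adds precision far beyond what human hearing can resolve, so it's effectively inaudible.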
I think something worth noting about older games, too, is that they didn't try to deal with many of their limitations head-on. In fact, many actually took advantage of their limitations to give the feeling of doing more than they actually were. For example, pixel-perfect versus CRT: many 8-bit and 16-bit games were designed specifically for televisions and monitors that would create the effect of more complexity than the hardware was actually capable of. There were other tricks too, like clever level layouts that limited draw distance, or even turned that limit into a functional aspect of the game.
The technical limitations seem largely resolved by current technology, where previously things were made to look and feel better than the hardware allowed through clever planning and engineering.
Oh, absolutely this. I think the YouTube channel GameHut is a great example of the lengths devs went to in order to get things working. In Ratchet & Clank 3, Insomniac borrowed memory from the PS2's second controller port to use for other things during single-player (PS2 devs did so much crazy shit that within the PCSX2 project, we often joke about how they "huffed glue"). The channel Retro Game Mechanics Explained and the book "Racing the Beam" have great explanations of the lengths Atari devs had to go to just to do anything interesting with the system. Even into the seventh generation of consoles, the Hedgehog Engine baked precomputed lighting into textures to trick your brain.
Heeeyyy buddy, wassup, didn't expect to find you around here! And yeah, Ratchet also has some ass-backward stuff with the way it tries to force 60 FPS all the time, which ironically made it run worse in PCSX2 for the longest time, till more accurate timings for the EE were found.
Oh shit, hey Beard. I didn't expect to see you here either. For that matter I didn't think anyone else surrounding the project used Lemmy. Cool to know I'm not alone.
Hell yeah! I think Kam might be around here somewhere, but not a hundred percent on that. Ofc, Ratchet is a good example. But we all know the real insanity is Marvel Nemesis xD
I assume this was supposed to say “more noticeable,” not “less”:
Ah, yep. lmao
Very possibly generative AI will alleviate this, although it has yet to produce convincing 3D models or animations.
We haven't slowed down. We simply aren't noticing the degrees of progress, because they're increasingly below our scale of discernment. Going from 8-bit to 64-bit is more visually arresting than going from 1024-bit to 4096-bit. Moving the rendered horizon back another inch is less noticeable each time you do it, while requiring roughly r² more processing power to handle all those extra assets.
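As a toy illustration of that r² point (my own numbers, assuming the visible ground in front of the camera grows with the square of the draw distance):

```python
# Toy sketch: push the draw distance out in equal steps and watch the total
# area to render (a rough proxy for cost) grow quadratically, while each
# step reveals a shrinking share of the overall scene.
import math

prev_area = 0.0
for r in (100, 200, 300, 400, 500):      # hypothetical draw distances in metres
    area = math.pi * r * r / 2           # half-disc of terrain in front of the camera
    gained = area - prev_area            # ground newly revealed by this step
    print(f"r={r:>3} m: render ~{area:>9,.0f} m^2, newly visible {gained / area:.0%} of the scene")
    prev_area = area
```

Each 100 m step costs more in absolute terms, but the newly revealed ground drops from 100% of the scene to roughly a third, and it's all far enough away to cover only a handful of pixels.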
The classic games look good because the art is polished and the direction is skilled. Go back and watch the original Star Wars movie and it's going to be more visually striking than the latest Zack Snyder film. Not because movie graphics haven't improved in 40 years, but because Lucas was very good at his job while Snyder isn't.
But then compare Avatar: The Way of Water to Tron. Huge improvements, in large part because Tron was trying to get outside the bounds of what was technically possible long before it was practical, while Avatar is taking computer generated graphics to their limit at a much later stage in their development.
Yeah, it's like F1 racing: you hit 99% of your minimum lap time, but then it takes a million dollars of R&D for each second of reduction after that.
Popularised by The Law of Diminishing Returns.
Same with movies. LOTR is almost 25 years old and still looks great.
Jurassic Park released in 1993. 31 years ago...
Alien in 1979..
True. Was playing Arkham Knight the other day and thought this nine-year-old game looked better than at least half of current-gen games.
Getting ready for Shadows, eh? At least that's the reason I replayed AK the other day. And Origins. And Asylum. And am in the middle of City.
Last time I was amazed with graphical progress was with Unreal in 1998. And probably just because I hadn't played Quake 2.
From then on until now it's just been a steady and normal increase in expected quality.
Doom 3 might have come close (and damn, that leaked Alpha was impressive) but by the time it was released it looked just slightly better than everything else.
Obligatory:
Hmm, I think GTA 3, as an engine / open-world environment, was like a whoa moment for me. Then Modern Warfare, of course. Recently, God of War, and Assassin's Creed Odyssey's rendition of Ancient Greece is quite spectacular.
Remember that one DNF trailer? It looked mindblowingly good back then.
Damn, should have scrolled farther before looking this up myself.