Why not do apples to apples?
A cutscene isn’t the best representation. This shows off the 8-bit vs 16-bit difference better.
I mean, the original image is a cutscene, so…
But hey, I’ll split the difference. Instead of SMB 1, which was a launch game and literally wasn’t running on the same hardware (because mappers), we can do Mario 3 instead.
Or, hear me out, let’s not do a remaster at all for current gen leaps. Here’s a PS4 vs PS5 sequel one.
It doesn’t work as well, though, since taking the absolutely ridiculous shift from 2D to 3D, which has happened once and only once in all of gaming history, is a bit of a cheat anyway.
Oh, I don’t care about leap comparisons; I was just interested in how graphics have evolved over time. To be honest, graphics have been going downhill in big games for a few years now thanks to lazy development chasing “good” graphics. Fucking TAA…
I agree that it’s a meme comparison anyway. I just found it pertinent to call out that remasters have been around for a long time.
I don’t know that I agree with the rest. I don’t think I’m aware of a lazy game developer; that’s a pretty rare breed. TAA isn’t a bad thing (how quickly we forget the era when FXAA’s vaseline smearing was considered valid antialiasing for 720p games), and sue me, but I do like good visuals.
I do believe we’re in a very weird quagmire of a transitional period, where we’re using what is effectively a VFX suite to make games that aren’t meant to run in real time on most of the hardware actually running them, and that are simultaneously too expensive, too large, and aimed at waaay too many hardware configs. It’s a mess out there and it’ll continue to be a mess, because the days of a 1080 Ti being a “set to Ultra and forget it” deal were officially over the moment we decided to sell people 4K monitors running at 240 Hz alongside games built for real-time raytracing.
It’s not the only time we’ve been in a weird spot between GPUs and software (hey, remember when every GPU had its own incompatible graphics API? I do), but it’s up there.
TAA is absolutely a bad thing, I’m sorry, but it’s way worse than FXAA, especially when combined with the new ML upscaling shit. It’s really only a problem in big games, or more specifically UE5 games, since temporal AA is baked into the engine.
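For anyone wondering where the ghosting complaint actually comes from: TAA blends each new frame into an accumulated history buffer. Here’s a toy 1-D sketch of that accumulation (names and the blend factor are made up for illustration; real TAA adds motion-vector reprojection and history clamping, and ghosting is what you see when those fail):

```python
# Toy 1-D temporal accumulation, illustrating why TAA can ghost.
# Assumption: a single bright "object" pixel moving one pixel per frame,
# blended with an exponential moving average and NO reprojection or
# history clamping (the failure mode, not a full TAA implementation).

WIDTH = 12
ALPHA = 0.1  # fraction of the current frame blended in each step

history = [0.0] * WIDTH

def render_frame(object_pos):
    """Current frame: the object pixel is 1.0, everything else 0.0."""
    return [1.0 if i == object_pos else 0.0 for i in range(WIDTH)]

for frame in range(6):
    current = render_frame(frame)
    # history = (1 - alpha) * history + alpha * current, per pixel
    history = [(1 - ALPHA) * h + ALPHA * c for h, c in zip(history, current)]

# Pixels the object has already left still hold residual energy: a trail.
print([round(v, 3) for v in history])
```

With a small ALPHA the image is stable but old positions decay slowly (the trail), and with a large ALPHA the trail shrinks but you lose the temporal smoothing entirely; that tradeoff is the whole argument.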
Yeah, there was that perfect moment in time when you could just max everything out, turn on some nice SMAA, and be happy with >120 fps. The 4K chase started, yeah, but the hardware we have now is ridiculously powerful and could run 4K at 120 fps natively, no problem, if the time were spent achieving that rather than throwing in more lighting effects no one asked for and then slapping DLSS on at the end to try to reach playable framerates, making the end product a blurry, ghosting mess. Ugh.
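The arithmetic behind that tradeoff is worth spelling out. A rough sketch, assuming the commonly cited 2/3-per-axis internal render scale for DLSS “Quality” mode (treat that factor as an assumption, not a spec):

```python
# Back-of-the-envelope shaded-pixel counts: native 4K vs upscaled 4K.
# Assumption: DLSS "Quality" renders internally at 2/3 scale per axis
# (the commonly cited figure), i.e. 1440p internal for 4K output.

native_w, native_h = 3840, 2160
native_pixels = native_w * native_h          # 8,294,400

quality_scale = 2 / 3                        # assumed per-axis scale
internal_w = round(native_w * quality_scale) # 2560
internal_h = round(native_h * quality_scale) # 1440
internal_pixels = internal_w * internal_h    # 3,686,400

ratio = native_pixels / internal_pixels
print(f"{internal_w}x{internal_h} -> {native_w}x{native_h}: "
      f"{ratio:.2f}x fewer shaded pixels")
```

That ~2.25x reduction in shaded pixels is exactly the budget that gets spent on the extra lighting effects, which is why skipping the effects and rendering natively isn’t a free lunch either way.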