
Comment by jayd16

4 days ago

You're comparing the art(!) of two different games that targeted two different sets of hardware, while using the ideal hardware for one and not the other. Kind of a terrible example.

> You're comparing the art(!)

The art direction, modelling and animation work is mostly fine; the worse look comes from the lack of dynamic lighting and ambient occlusion in the Oblivion Remaster when Lumen (UE5's realtime global illumination feature) is switched to the lowest setting. That results in completely flat lighting for the vegetation, but it's needed to get an acceptable base frame rate (it doesn't solve the random stuttering though).

Basically, the best art will always look bad without good lighting (and even baked or faked ambient lighting like in Skyrim looks better than no ambient lighting at all).
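For context, Lumen in UE5 is controlled by a handful of console variables, so the 'lowest setting' typically boils down to an Engine.ini override along these lines (illustrative only; which variables a given game actually honors varies per title):

    [SystemSettings]
    ; illustrative override, per-game support varies
    ; 0 = no dynamic GI at all, 1 = Lumen, 2 = screen-space GI
    r.DynamicGlobalIlluminationMethod=0
    ; 0 = none, 1 = Lumen reflections, 2 = screen-space reflections
    r.ReflectionMethod=2

With the GI method at 0 there's no indirect bounce light left at all, which is exactly the flat vegetation look described above.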

Digital Foundry has an excellent video about the issues:

https://www.youtube.com/watch?v=p0rCA1vpgSw

TL;DR: the 'ideal hardware' for the Oblivion Remaster doesn't exist, even if you get the best gaming rig money can buy.

  • > …when Lumen (UE5's realtime global illumination feature) is switched to the lowest setting. That results in completely flat lighting for the vegetation, but it's needed to get an acceptable base frame rate (it doesn't solve the random stuttering though).

    This also happens in many other UE5 games like S.T.A.L.K.E.R. 2, where they try to push the graphics envelope with expensive techniques and most people without expensive hardware have to turn the settings way down (or lean on upscaling and framegen, which makes the experience a bit worse again, at least when the starting point is very bad and you have to use them as a crutch), often making these modern games look worse than something a decade old.
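    To be concrete, the usual first aid is dropping the scalability groups and the render scale, e.g. in GameUserSettings.ini. A sketch with the stock UE5 scalability cvars (the values are made up for illustration, and whether a game respects manual overrides varies):

        [ScalabilityGroups]
        ; illustrative values, not from any specific game
        ; render at 60% resolution and upscale (the 'crutch' mentioned above)
        sg.ResolutionQuality=60
        ; scalability groups: 0 = Low ... 3 = Epic
        sg.GlobalIlluminationQuality=0
        sg.ShadowQuality=1
        sg.PostProcessQuality=1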

    Whatever UE5 is doing (or rather, how so many developers choose to use it) is a mistake now and might be less of one in 5-10 years when hardware advances further and becomes more accessible. Right now it feels like a ploy by Big GPU to force people to upgrade to overpriced hardware if they want to enjoy any of these games; or rather, silliness aside, it's an attempt by studios to save resources by having artists spend less time faking and optimizing effects and detail that the engine can just brute-force.

    In contrast, most big CryEngine and idTech games run great even on mid-range hardware and still look great.

    • It's like (usable) realtime global illumination is the fusion power of rendering, always just 10 years away ;)

      I remember that UE4 also hyped a realtime GI solution (Light Propagation Volumes, IIRC) which was then hardly used in real-world games because the performance hit was too big.

  • I haven't really played it myself, but from the video you posted it sounds like the remaster's a bit of an outlier in terms of bad performance. Again, it seems like a bad example to pull from.