391 points · submitted 10 months ago* (last edited 10 months ago) by Otherwise_Direction7 to c/196@lemmy.blahaj.zone
[-] funnystuff97@lemmy.world 10 points 10 months ago

In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are somewhat well optimized these days*.

*At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.

[-] Otherwise_Direction7 3 points 10 months ago

Wait, you mean the game’s gameplay looks better than the actual cutscenes?

But how? Does the game use FMV for the cutscenes or something?

[-] funnystuff97@lemmy.world 4 points 10 months ago

The cutscenes were rendered using certain graphics settings that you could exceed by maxing out your own. Plus, because they were pre-rendered video, there must have been some compression involved; you could just tell when you were in a cutscene: it was grainier and there was a smidge of artifacting. Don't quote me on this, but I believe the cutscenes were rendered at something like 1080p, so if you were playing at 4K it would be a very noticeable downgrade. (Note that I did not and still do not have a 4K monitor.)
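
For what it's worth, here's the rough pixel math, assuming the videos really were authored at 1080p (which, again, is just my recollection, not a confirmed spec):

```python
# Back-of-the-envelope comparison of a 1080p pre-rendered video vs. 4K gameplay.
# The 1080p figure for the cutscenes is an assumption, not a confirmed spec.

cutscene_res = (1920, 1080)   # assumed resolution of the pre-rendered cutscene video
gameplay_res = (3840, 2160)   # 4K gameplay resolution

cutscene_pixels = cutscene_res[0] * cutscene_res[1]   # 2,073,600 pixels per frame
gameplay_pixels = gameplay_res[0] * gameplay_res[1]   # 8,294,400 pixels per frame

# Each video pixel has to cover roughly 4 display pixels when upscaled,
# and that's before video compression adds its own artifacts.
print(gameplay_pixels / cutscene_pixels)  # 4.0
```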

Although, thinking about it again, I do vividly remember some in-engine cutscenes in Arkham Knight. I'll have to replay that game sometime to jog my memory.

[-] nevetsg@aussie.zone 3 points 10 months ago

I am playing through Rise of the Tomb Raider in 4K and having a similar experience. I think the cutscenes are in 1080p.

[-] misterundercoat@lemmy.world 2 points 10 months ago

On PS5, Hogwarts Legacy runs at 60 fps, but the cutscenes are 30 fps.
