submitted 4 months ago by mr_MADAFAKA@lemmy.ml to c/pcgaming@lemmy.ca
[-] PenisWenisGenius@lemmynsfw.com 58 points 4 months ago* (last edited 4 months ago)

I'm generally opposed to anything that involves buying new hardware. This isn't the 1980s. Computers are powerful as fuck. Stop making software that barely runs on them. If they can't make AI more efficient, then fuck it. If they can't make game graphics good without a minimum $1000 GPU that produces as much heat as a space heater, maybe we need to go back to 2000s-era 3D. There is absolutely no point in making graphics more photorealistic than, say, Skyrim. The route they're going is not sustainable.

[-] reev@sh.itjust.works 27 points 4 months ago* (last edited 4 months ago)

The point of software like DLSS is to run stuff better on computers with worse specs than what you'd normally need to run a game at that quality. There's plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity "need" or "should want" is a weird take ¯\_(ツ)_/¯

[-] warm@kbin.earth 10 points 4 months ago* (last edited 4 months ago)

The quality of games has dropped a lot: they make them fast, and as long as it can just about reach 60fps at 720p, they release it. Hardware is insane these days, yet games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011; BF3, Crysis 2, Forza, Arkham City etc. came out then too), but their performance has dropped significantly.

I don't want DLSS and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be over 120fps at this point, way over. But people accept the shit and buy the games up anyway, so nothing is going to change.

The point is, we would rather have games looking like Skyrim with great performance vs '4K RTX real time raytracing ultra AI realistic graphics wow!' at 60fps.

[-] nekusoul@lemmy.nekusoul.de 7 points 4 months ago* (last edited 4 months ago)

The quality of games has dropped a lot, they make them fast

Isn't the public opinion that games take way too long to make nowadays? They certainly don't make them fast anymore.

As for the rest, I also can't really agree. IMO, graphics have taken a huge jump in recent years, even outside of RT. Lighting, texture quality, shaders, as well as object density and variety, have all been getting a noticeable bump. Other than the occasional dud, and the awful shader compilation stutter that has plagued many PC games over the last few years (but is getting more awareness now), I'd argue that performance is pretty good for most games right now.

That's why I see techniques like DLSS/FSR/XeSS/TSR not as a crutch, but just as one of the dozen other rendering shortcuts game engines have accumulated over the years. That said, it's not often we see a new technique deliver such a big performance boost while having almost no visual impact.

Also, who decided that 'we' would rather have games looking like Skyrim? While I do like high FPS very much, I also like shiny graphics with all the bells and whistles. A game like 'The Talos Principle 2', for example, hammers the GPU quite a bit on its highest settings, but it certainly delivers in the graphics department. So much so that I've probably spent as much time admiring the highly detailed environments as I did actually solving the puzzles.

[-] warm@kbin.earth 2 points 4 months ago* (last edited 4 months ago)

Isn't the public opinion that games take way too long to make nowadays? They certainly don't make them fast anymore.

I think the problem here is that they announce them way too early, so people are waiting like 2-3 years for it. It's better if they are developed behind the scenes and 'surprise' announced a few months prior to launch.

Graphics have advanced, of course, but it's become diminishing returns, and now a lot of games resort to spamming post-processing effects and cramming in as much foliage and fog as possible to try and make things look better. I always bring up Destiny 2 in this conversation, because the game looks great, runs great, and the graphical fidelity is amazing: no blur, but no rough edges. Versus like any UE game, which tend to have terrible TAA; if you disable it, everything is jagged and aliased.

DLSS etc. are defo a crutch, and they were designed as one (originally for real-time raytracing), hence the better versions requiring new hardware. Games shouldn't be relying on them, and their trade-offs aren't worth it if you have average modern hardware, where games should just run well natively.

It's not so much that we want specifically Skyrim (maybe that one guy does); it's just an extreme example to put the point across. It's all subjective, of course; making things shiny obviously attracts people's eyes during marketing.

[-] nekusoul@lemmy.nekusoul.de 3 points 4 months ago* (last edited 4 months ago)

I see. That I can mostly agree with. I really don't like the temporal artifacts that come with TAA either, though it's not a deal-breaker for me if the game hides it well.

A few tidbits I'd like to note though:

they announce them way too early, so people are waiting like 2-3 years for it.

Agree. It's kind of insane how early some games are announced. That said, back then 2-3 years was the time it took for a game to get a sequel. Nowadays you often have to wait an entire console cycle for a sequel to come out, instead of getting a trilogy of games during one.

Games shouldn’t be relying on them and their trade-offs are not worth it

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

[-] warm@kbin.earth 2 points 4 months ago

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

Trade-offs being the artifacts; while not that noticeable to most, I did try it, and anything in fast motion does suffer. Another being the hardware requirement. I don't mind it existing, I just don't think mid-to-high-end setups should ever have to enable it for a good experience (well, what I personally consider a good experience :D).

[-] UnderpantsWeevil@lemmy.world 3 points 4 months ago

We should have stopped with Mario 64. Everything else has been an abomination.

this post was submitted on 17 Jul 2024
691 points (99.0% liked)

PC Gaming

8615 readers
705 users here now

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 1 year ago
MODERATORS