What upscaling was supposed to be for.

all 12 comments
[-] Lawnman23@lemmy.world 2 points 6 days ago

Will this help me pwn noobs with my ATI Radeon 9800 PRO?

[-] sparky@lemmy.federate.cc 2 points 6 days ago

Gotta put that newfangled AGP port to work!

[-] 30p87@feddit.org 52 points 1 week ago

On older graphics cards, a.k.a. the 7800 XT? Man, I'd argue the 7xxx gen isn't even last gen, more like half-current gen.

[-] Brett@feddit.org 12 points 1 week ago

Ha, when reading the headline I thought GCN would get some upgrade in the Linux driver or something. But no, it's possible FSR4 compatibility with 2-year-old cards to make the image quality less bad.

[-] kugmo@sh.itjust.works 23 points 1 week ago

Incredibly sad state of graphics when people say upscaling looks better than the raw image your GPU was supposed to be displaying.

[-] the_riviera_kid@lemmy.world 14 points 1 week ago

God, I hate upscaling with a passion. Lowering your render resolution does a much better job of improving performance while maintaining quality. This was something everyone seemingly knew up until recently. More importantly, it doesn't take extra processing power just for the end result to look like crispy fried shit.
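
Whichever side of this you take, the pixel math is the same: dropping render resolution and feeding an upscaler both mean rendering far fewer pixels. A minimal Python sketch, assuming a 1440p output; the preset percentages are the commonly cited DLSS per-axis scale factors, used here purely for illustration:

```python
# Rough sketch: how many pixels actually get rendered at common
# per-axis scale factors. The preset percentages are the commonly
# cited DLSS values -- treat them as illustrative, not authoritative.

NATIVE = (2560, 1440)  # assumed 1440p output

PRESETS = {
    "native":             1.0,
    "quality (~66.7%)":   2 / 3,
    "balanced (~58%)":    0.58,
    "performance (~50%)": 0.5,
}

for name, scale in PRESETS.items():
    w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
    share = (w * h) / (NATIVE[0] * NATIVE[1])
    print(f"{name:>20}: {w}x{h} ({share:.0%} of native pixels)")
```

At "quality" that's roughly 1706x960, about 44% of native pixels, whether you get there by a render-scale slider or an upscaler's internal resolution.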

[-] 30p87@feddit.org 2 points 1 week ago

And (M)FG, i.e. (multi) frame generation, changes nothing but the marketing numbers.

[-] Beacon@fedia.io 1 points 1 week ago

IIRC some GPUs have dedicated upscaling cores, so theoretically in that scenario there should be little to no performance hit when upscaling.

[-] brucethemoose@lemmy.world 10 points 1 week ago* (last edited 1 week ago)

I mean, DLSS looks great. Can't speak to FSR, but if it's anywhere close that's incredible.

I'm speaking as a major pixel peeper. I've spent years poring over vapoursynth filters, playing with scaling algorithms, being really obsessive about video scaling, calibration, proper processing bit depths, playing games at native res even if I have to run on mostly low, omitting AA because it felt like blur, modding shaders out myself... And to me, DLSS quality or balanced (depending on the situation) looks like a free lunch.
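
As a concrete illustration of the kind of VapourSynth scaler comparison described here, a minimal sketch; the ffms2 source plugin and the "input.mkv" filename are assumptions for illustration, not anything the commenter actually posted:

```python
# VapourSynth sketch: compare two scalers on the same source clip.
# Assumes the ffms2 source plugin is installed; "input.mkv" is a stand-in.
import vapoursynth as vs

core = vs.core
clip = core.ffms2.Source("input.mkv")

# Built-in zimg resizers: same target size, different kernels.
bicubic = core.resize.Bicubic(clip, width=3840, height=2160)
lanczos = core.resize.Lanczos(clip, width=3840, height=2160)

# Interleave the two so a previewer flips between them frame by frame,
# which makes kernel differences (ringing, softness) easy to spot.
core.std.Interleave([bicubic, lanczos]).set_output()
```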

It's sharp. Edges are good. Overprocessing artifacts are minimal. It's not perfect, but infinitely better than naive (bilinear or bicubic) scaling to monitor res.
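
For context on what "naive" means here: bilinear scaling computes each output pixel as a distance-weighted average of the four nearest source pixels, with no awareness of edges. A toy pure-Python sketch (hypothetical grayscale values, illustration only):

```python
# What "naive" bilinear scaling does: each output pixel is just a
# weighted average of the 4 nearest input pixels -- no edge awareness.
def bilinear_sample(img, x, y):
    """img: 2D list of grayscale values; (x, y): fractional source coords."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# A hard black/white edge picks up in-between grays when upscaled 2x --
# exactly the softness being complained about.
edge = [[0, 0, 255, 255]] * 4
print([round(bilinear_sample(edge, x / 2, 0)) for x in range(8)])
# -> [0, 0, 0, 128, 255, 255, 255, 255]
```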

My only complaint is improper implementations that ghost, but that aside, not once have I ever switched back and forth (either at native 1440P or 4K) and decided 'eh, DLSS's scaling artifacts look bad' and switched back to native, unless the game is trivial to run. And one gets pretty decent AA as a cherry on top.

[-] inclementimmigrant@lemmy.world 4 points 1 week ago* (last edited 1 week ago)

100% agree.

It also infuriates me that when all of this upscaling was introduced back in 2018, it was touted as a way to extend the useful life of your GPU. Now in 2025, it's basically: buy a new card, and using upscaling is mandatory if you want the newest games to be playable.

Does anyone say it looks better than native? Or do they just accept that lower resolution + scaling is "good enough"?

That said, I hate it. Give me perfect integer scaling or nothing at all.
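
"Perfect integer scaling" is simple enough to pin down in code. A hedged sketch, with the helper name and return format invented for illustration:

```python
# "Perfect integer scaling": replicate each source pixel by the largest
# whole-number factor that fits the display, then letterbox the rest.
# Pixels stay perfectly sharp because no new values are interpolated.
def integer_scale(src, dst):
    """src, dst: (width, height). Returns (factor, scaled_size, borders)."""
    factor = min(dst[0] // src[0], dst[1] // src[1])
    if factor < 1:
        raise ValueError("display smaller than source")
    scaled = (src[0] * factor, src[1] * factor)
    borders = (dst[0] - scaled[0], dst[1] - scaled[1])
    return factor, scaled, borders

# 1080p content on a 4K panel: a clean 2x, no interpolation needed.
print(integer_scale((1920, 1080), (3840, 2160)))  # (2, (3840, 2160), (0, 0))
# 1440p on 4K only fits 1x -> large borders, which is why it's rarely offered.
print(integer_scale((2560, 1440), (3840, 2160)))  # (1, (2560, 1440), (1280, 720))
```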

this post was submitted on 08 May 2025
55 points (95.1% liked)
