I truly don't know how to explain this to anyone who wasn't around then.
It is my opinion that we reached peak graphics 6 or 7 years ago, when the GTX 1080 was king. Why?
Graphics today seem ass-backward to me: render at 60-70% scale to get good framerates, FX are often rendered at even lower resolution, slap on overly blurry TAA to hide the jaggies, then use some upsample trickery to get to native resolution. And it's still blurry, so squirt some sharpening and noise on top to create an illusion of detail. And it still runs like crap, so throw in frame interpolation to get the illusion of a higher frame rate (some rough math below).
I think it's high time we were able to run non-raytraced graphics at 4K native and raytraced graphics at 2.5K native on 500€ MSRP GPUs, with no trickery involved.
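To put rough numbers on the scaling I'm complaining about, here's a quick back-of-the-envelope sketch. The 67% render scale, half-res FX and 2x frame generation are just illustrative assumptions, not figures from any particular game:

```python
# Back-of-the-envelope pixel math for the pipeline described above.
# The 67% render scale, half-res FX and 2x frame generation are
# illustrative assumptions, not measurements from any specific game.

NATIVE_W, NATIVE_H = 3840, 2160   # 4K output
RENDER_SCALE = 0.67               # "render at 60-70% scale"

native_px = NATIVE_W * NATIVE_H
internal_px = int(NATIVE_W * RENDER_SCALE) * int(NATIVE_H * RENDER_SCALE)
fx_px = internal_px // 4          # FX at half the internal res = quarter the pixels

print(f"native frame:   {native_px:>10,} px")
print(f"internal frame: {internal_px:>10,} px ({internal_px / native_px:.0%} of native)")
print(f"half-res FX:    {fx_px:>10,} px ({fx_px / native_px:.0%} of native)")

# With 2x frame interpolation, only every other displayed frame is rendered:
displayed_fps = 120
rendered_fps = displayed_fps // 2
print(f"with 2x frame generation: {displayed_fps} fps shown, {rendered_fps} fps actually rendered")
```

At 67% scale you're only shading about 45% of a 4K frame's pixels before any of the upscaling even starts.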
We peaked when we had Full HD. After all, what could top full high definition... fuller high definition? That would just be silly.
GPUs are getting better, but demand from the crypto and ML/AI markets means they can jack up the price of every new card higher than the last, so prices have stopped dropping with each new generation.
Intel is saving us with their GPU prices, too bad they haven't made good drivers YET.
If you truly believe what you wrote, then you should never look into the details of how a game world is rendered. It's fakery stacked upon fakery that somehow looks great. If anything, the current move to ray tracing with upscaling is less fakery than what came before.
There's a saying in computer graphics: if it looks right, it is right. Meaning you shouldn't worry that a technique makes a mockery of how light actually works, as long as the viewer won't notice.
That's the point
Sure, all graphics is about creating an illusion.
But there's a stark difference between optimizations like culling, occlusion planes, LODs, and half-res rendering of costly FX (like AO), and using a crutch like lowering the rendering resolution of the whole frame to make up for bad optimization or crap hardware. DLSS has its place on 150-200€ entry-level GPUs trying to drive a 2.5K monitor, not on 700€ "midrange" cards.
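For context, here's roughly what those internal resolutions look like. The scale factors are the commonly cited per-axis values for DLSS 2's modes and should be treated as approximate:

```python
# Commonly cited per-axis scale factors for DLSS 2's modes; the exact
# numbers are approximate and only here for illustration.
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def show_internal_res(out_w: int, out_h: int) -> None:
    """Print the internal render resolution for each mode at a given output."""
    for mode, s in MODES.items():
        print(f"{out_w}x{out_h} {mode:<17} -> renders at {int(out_w * s)}x{int(out_h * s)}")

show_internal_res(2560, 1440)   # the "2.5k monitor" case
show_internal_res(3840, 2160)   # 4K
```

Quality mode at 1440p renders internally at roughly 1706x960, which is the kind of input an entry-level card can actually push.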
There is no stark difference if you describe the techniques objectively instead of twisting them into what you feel they're like.
There are so many steps in the render pipeline where native resolution isn't used, yet I don't hear the crowd complaining about shadow map size or how reflections are half res (see the rough numbers below). Upscaling is just another tool that allows us to create better-looking frames at playable refresh rates. Compare Alan Wake or Avatar with DLSS against any other game without it and they will still come out on top.
Just because you're unhappy with Nvidia's pricing strategy doesn't mean you should slander new render techniques. You're mixing two different topics.
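To sketch what I mean about the pipeline: here's a rough list of buffers a typical deferred renderer might use at 4K output. The sizes are representative guesses for illustration, not taken from any specific engine:

```python
# Representative buffer sizes in a deferred pipeline at 4K output;
# not from any particular engine, just to show how many passes
# already run below output resolution.
buffers = {
    "G-buffer / lighting": (3840, 2160),
    "shadow map cascade": (2048, 2048),    # fixed size, independent of screen res
    "SSR (half res)": (1920, 1080),
    "SSAO (half res)": (1920, 1080),
    "bloom chain (first mip)": (960, 540),
}

native = 3840 * 2160
for name, (w, h) in buffers.items():
    print(f"{name:<24} {w}x{h} ({w * h / native:.0%} of output pixels)")
```

The shadow map alone is rendered at a resolution that has nothing to do with the output, and nobody calls that "not native."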