submitted 1 year ago* (last edited 1 year ago) by Tibert@jlai.lu to c/pcmasterrace@lemmy.world

(enough for the highest quality at 4K). Yes, the game seems to have s* optimisation.

RT = Ray Tracing, PT = Path Tracing, FG = Frame Generation

Source: https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/

[-] GenderNeutralBro@lemmy.sdf.org 60 points 1 year ago

This is simultaneously the reason I am not buying a current GPU and not buying recent AAA shit.

[-] rar@discuss.online 5 points 1 year ago

The industry needs to appreciate QA and optimization more than ever. I don't feel like getting the latest GPU for a couple of rushed and overpriced pieces of digital entertainment software, the same way I don't feel like getting the newest iPhone every year because of social pressure.

[-] Klaymore@sh.itjust.works 48 points 1 year ago

Since that GPU has 24 GB of VRAM, the game might be using more than it really needs, just because it can. The best way to test the importance of VRAM would be to get two cards of the same tier with different VRAM amounts (like the A770 8GB and 16GB) and see how that impacts performance.
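
That paired-card comparison can be sketched in a few lines. The fps numbers below are made-up placeholders for illustration, not figures from the TechPowerUp review:

```python
# Isolate VRAM impact: same GPU die, different VRAM (e.g. A770 8GB vs 16GB).
# All fps numbers are invented placeholders, not real benchmark results.
def vram_impact(fps_small: dict, fps_large: dict) -> dict:
    """Percent fps gained by the larger-VRAM card at each resolution."""
    return {
        res: round((fps_large[res] - fps_small[res]) / fps_small[res] * 100, 1)
        for res in fps_small
    }

a770_8gb  = {"1080p": 62, "1440p": 41, "4K": 9}    # collapses once VRAM overflows
a770_16gb = {"1080p": 63, "1440p": 43, "4K": 24}

print(vram_impact(a770_8gb, a770_16gb))
```

A small gap at 1080p/1440p but a huge one at 4K would point at VRAM capacity rather than raw GPU power.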

[-] Lojcs@lemm.ee 5 points 1 year ago* (last edited 1 year ago)

Looked at the review. The 4070 Ti (12G) and 3090 Ti (24G) scale similarly until 4K RT / 4K PT, at which point most 12G cards stop scaling and drop to a couple of fps. The 6700 XT (12G) and 7700 XT (12G) don't seem affected in RT. With PT, only the 7700 XT survives, with a whopping 7 fps. A similar thing happens to 8GB cards at 1440p.

Edit: edited out a750

[-] Hasuris@sopuli.xyz 3 points 1 year ago

According to the posted picture, this should happen at 1440p with >14GB VRAM used. It doesn't. 4K native is unplayable territory for every 12GB card anyway.

[-] Turun@feddit.de 3 points 1 year ago* (last edited 1 year ago)

There are also plenty of totally reasonable settings that require less than 12GB, 1440p maximum settings for example. If you want the best of the best, obviously you have to pay for the best of the best.

(It's still a lot and a minimum of 12GB is already ridiculous. I'm just saying the claim of 16GB being not enough is kinda dishonest)

[-] circuitfarmer@lemmy.sdf.org 31 points 1 year ago

I find myself saying "but why?" for all these spec requirements on Alan Wake 2. Is it some kind of monstrous leap forward in terms of technical prowess? Because usually outliers like this suggest poor optimization, which is bad.

[-] MangoPenguin@lemmy.blahaj.zone 11 points 1 year ago

Never seems like there's much benefit to the insane resource usage of modern games to me.

[-] circuitfarmer@lemmy.sdf.org 9 points 1 year ago

Some of the best fun I ever had was on something like 500MHz and 128 megs of RAM

(could be misremembering entirely, but the point is: not a lot)

[-] c0mbatbag3l@lemmy.world 7 points 1 year ago

Half the time they look a few years out of date as well as run like shit.

[-] ShadowRam@kbin.social 6 points 1 year ago

If these games would make proper use of resizable bar, VRAM size wouldn't be an issue.

[-] Skcyte@lemmy.dbzer0.com 3 points 1 year ago

Well, the game itself is an Nvidia-sponsored title, so you can expect shit hitting the fan. They want you to use their tech.

Yeah, as someone that got bored in the first part of the first one, what could possibly justify this for the series?

Honest question. Do they need to look like actual people before the shadow monsters or whatever attack?

Because mostly the series seemed to be about picking up collectables in the dark while hoping your flashlight doesn't go out.

[-] circuitfarmer@lemmy.sdf.org 9 points 1 year ago

I mean, I know many people like the series. I agree it doesn't seem like it should be terribly demanding though. I may just be wrong and maybe it's meant to be the best graphics ever, but I suspect that on release we'll see a lot of "meh" and potentially backlash if these reqs don't translate into something no one has seen before.

[-] nevemsenki@lemmy.world 16 points 1 year ago

At the same time, Armored Core 6 has pretty stunning visuals and runs pretty well even on a 2060. Almost like graphics can be done well with a good art style and optimisation, not just throwing more hardware at the issue.

[-] dingleberry@discuss.tchncs.de 6 points 1 year ago

AC is nowhere as visually stunning as AW.

[-] _sideffect@lemmy.world 15 points 1 year ago

I honestly couldn't give two fucks about how a game looks if it's going to cost me $2000 to run it.

[-] LoamImprovement@ttrpg.network 14 points 1 year ago

These requirements are such horseshit. What's the point of making everything look hyperrealistic at 4K if nobody can run the damn game without raiding NASA's control room for hardware?

[-] thedeadwalking4242@lemmy.world 7 points 1 year ago

That's the point of customizable settings

[-] HidingCat@kbin.social 11 points 1 year ago

Yikes, even 1440p isn't safe. My 12GB 6700 XT is looking a bit outdated already. It just barely has enough at max settings without the fancy stuff.

[-] kattenluik@feddit.nl 15 points 1 year ago

I think it's relatively easy to avoid these games, they're obviously not utilizing these resources well.

[-] GaMEChld@lemmy.world 10 points 1 year ago

It's ok, thanks to Nvidia's amazing value, I have a whopping 10GB on a 3080 that I paid way too much for! My old Vega 64, from 2017, had 8GB.

[-] altima_neo@lemmy.zip 4 points 1 year ago

Yeah, rip my 3080. Waited a whole year in that Evga queue to get it.

[-] treesquid@lemmy.world 9 points 1 year ago

My 3070 apparently can't run it in low detail at the native resolution of my monitor. Weak.

[-] vxx@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

Did you ever expect to play at 4K with that card?

I have an RTX 3060 and never thought that it would make sense.

[-] user_2345@lemmy.world 9 points 1 year ago

I really don't mind reading as a hobby, along with other IRL things. Games are kind of shitty nowadays.

[-] Matriks404@lemmy.world 8 points 1 year ago* (last edited 1 year ago)

Me with GTX 1060 3 GB: Ok.

That said, I am probably going to finally upgrade to RTX 3060 this year or next (or some AMD equivalent, if I am going to switch back to GNU/Linux).

[-] TheEighthDoctor@lemmy.world 8 points 1 year ago

I have a 3070 and I'm scared

[-] Kaldo@kbin.social 3 points 1 year ago

My 3060 Ti has been serving me very well; I've played games that look unbelievably good with it (Death Stranding, for example), but these recent requirements are crazy. Especially with UE5 games, I can't help but think it's just shitty optimization, because they don't look good enough to justify this.

[-] zepheriths@lemmy.world 8 points 1 year ago

My 2060 can't play any of these. Why is it so resource intensive?

[-] Player2@sopuli.xyz 13 points 1 year ago

While this is not a good thing, we have to remember that games will take advantage of more resources than needed if they're available. If keeping more things in memory just in case increases performance even a little bit, there's no reason that they shouldn't do it. Unused memory is wasted memory.
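
The caching behaviour described above can be shown with a toy model: a renderer keeps optional high-quality assets resident while the VRAM budget allows, so measured usage tracks card size rather than the true minimum. Asset names and sizes below are made up:

```python
# Toy illustration of "unused memory is wasted memory": optional assets stay
# resident in VRAM while the budget allows; required assets are always kept.
def resident_set(assets, budget_mb):
    """Greedily keep assets in VRAM, required ones first, until the budget is hit."""
    resident, used = [], 0
    for name, size, required in sorted(assets, key=lambda a: not a[2]):
        if required or used + size <= budget_mb:
            resident.append(name)
            used += size
    return resident, used

assets = [("geometry", 3000, True), ("base textures", 4000, True),
          ("hi-res mips", 6000, False), ("shadow cache", 4000, False)]

print(resident_set(assets, budget_mb=10_000))  # 8GB-class budget: optionals evicted
print(resident_set(assets, budget_mb=22_000))  # 24GB-class budget: everything stays
```

The same game "needs" 7 GB on the small card and happily uses 17 GB on the big one, which is why usage on a 24GB card overstates the real requirement.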

[-] MxM111@kbin.social 6 points 1 year ago* (last edited 1 year ago)

Joke's on you, I have a 15'' 1024x768 CRT monitor. So my older-generation RTX 3090 is just fine.
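
For scale, the pixel-count arithmetic behind the joke (standard resolutions, nothing assumed beyond 16:9 UHD):

```python
# A 1024x768 CRT pushes roughly a tenth of the pixels of a 4K panel,
# so the same GPU has about 10x less shading work per frame.
crt = 1024 * 768     # 786,432 pixels
uhd = 3840 * 2160    # 8,294,400 pixels
print(uhd / crt)     # ~10.5x
```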

[-] Hasuris@sopuli.xyz 6 points 1 year ago* (last edited 1 year ago)

If you'd read the whole thing, you'd have found that those numbers are overblown. The fps of a 4070 should tank at 1440p once it runs out of VRAM, but it doesn't.

Even with PT it's fine.

Optimization is supposedly fine, because it looks the part. Optimization doesn't mean making everything run on old hardware; it means making it run as well as possible. There's only so much you can do while retaining the fidelity they're going for.

[-] vxx@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

Hasn't this been an argument by PC gamers against consoles, that games aren't deliberately held back by old hardware?

[-] NightOwl@lemmy.one 5 points 1 year ago

I think it was mainly about games releasing with 30 or 60 fps hard caps back in the day, rather than graphics being held back.

[-] LinusOnLemmyWld@lemmy.world 5 points 1 year ago

laughing in 7900xtx

[-] jose1324@lemmy.world 5 points 1 year ago

Bro, I just got a 6950 XT. I thought I was set 🥲

[-] Prethoryn@lemmy.world 4 points 1 year ago

I have an RTX 4080 with 16GB of VRAM and won't be able to play it on my Samsung Odyssey G7 at 4K with max settings. That is wild.

[-] Anonymousllama@lemmy.world 4 points 1 year ago

Now that's some mighty GPU memory usage! Reminds me of some of the huge ass Blender renders I've done where it gobbles up all the memory it can 🍽️

[-] andrew_bidlaw@sh.itjust.works 3 points 1 year ago

Is it supposed to be released on consoles one day? Reading its specs, I don't think any current-gen one is enough for that.

[-] lustyargonian@lemm.ee 5 points 1 year ago

Yes it is.

Here's a preview from Digital Foundry on PS5

Solid 30 FPS, okay-ish 60 FPS.

https://youtu.be/JawxvOF__4Q?si=n2j3xUdUeOuqjTtI

[-] NightOwl@lemmy.one 3 points 1 year ago

It's been interesting seeing the commotion about Alan Wake 2's performance requirements, but I'm fine with it, since it's not something I'm planning to buy any time soon, if ever, given that it's an Epic exclusive.

Most likely I'll end up playing it years later, if it's ever given away, by which point I'll probably have upgraded my hardware.

[-] thantik@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

Until they start making 4K displays at 23", I'm not interested.

I'm so sick of monitors getting larger and larger like this. I sit about arm's length from my monitor, and even with a 27" I'm having to physically swivel my head to look at both the left and right sides of the screen.
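
The head-swivel complaint checks out geometrically. A quick sketch, assuming a flat 16:9 panel and an arm's-length viewing distance of roughly 60 cm (my assumption, not the commenter's number):

```python
import math

def horizontal_angle(diagonal_in: float, distance_cm: float, ratio=(16, 9)) -> float:
    """Horizontal visual angle (degrees) subtended by a flat panel."""
    w, h = ratio
    width_cm = diagonal_in * 2.54 * w / math.hypot(w, h)  # panel width from diagonal
    return math.degrees(2 * math.atan(width_cm / 2 / distance_cm))

print(round(horizontal_angle(27, 60), 1))  # ~53 degrees for a 27" panel
print(round(horizontal_angle(23, 60), 1))  # ~46 degrees for a 23" panel
```

Comfortable central vision is usually quoted at around 30-60 degrees, so a 27" panel at that distance already fills most of it.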

If I had an ultrawide that curved around me, and the software to split it up into 3 distinct areas, so that my immediate frontal field of view was all that center windows popped up in, or if games would allow me to put my HUD only in like a 60 degree field of view but still displayed the rest of the game in the periphery, I'd be happy.

But you want to fuck up my UI, make it so I have to physically turn my head to see the HUD elements, AAAAAND fuck my framerates hard? Nah. I'll just take the lower fidelity.

this post was submitted on 26 Oct 2023
153 points (88.1% liked)
