[-] brucethemoose@lemmy.world 10 points 6 days ago* (last edited 6 days ago)

Anyone know if CP2077 runs better on Linux than Windows?

By much? With HDR?

Sorry for the drive by comment, but this is like the one game my 3090 can’t quite handle to my satisfaction. I've thoroughly disabled the thing from rendering in Linux and don’t want to undo all that… But if I could get like another 10% over Windows, that would be incredible. Even 5% would be awesome.

[-] GreenCrunch@lemmy.today 1 points 2 days ago

As promised, here's my own Cyberpunk testing of Windows vs. Linux on mostly the same hardware (they're on different SSDs, but I don't think that'll have a drastic impact).

TL;DR: Windows framerates seem inconsistent; the first benchmark I ran (the first Ultra without DLSS) was way faster with no explanation. Aside from that and Ray Tracing: Overdrive, Linux seems to win, and by a large margin (+28 FPS average on the Low preset seems ridiculous).

I don't think these results are broadly applicable to other machines. You probably won't get +28 FPS by switching to Linux.

My best guess is that the performance difference may have a lot to do with different power/thermal targets, or that Windows was doing a lot in the background (it was running an update, but I didn't expect a huge impact).

I'm guessing that on most hardware the performance difference will be pretty small.

Hardware: ROG Zephyrus G15 GA503QR laptop, Ryzen 9 5900HS, 16 GiB DDR4, RTX 3070 Laptop GPU, 2560x1440 screen (up to 165 Hz).

All benchmarks: plugged into the OEM power supply. I held the laptop vertically so there were no restrictions on its airflow.

Game: Cyberpunk 2077 v2.3 with Phantom Liberty DLC, fullscreen 2560x1440. Values are given as Min / Average / Max FPS as displayed by the game's built-in benchmark.

Linux (Bazzite 42): NVIDIA driver 575.64.05, Samsung 980 Pro 2 TB SSD, "Performance" power profile.

| Preset | Min | Average | Max |
| --- | --- | --- | --- |
| Low (no upscaling) | 57.49 | 68.42 | 83.86 |
| Ultra (no upscaling) | 32.91 | 39.27 | 49.71 |
| Ultra (DLSS Transformer model, Auto) | 41.11 | 48.70 | 61.30 |
| Ray Tracing: Low (DLSS Transformer model, Auto) | 44.12 | 51.70 | 61.63 |
| Ray Tracing: Ultra (DLSS Transformer model, Auto) | 29.24 | 34.26 | 39.81 |
| Ray Tracing: Overdrive (DLSS Transformer model, Auto) | 15.03 | 17.71 | 20.45 |

Windows (Windows 11 Home 23H2): GeForce Game Ready Driver 580.88, SK Hynix HFM001TD3JX013N SSD, "Turbo" power profile (in ASUS Armoury Crate).

| Preset | Min | Average | Max |
| --- | --- | --- | --- |
| Low (no upscaling) | 35.68 | 40.68 | 45.17 |
| Ultra (no upscaling) | 40.53 | 52.88 | 65 |
| Ultra (no upscaling, round 2) | 29.68 | 35.63 | 39.94 |
| Ultra (DLSS Transformer model, Auto) | 36.71 | 47.20 | 55.32 |
| Ray Tracing: Low (DLSS Transformer model, Auto) | 28.55 | 32.41 | 35.85 |
| Ray Tracing: Ultra (DLSS Transformer model, Auto) | 22.23 | 27.25 | 30.86 |
| Ray Tracing: Overdrive (DLSS Transformer model, Auto) | 17.74 | 19.96 | 22.64 |

[-] woelkchen@lemmy.world 10 points 6 days ago

this is like the one game my 3090 can’t quite handle to my satisfaction

Nvidia and Linux don't have the best history. Nvidia's drivers are not open source, so Valve's developers have no way to improve performance or fix bugs at the driver level.

Success stories of Linux gaming are usually about Radeon and Arc GPUs, whose drivers are fully open source.

[-] brucethemoose@lemmy.world 6 points 6 days ago

This is what I was afraid of, and reflects my experience in the past, unfortunately. I am intimately familiar with Nvidia’s drivers and my random Linux black screens…

I would have gotten a 7900 TBH, but prices were terrible at the time.

[-] woelkchen@lemmy.world 9 points 6 days ago

I don't run any hardware with an Nvidia GPU on Linux any longer, so I don't have recent firsthand experience, but I do follow Linux news, and every year or so it's announced that Nvidia is working on the last feature that's holding back perfection on Linux. Nvidia's drivers don't support implicit sync, but now that the Linux graphics layer supports explicit sync, the Nvidia drivers are making the "Final Steps Towards Ultimate Desktop Experience". Same BS every year. Nvidia is always lagging behind on Linux.

I'll consider using Nvidia with Linux, should Nvidia ever enter upstream kernel and Mesa development the same way AMD and Intel do.

I am intimately familiar with Nvidia’s drivers and my random Linux black screens…

Same here. At one point I was well versed in reinstalling the entire Linux graphics stack because the Nvidia driver's kernel module decided it was no longer compatible with the latest kernel update.

[-] brucethemoose@lemmy.world 4 points 6 days ago

I'll consider using Nvidia with Linux

Screw Nvidia.

I’d be on AMD if they weren’t price gouging just as bad (or worse), or on Intel if they offered 24GB+ cards for less than a car.

[-] naevaTheRat@lemmy.dbzer0.com 5 points 6 days ago

What? I have a 2060 and shit runs fine. Nvidia's drivers have improved a lot since the 2010s.

[-] woelkchen@lemmy.world 5 points 6 days ago* (last edited 6 days ago)

What? I have a 2060 and shit runs fine.

Of course. There are always people for whom everything runs fine. They're the ones who aren't affected by power-management bugs in Nvidia's drivers because they use desktop PCs, not laptops. They're the ones who were still on X11 five years after the rest of the Linux world moved to Wayland, and when Nvidia's drivers finally got good enough for Wayland, it's always "see how much Nvidia's drivers have improved since the 2010s!!"

Nvidia is lagging years behind on adopting newer technologies in the Linux graphics stack.

Edit: These days it's "HDR can cause game-breaking graphical artifacts".

[-] naevaTheRat@lemmy.dbzer0.com 5 points 6 days ago

I didn't say there are never any issues; I said it's fine. The idea that "success stories" are only AMD is silly. 90 times out of 100, unless you're using bleeding-edge hardware or are pathologically fussy, you just hit play and stuff works. 9 out of the remaining 10 times you tweak a Proton version or Wine setting; the other time it's a driver bug.

[-] brucethemoose@lemmy.world 3 points 5 days ago* (last edited 5 days ago)

Sometimes you don't know what you're missing though.

As an example, I figured out (on a 4900HS CPU/2060 GPU) that Stellaris and modded RimWorld game ticks are on the order of 40% slower running Linux-native, and still slower (but less dramatically so) under Proton. There was zero public information on this until I tested it myself.

As another example, modded Minecraft is dramatically faster on Linux.

They run fine, yeah, but game settings are effectively capped by CPU performance in all these titles. I don't have to know the difference, but I'd like to, hence I'm wondering about CP2077 from the opposite side: am I missing out on a boost from Linux?

[-] woelkchen@lemmy.world 2 points 6 days ago

The idea that "success stories" are only AMD is silly.

Luckily I didn't write that.

[-] brucethemoose@lemmy.world 2 points 5 days ago

Another good nugget, thanks (as I'd like to play in HDR).

[-] Mwa@thelemmy.club 3 points 6 days ago* (last edited 6 days ago)

And DirectX 12 (VKD3D) has issues on Nvidia as of this writing

[-] brucethemoose@lemmy.world 3 points 6 days ago* (last edited 6 days ago)

Thanks, these are the kind of nuggets I’m looking for.

Not that I blame the Vulkan translation layer at all. It's incredible that it even works on Nvidia.

[-] Mwa@thelemmy.club 1 points 5 days ago

I guess the fix came out?

[-] LucidNightmare@lemmy.dbzer0.com 6 points 6 days ago

Hey there! Recently downloaded Cyberpunk again to test my graphics card out.

openSUSE Tumbleweed, a 144 Hz 1080p ultrawide monitor (21:9), i9-10850K, Nvidia 5080, ray tracing and all settings on ultra, no DLSS fake frames, only DLAA.

I was getting 75-120 FPS (the 120 could be lower or higher, as I can't get to my computer right now) depending on what was on screen. In the city, with lots of neon and ads going while driving around? 75-80 FPS.

Inside a building, or away from any of the reflection-causing lights? 90-120.

I'm pretty sure my CPU is bottlenecking me for the most part, but it has never sweated at anything I've thrown at it, so I didn't see the need to upgrade just yet.

Hopefully that helps you out a little! I’ve got a lot of games I can report back on too, if needed! :)

[-] brucethemoose@lemmy.world 4 points 6 days ago* (last edited 6 days ago)

Thanks! Though it doesn't mean much without a Windows reference :P

I’m pushing my poor 3090 to 4K with just RT reflections but a bunch of mods, and I’m generally getting over 60 with no framegen (which is my target).

FYI I found the game actually looks better with most of the RT disabled:

  • RT shadows tend to be blocky and flicker, while raster shadows “miss” more shadows but are razor sharp and stable.

  • RT lighting is neat for, say, reflecting a neon billboard, but I find it often clashes with the built-in raster lighting. For instance, it turns neon signs into blobs and messes up the Arasaka atrium in the intro.

  • RT reflections look incredible, especially in rain. No downside as far as I can tell.

  • Path tracing is a whole different ballgame my card can't handle. But (when modded to fix it) it's apparently extra incredible, and it basically disables all the other in-game settings.

Check out the Digital Foundry video too, which shows some of this.

[-] LucidNightmare@lemmy.dbzer0.com 2 points 6 days ago

Good point about the Windows reference!

I will boot into Windows when I can and see the performance there. I'll report back after I run around the city and outside the city for a little bit!

I am curious to try out Nexus Mods' Linux compatibility with their new modding app, so I haven't gotten to mod the game yet. I wasn't going to play through it again (4th playthrough, lol) just yet.

I just remember that in the "cutscenes", like driving with Panam or Takemura, the RT looked better than the baked lighting. My 2080 Ti on Windows couldn't handle that all the time (less than 60 FPS with medium RT, no DLSS), but the way the "cutscenes" looked was just so much better with RT on that as soon as they started, I'd turn it on. :O

[-] brucethemoose@lemmy.world 2 points 6 days ago

Its RT reflections do most of the lifting while driving around, I think, but they only cost about a third of the FPS, while RT lighting and shadows are more subtle.

The settings may have been different in the past; I can't remember… I was playing on a laptop 2060, heh.

Thanks! I am curious, though I am glad to hear RT and such works well on Linux.

[-] GreenCrunch@lemmy.today 5 points 6 days ago

I'm not sure, as I've actually only played it under Linux. I have a laptop with an RTX 3070. It can handle the ray-traced Low preset at 1080p, but I just run High instead so the fan isn't as loud, and in my opinion that even looks pretty good. I might try starting it under Windows and running its benchmark, because I'm curious now! I'll update here if I remember to do this test.

[-] brucethemoose@lemmy.world 2 points 5 days ago

Also, you might be able to fix that!

I clock-limit my 3090 to around 1700-1750 MHz with nvidia-smi (built into the driver), since anything faster is just diminishing returns. You might check what "stable clocks" your 3070 runs at, cap them slightly lower, and even try an undervolt as well.

Be sure to cap the frame rate too.

Do that, and you might be able to handle RT reflections and otherwise similar settings without much noise. The hit for just that setting is modest on my 3090, but much heavier with the full "low" RT preset.
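
For reference, something like this; a rough sketch only, the clock numbers are illustrative placeholders (check what your own card actually sustains), and clock locking needs admin/root plus a driver/GPU that supports it:

```
# List the clock steps the card supports (support varies by GPU/driver)
nvidia-smi -q -d SUPPORTED_CLOCKS

# Lock the GPU core clock into a range, e.g. 300-1700 MHz
# (illustrative values, not tuned for any specific card)
nvidia-smi -lgc 300,1700

# Remove the lock and restore default boost behavior
nvidia-smi -rgc
```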

[-] GreenCrunch@lemmy.today 1 points 3 days ago

I'll have to look into whether I can mess with that! It's a laptop 3070, so they've already made some changes (fewer cores, lower boost clocks). My laptop sets a 100 W max TGP for it.

TBH, though, I've found myself caring more about the convenience of playing games (comfort, portability, ease of interrupting) than about graphics settings. Yeah, it's very pretty with ray tracing and all, but I'm totally fine with playing on medium or high.

Thanks for the ideas! Hopefully I can push the graphics up without turning into a pile of lava. I need to figure out how to record graphics power consumption so I can evaluate changes.

[-] brucethemoose@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

Thanks for the ideas! Hopefully I can push the graphics up without turning into a pile of lava. I need to figure out how to record graphics power consumption so I can evaluate changes.

It's far more efficient to TDP-limit your GPU than to lower settings to try to get power consumption (and laptop fan speed) down. The GPU will stick to slightly lower clocks, which is disproportionately effective because lower clocks also mean lower voltage, and power consumption scales roughly with the square of voltage.

Otherwise it will always try to boost to 100W anyway.
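
As a rough back-of-the-envelope (illustrative round numbers, not measurements from this laptop):

```latex
P_\mathrm{dyn} \propto f V^2
\quad\Rightarrow\quad
\frac{P'}{P} = \frac{f'}{f}\left(\frac{V'}{V}\right)^2
\approx 0.9 \times 0.9^2 \approx 0.73
```

i.e., giving up ~10% of clock (and the ~10% of voltage that comes with it) cuts dynamic power by roughly a quarter.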

You can do this easily with MSI Afterburner, or in Windows with just the command line. For example, `nvidia-smi -pl 80` will set the power limit to 80 W (until you restart your PC). `nvidia-smi` by itself will show all its default settings.
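
It's worth checking what range your card/driver will actually accept before picking a number; a quick sketch (field names vary slightly between driver versions, and setting the limit needs admin rights):

```
# Show min/default/max enforceable power limits
nvidia-smi -q -d POWER

# Then set a limit inside that range (resets on reboot)
nvidia-smi -pl 80
```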

I do this with my 3090, and dropping from the default 420W to 300W hardly drops performance at all without changing a single graphics setting.

Alternatively, you can hard-cap the clocks to your GPU's "efficient" range. For my 3090 that's somewhere around 1500-1700 MHz, and TBH I do this more often, as it wastes less power on the GPU clocking up to uselessly inefficient voltages, but still lets it "power up" for really intense workloads.

FYI you can do something similar with the CPU too, though it depends on the model and platform.
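
On the "record graphics power consumption" question quoted above: nvidia-smi can log that by itself. A minimal sketch (the output filename is just an example), sampling once per second until you Ctrl-C:

```
# Log power draw, clocks, temperature, and load to a CSV, one sample per second
nvidia-smi --query-gpu=timestamp,power.draw,clocks.sm,temperature.gpu,utilization.gpu \
           --format=csv -l 1 > gpu_log.csv
```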

[-] GreenCrunch@lemmy.today 2 points 3 days ago

Thank you very much, kind graphics wizard. I will put this knowledge to good use saving my ears from that fan. This is exactly what I was looking for!

[-] kadup@lemmy.world 4 points 6 days ago

With path tracing it runs significantly worse than it does on Windows. Without it, it runs roughly the same. RTX 4060 Ti.

[-] brucethemoose@lemmy.world 2 points 5 days ago

Awesome, thanks!

[-] monotremata@lemmy.ca 4 points 6 days ago

I think there's huge variability, but as a gross overgeneralization: AMD GPUs run Cyberpunk 2077 a bit faster on Linux than on Windows, and Nvidia GPUs run it a bit slower on Linux than on Windows.

If you've got a spare USB hard drive, you could always install Linux there for a test drive. You might be able to find a setup that gets you the extra performance you're looking for.

[-] brucethemoose@lemmy.world 4 points 6 days ago* (last edited 6 days ago)

I already dual-boot CachyOS! In fact, I spent a lot of time tweaking schedulers, power, undervolting the GPU, and such for compute performance, but I think it's well tuned for gaming too.

It's just annoying because I beat the GPU into submission with tons of settings (as Nvidia is funny with Wayland), so its display output is totally disabled. It's a lot to undo.

[-] monotremata@lemmy.ca 3 points 6 days ago

See, that makes it sound to me like you could probably come up with a setup that would do what you want, but that doing so would probably mean making it worse at some of the other things you currently use it for.

Which is where using an external drive for a third installation might be easier. Or at least easier to dispose of if you get sick of the project. But I am perhaps unusually lazy in that regard.

[-] brucethemoose@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

You raise an excellent point.

TBH I am both lazy and a bit paranoid/afraid of dealing with Nvidia rendering issues (even when using my iGPU for desktop work), but it would probably be fine and I'm... just being lazy and paranoid.

I don't think it would make it worse for compute work.

An external third partition does sound appealing, though one quirk is that CP2077 really likes fast SSDs. I have a slow external SSD, but it still might muddy an A/B test.

[-] BombOmOm@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

If you have a desktop, these work great for swapping SSDs out. Get a pair and swap them whenever you need/want to. You just need a spare x4 (or larger) PCIe slot, which is pretty common to have. (Technically they work fine in an x1 slot, but then you're slowing the SSD down.)

[-] brucethemoose@lemmy.world 2 points 6 days ago

I’ve got an ITX mini PC with both nvme and the graphics slot filled, heh.

Anyone know if CP2077 runs better on Linux than Windows?

That's entirely dependent on a whole host of things: CPU, GPU, distro (mostly kernel version), open-source vs. proprietary drivers, Proton version, etc. Also, some numbers can artificially look better if a feature is just straight-up ignored by Proton, or just broken. If you're looking for bleeding-edge features, then probably not.

[-] brucethemoose@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

7800X3D, Nvidia 3090, CachyOS, the latest Arch kernel with whatever tweaks they have, and I assume git Proton and all the distro's riced settings. On CP2077's side, I'd like RT reflections and DLSS as the only exotic settings, though I did run a mod that hacks in FSR 3.1 framegen.

I realize I probably have to test this myself, heh. But from what I gather (and from past experience on a laptop 2060 with Linux), Nvidia is disadvantaged on Linux in this scenario.
