this post was submitted on 15 Sep 2024
92 points (83.8% liked)
Games
We certainly can. NVIDIA’s CEO realizes that the next buzzword that sells their cards (8K, 240 Hz, RTX++) isn’t going to run at good framerates without it.
That’s not to say AI doesn’t have its place in graphics, but it’s definitely a crutch for extremely high-end rendering performance (see RT) and a nice performance and quality gain for weaker (hopefully cheaper) graphics cards which support it.
As a gamer and developer I sort of fear AI taking the charm away from rendered games as DLSS/FSR embeds itself in games. I don’t want to see a race to the bottom in terms of internal, pre-DLSS resolution.
With you there. The workload on developers is reduced with these features, to a degree. But instead of that saved effort being redirected into gameplay mechanics and the like, it feels like many devs just treat it as time/money saved, producing a game that looks and plays like one from 10 years ago but demands hardware as if it were cutting edge.
For instance, Abiotic Factor. On my RX 6800 XT that game runs at 40-50 fps at 100% resolution scaling at 1440p. Why? It has the fidelity of Half-Life 1; why does it need temporal upscaling to run well? (I adore that game btw, Abiotic Factor is so much fun and worth getting even if playing alone!) A rough sense of what that scaling setting means in pixel terms is sketched after this comment.
Not saying that's how every dev is; I know there are plenty of games coming out nowadays that look and run great, with creators that care. It just feels like too many games lean on these machine-learning-based features too heavily, resulting in blurriness, smearing, and shimmering on top of poorer performance.
Just hoping something like an RTX 4090 doesn't become the default cost of entry for playing PC games because of this. It would be unfortunate if game developers' ability to create and tune by hand became a lost art.
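To put rough numbers on the resolution-scaling point above, here's a quick back-of-the-envelope sketch. The percentages are illustrative values, not taken from Abiotic Factor's settings, and real games may scale per axis or by area:

```python
# Rough pixel-count arithmetic for resolution scaling at a 1440p output.
# The percentages are illustrative; real games may scale per axis or by area.

OUTPUT_W, OUTPUT_H = 2560, 1440  # 1440p target

def internal_resolution(scale_pct: float) -> tuple[int, int, int]:
    """Internal render width, height, and pixel count for a per-axis scale."""
    w = round(OUTPUT_W * scale_pct / 100)
    h = round(OUTPUT_H * scale_pct / 100)
    return w, h, w * h

native_pixels = OUTPUT_W * OUTPUT_H
for pct in (100, 75, 50):
    w, h, px = internal_resolution(pct)
    print(f"{pct:>3}% scaling -> {w}x{h} "
          f"({px / 1e6:.1f} MP, {px / native_pixels:.0%} of native)")
```

At 100% scaling that is only about 3.7 million pixels per frame, which is the complaint above: a game with Half-Life-1-level fidelity would normally be expected to handle that comfortably without upscaling.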
As a (non-game) developer, AI isn't even that great at reducing my burden.
The organization is enthusiastic about AI, so we set up the GitLab Copilot plugin for our development tools.
Even as "spicy autocomplete" only about one time in 4 or so it makes a useful suggestion.
There's so much hallucination: it tries to guess the next thing I want and usually lands on something that came out of its shiny metal ass. It actually undermines the tool's non-AI features, which pre-index the code to reliably complete fields and function names that actually exist.
I was going to argue "well, ray tracing is definitely a time saver for game developers because they don't have to manually fake lighting anymore." Then I remembered ray tracing really isn't AI at all... So yeah, maybe it helps artists who don't need to author textures in as much detail, because the AI models can "figure out" what things should presumably look like at higher detail.
I've been using FSR as a user in Hunt: Showdown and I've been very impressed with it as a 2K -> 4K upscale... It really helps me get the most out of my monitors, and it's approximately as convincing as a native 4K render (it's not nearly as convincing at lower resolutions, but that's kind of how these things go). I see the AI upscalers as a good way to fill in "fine detail" convincingly enough and do a bit better than traditional anti-aliasing. (Rough numbers for what that upscale works out to are sketched after this comment.)
I really don't see this as a developer time saver though, unless you just permit yourself to write less performant code... and then you're just going to get complaints in the gaming space. Writing the "Electron" of gaming just doesn't fly like it does with desktop apps.
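For a rough sense of the scaling involved in that 2K -> 4K case, here's a sketch using the per-axis scale factors commonly published for FSR 2; treat the exact factors as approximate, since they can differ by title and version:

```python
# Internal render resolution implied by FSR 2's commonly cited per-axis
# scale factors at a 4K (3840x2160) output. Factors are approximate and
# may differ per game or FSR version.

OUTPUT_W, OUTPUT_H = 3840, 2160

FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

for mode, factor in FSR2_MODES.items():
    w, h = round(OUTPUT_W / factor), round(OUTPUT_H / factor)
    share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:>17}: ~{w}x{h} internal ({share:.0%} of the 4K pixel count)")
```

Quality mode at a 4K output works out to roughly a 1440p internal render, which lines up with the 2K -> 4K experience described above.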
I know this is a bit late, but Copilot is only OK if used for code completion. I switched to the free tier of Supermaven a month ago and it's been way more helpful, as it handles context better. It probably cuts my coding time in half and takes away a third of the debugging.
Asking ChatGPT for code has also gotten better, but IMO it's still not reliable enough to use regularly. I just had it write some Docker code and it got it wrong three times, so I gave up on that.
I get your point: AI can only save time if you know exactly what you're doing, and even then it's only helpful sometimes. But when it is, it's such a time saver.
Mostly it really is just a fancier auto-complete. It is most useful in situations where you'd essentially copy and paste a block and then change a few predictable things in each copy. (A hypothetical example of that kind of code is sketched after this comment.)
It is total crap at writing code on its own, to the point where you need to read and understand the output to know it hasn't screwed up, which takes much, much longer than just writing it yourself.
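As an illustration of the near-copy-paste pattern described above, here's a small hypothetical Python snippet (the UserRecord type and to_api_payload function are made up for the example); each field line follows an obvious pattern, which is exactly where completion tools tend to do well:

```python
# Hypothetical example of the repetitive, near-copy-paste code where
# completion tools tend to shine: each line follows an obvious pattern.
from dataclasses import dataclass

@dataclass
class UserRecord:
    id: int
    name: str
    email: str
    created_at: str

def to_api_payload(user: UserRecord) -> dict:
    # After the first field, a completion model can usually predict the rest.
    return {
        "id": user.id,
        "name": user.name,
        "email": user.email,
        "created_at": user.created_at,
    }
```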
Yeah, AI as a dev is shit, but AI as a more thoughtful auto-complete is actually pretty great.
To me it looks like AIs are currently right at the boundary between being a tool and being a companion. But to be a full companion they can't just be up against that boundary; they need to be well established, tried, and tested as a companion to be used repeatedly, so we're still a few decades out from that as far as I can tell.
This was my fear when they announced they'd gotten an AI to generate Doom in real time with no code (well, code for the AI, but the game itself came from a prompt). Yeah, it's amazing, but they were running old-school Doom at 20 fps on top-end hardware, with additional AI hallucinations.
Next thing you know we'll be simulating an entire universe just to play bee simulator.