Yeah I mean, that sounds reasonable. There is a big difference between generating all your game assets with AI and using Claude to refactor methods and write docs.
Big difference but I would argue both require disclosure because I will opt out of any of it. Add it to the long list of bullshit in the gaming industry I will not condone with my money.
The problem is that it's unenforceable. I bet that's one of the reasons Valve is rephrasing.
Even pure AI art is unenforceable unfortunately. Like any form of cheating, some will be amateurish and obvious. But others will be sophisticated, skilled, and will simply blend into a gray area where you can't easily define a line.
How much "AI tool assistance" does it take before it's called "AI generated content"? It's totally arbitrary, and in many cases it's going to be completely unenforceable.
That doesn't mean it has no value, but it does mean it's not a silver bullet, and no amount of tweaking is going to make it one. We can use it to quickly weed out the obvious slop, while the well-crafted examples will pass beneath anyone's notice. When examples fall into the gray area, we'll all bounce around with arguments about who we believe and how much is normal and acceptable until we eventually reach an arbitrary, per-game consensus, or maybe adjust the "rules" a little to accommodate them. But nothing really changes: we'll probably be arguing about whether games contain "too much AI" for decades, and there will never be a clear solution or answer.
Sometimes it is, sometimes it's not. Better to make the rule and enforce it where they can than to just forget about it. Maybe some honest devs will disclose it.
Good luck finding a dev that doesn't want to use / isn't forced to use / doesn't lie about using AI tools.
At this point, if we're to shun all AI tools we might just give up the hobby.
While on principle I don't care about people using LLMs to refactor code in my games, I still think the "AI is inevitable" narrative is a bit jarring, and that study in particular has a huge conflict of interest issue.
At this point, if we're to shun all AI tools we might just give up the hobby.
There are plenty of good games made before this genAI nonsense started appearing.
Nothing is stopping me from making my own games without AI.
It will be worse in the future, because young people growing up with AI will find it 100% acceptable. Not everyone, of course...
You'll need to opt out of pretty much anything digital then, because almost every business has employees using AI in some form or fashion, since it's shoved down everyone's throats so hard.
Anything that I find that's digital and uses AI, I do opt out of, thank you.
I called an HVAC company several weeks ago and they had an AI agent answer the phone. I hung up and called someone else. No problem.
You better stop using Lemmy or your Lemmy client then.
Odds are astronomically high that they've used AI at some point on its development.
Just uninstall all games made after 2022 then, because I can assure you LLMs have been used for code in some capacity in every game. But I would argue there is a big difference between using AI for asset generation and using it to help read docs or get ideas for refactoring some code, etc.
Can I ask why you think that? AI has stolen code and art and is regurgitating both without any credit or attribution to the originators. What makes art different from code in your opinion?
I find the concept of stolen code to be a bit weird. Code isn't poetry; there is a correct way of doing things and there are incorrect ways of doing things.
If everybody does things the correct way then the code will be the same for any given problem. So is it stolen?
It's rather like how it's almost impossible to play any set of chords and have them not be from some prior work. It doesn't mean the music was stolen; it just means there is a limited number of ways you can combine notes, and if you further limit it to combinations that sound good, the set is even smaller.
If I may ask, what is your opinion on AI music? Do you think AI music is fine since there's a set number of chords and the AI is just combining them in a statistically plausible way?
Obviously I can't really answer that question. It's nuanced; I can't give a black and white answer.
Notice I said chords; music isn't just chords, though. I mentioned it because there have been copyright cases where people have tried to claim that they can own certain chords or certain chord progressions, and the courts have decided that isn't the case. You can own the composition but not the progression.
AI music is an entire piece, theoretically an original piece. You could of course make the argument that it's just cutting up bits of pre-existing work and sticking them back together, but you could make that argument of a human as well. Copyright law isn't really fit for the 21st century, and it certainly isn't fit to deal with the existence of AI, but that's nothing new. I can go online right now and find music that sounds like the Imperial March; is that copyright violation? The courts don't think so.
There is a big difference, and I'd argue the Claude refactoring is worse. Content was already chasing the lowest common denominator. But open source was a place where you could actually bring some nuance, examine things in detail, and build a shared understanding of deeper truths. Why bother with the icky social factors of working together to build something with people all around the world that can evolve and last for 10+ years, when you can boil a swimming pool to produce a half-baked one-off solution instead?
Because it'll be half-baked and one-off.
Jesus Christ, can we please stop trying to automate art and content creation?
As long as it is used as a tool that has human refinement as part of the process it would be comparable to CGI replacing background matte paintings and motion capture replacing manual manipulation of CGI to make movement. Gollum worked because of the blend of technologies that were replacing existing practices, but as a new approach and not a cost cutting measure.
The problem is entirely about using the output directly as a replacement for humans to cut costs, not having another tool that can be used as part of a process. This is coming from someone who absolutely hates LLM and genAI slop, which is taking the horrible approach.
That is, in fact, not the problem in its entirety.
The move to CGI didn't require stealing the artwork of the matte painters. The move to mocap didn't require raping the land of all its water. The move to either didn't require the entire world's supply of computing power, leaving it only affordable for the world's richest. The move to either didn't create a corporate circle jerk that damn near the whole world economy was propped up by.
So the thing is, the server farms for rendering Toy Story and ILM's server farms for rendering CGI were and are massive, but they were built over time and focused on a particular purpose. The same thing can be done with localized models for LLMs and genAI.
The giants who are buying all the parts and choosing to strain the grid and add polluting energy methods in order to stuff absolutely everything into massive all-in-one models are a related but distinct issue, with different goals that encourage shitty practices. Yes, if a game company uses the tools of the LLM and genAI slop producers, that is a negative. If they use homegrown, or at least dedicated, models that are closer in scale to what is already used for CGI, then it isn't automatically a negative.
It is like advertising. A little is fine, because awareness is needed for people to know something exists. Massively invasive methods of jamming advertising into literally every moment of the day are a problem. ChatGPT and OpenAI are the latter, and a problem. Or how Nestlé doing literally anything is horrible, even though other companies do the same things without being nearly as horrible.
I would prefer to know what AI tools were used, so that I can avoid the ones using the AI slop machines that are a negative.
I mean, if we're talking about efficiency tools for artists to use rather than straight up automatically generating the assets (not sure what those tools would be at the moment, but I've not been following what the AI industry actually releases for a while because it's always seemed a bit useless), then the result should be an increase in the output of those artists rather than replacing them with a statistical amalgam.
Tools like SpeedTree won't get dropped because the efficiency gains are enormous and the downside negligible.
Boooo
And there's the crack in the door needed to slither the rest of the way in. First they come for the copywriters and concept artists. Then it'll be writers of "boilerplate" code (I know it's already happening). Soon enough the disclosure will quietly disappear and we'll be completely drowned in slop.