Comment on Indie Game Awards Disqualifies Clair Obscur: Expedition 33 Due To Gen AI Usage
rtxn@lemmy.world 12 hours ago
Horrid article.
From the IGA’s statement:
[…] the use of gen AI art in production […] does disqualify Clair Obscur: Expedition 33 from its nomination. While the assets in question were patched out and it is a wonderful game, it does go against the regulations we have in place.
Rakqoi@piefed.blahaj.zone 9 hours ago
I think you have missed the actual issue here. The issue is not whether or not the game currently contains AI assets, the issue is whether AI was used during development. Quoting the article (emphasis mine):
The actual problem is that simply using generative AI during development disqualifies a game from being nominated, and Sandfall Interactive lied and said they did not use gen AI.
lepinkainen@lemmy.world 7 hours ago
So any game whose developer has used a recent version of VSCode will be disqualified in the future? VSCode has a GenAI autocomplete turned on by default.
Ask ChatGPT one single question about an API and your game is out.
Use Photoshop's generative features for a marketing asset: out.
You get how insane the rule is?
You can only qualify if you write your game in vanilla vim with no extensions, and graphics must be drawn in an old version of Gimp? 😆
tomalley8342@lemmy.world 2 hours ago
So insane that people have to go back to the primitive workflows of… 2021 🤣
lepinkainen@lemmy.world 1 hour ago
The problem is that Chinese, Indian and Turkish developers couldn’t care less about western AI purity tests and will blast past any competition who does.
Unless Captain AI Planet stops them, the cat is out of the bag and not going back.
If you want to run an AAA live service game, the amount of content you put out every X weeks is how you make money, and whoever keeps up the best amount/quality ratio will always win.
The average gamer won’t care if the latest Gooner F2P or FPS game DLC is AI generated, AI assisted or lovingly hand crafted. They’ll throw their money at it anyway…
rtxn@lemmy.world 9 hours ago
The issue is not that the game was disqualified. If the rules clearly and unequivocally state that at no point can generative AI be used (and also clearly state what, in the spectrum from algorithms -> machine learning -> chatbot slop, they consider to be unacceptable, which I don’t know if they did or not, but that’s not the point), then there is no controversy, and I’m not criticising that.
The issue is that the article completely disregards mitigating facts that counter the narrative. There are no credible sources linked in the article save for one that was grossly misrepresented. Critically, we don’t know what Sandfall actually said before the nomination or after, or how the decision to disqualify was made, only the second-hand account in the FAQ. The article presents circumstances in a biased way, leading the reader to interpret it with the assumption that there are AI-generated assets currently in the game. It is, frankly, sloppy journalism.
Maestro@fedia.io 7 hours ago
Do you know where those rules are? I'm genuinely interested in where exactly they draw the line. I constantly see people ranting about gen AI when used for art, but even simple, basic code autocomplete is AI under the hood these days. I can't imagine developers not using autocomplete.
rtxn@lemmy.world 7 hours ago
The rules are on the same page I linked (www.indiegameawards.gg/faq), under the “Game Eligibility” tab. Regarding AI, the document contains a grand total of one sentence:
I’m assuming the definition of what that entails is “at their discretion”, meaning whatever they feel like at the moment. I see that sentiment reflected in this thread too.
Unfortunately the boundary between “AI” and “not AI” is the polar opposite of sharp and well-defined. I’ve used Allegorithmic Substance Designer a lot for CGI work (before Adobe ate the devs; fuck Adobe, all my homies hate Adobe), and it contains a lot of texture generator algorithms from simple noise to complex grunge textures. Things like Perlin noise and Voronoi diagrams are well-known algorithms. What about an algorithm that uses real-world samples? Machine learning is not the same as AI, so is that allowed? Where’s the line? I’m reasonably certain that everybody has a different answer based on different criteria.
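To illustrate the point about classic texture generators: something like a Voronoi pattern is a plain deterministic algorithm, no training data or model involved. A minimal sketch (hypothetical, purely illustrative; function and parameter names are my own, not Substance Designer's):

```python
import math
import random

def voronoi_distance_field(width, height, n_seeds, seed=42):
    """Classic Voronoi texture: each pixel's value is the distance
    to the nearest of a set of random seed points. Entirely
    algorithmic and reproducible from the seed -- nothing learned."""
    rng = random.Random(seed)
    points = [(rng.uniform(0, width), rng.uniform(0, height))
              for _ in range(n_seeds)]
    field = []
    for y in range(height):
        row = []
        for x in range(width):
            # Distance to the nearest seed point (Euclidean).
            d = min(math.hypot(x - px, y - py) for px, py in points)
            row.append(d)
        field.append(row)
    return field

field = voronoi_distance_field(16, 16, 5)
```

A rule banning "AI" with no definition has to somehow decide whether this kind of procedural generation counts, which is exactly the line-drawing problem.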
I gave them the benefit of the doubt and assumed that they had defined the exact line of what is and isn't allowed, but apparently I was wrong.