Submitted 7 hours ago by tonytins@pawb.social to games@lemmy.world
https://insider-gaming.com/indie-game-awards-disqualifies-clair-obscur-expedition-33-gen-ai/
Seems a bit excessive.
There’s AI slop games, the new breed of asset flips.
And then there’s “a few of our textures were computer generated.”
For stuff like dirt/stone/brick/etc. textures I’m less strict about the use of generative stuff. I even think having an artist make the “core” texture and then using an AI to fill out the texture across the various surfaces, to make it less repetitive over a large area, isn’t a problem for me.
Like, I agree that these things generally are ethically questionable in how they are trained, but you can train them on ethically sourced data, and doing so could open up the ability to fill out a game world without spending a ton of time, leaving the actual artists more time to work on the important set pieces rather than the dirt road connecting them.
And giving studios like this an edge over AAAs. It’s the start of negating their massive manpower advantage.
In other words, the anti-corpo angle seems well worth the “cost” of a few generations. That’s the whole point of AI protest, right? It’s really against the corps enshittifying stuff.
Who made the textures or took the photos that the AI-generated ones were derived from, and do they get a cut? That justification is even more bizarre now, considering the tools we have to photoscan.
Let them have their award with their own rules.
Also, what about AI code tools? Like, if they use Cursor to help write some code, does that disqualify them?
If you do that and proceed to say “No, we didn’t use any AI tools,” then yes, that should be a disqualification.
“When it was submitted for consideration, representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33.”
Yeah.
A lot of devs may do it personally, even if it’s not a company imperative (which it shouldn’t be).
People have made it excessive by turning AI into a modern witch hunt. Maybe if people had a more nuanced take than “all AI bad,” companies could be more open about how they use AI.
I can guarantee that if E33 came out with the AI disclaimer it would’ve been far more controversial and probably less successful. And technically they should have an AI label because they did use Gen AI in the development process even if none of it was supposed to end up in the final game.
But we can’t have companies being honest because people can’t be normal.
“All genAI bad” is a nuanced take. When you look at genAI from a moral, ethical, or sociopolitical perspective it always demonstrates itself to be a net evil.
The core technology is predicated on theft, the data centers powering it are harmful economically and to surrounding communities, it is gobbled up by companies looking to pay less to profit more, and it’s powered by a bubble ripe for bursting which will wreak havoc on our economy.
GenAI is indefensible as a technology, and the applications it may have for any tangible benefit can probably be accomplished by ML systems not built on the back of the LLM monster. We should all be protesting its use in all things.
It’s not surprising when even people who like AI are now being affected by consumer hardware prices, which is leading to a shift in the previously positive perception of it.
People are being affected on the consumer side now, so its effects are hard to ignore. It’s gone from a philosophical difference to actual tangible consequences.
I have the same feeling about Kojima's and Vincke's latest comments on AI. Am I supposed to get mad at every single person who said they used/plan to use AI for something? I'd be as outraged as the average Fox News viewer, and it would be impossible to be taken seriously. I still won't be using AI myself (fuck surveillance state AI) and I'd be making every effort to encourage others not to use it, but there's no point in burning bridges and falling for rage bait.
They're creative people who care about the craft and care about the teams in their employ, which gives their statements weight, where some Sony/Microsoft/EA executive making an identical statement has none.
I understand the principle. Even if E33 is not “slop,” people should fear a road that leads to dependence on “surveillance state AI” like OpenAI.
That being said, I think a lot of people don’t realize how cheap and commoditized it’s getting. It’s not a monoculture, it’s not transcending. This stuff is racing to the bottom to become dumb tools, and honestly that’s something that makes a lot of sense for a game studio dev to want.
And E33 is clearly not part of the “Tech Bro Evangelism” camp. They made a few textures, with a tool.
Give it another 5 years maybe and local self-trainable models and alternative versions of it will be available that won’t have all the theft problems, surveillance problems and other issues. The tech is new and mainly controlled by giant companies right now.
It’s not like the tech is going to exist forever in a vacuum in the exact state it’s in; nothing ever does. Makes it doubly silly to get mad over a tool.
At the end of the day it's all about the quality in my opinion.
The entire game could be written by ONE passionate person who is awesome at writing the story and the code, but isn't good at creating textures and has no money for voice actors - in which case said textures and all the voices would be AI generated, then hand retouched to ensure quality. That would still be a good game, because obvious passion went into the creation of it, and AI was used as a tool to fill gaps in the sole developer's expertise.
A random software house automating a full-on pipeline that watches various trends on TikTok, Facebook, YouTube, etc., and chains together various genAI models to create slopware games by the dozens, on the other hand, is indefensible. There's no passion, there's no spirit, there's just greed and abuse of technology.
Differentiation between the two is super important.
So is the source.
If they’re paying a bunch of money to OpenAI for mega text prompt models, they are indeed part of the slop problem. It will also lead to an art “monoculture,” Big Tech dependence, code problems, all sorts of issues.
Now, if they’re using open weights models, or open weights APIs, using a lot of augmentations and niche pipelines like, say, hand sketches to 3D models, that is different. That’s using tools.
I kinda feel like Clair Obscur is sort of stretching the definition of indie game.
I guess _technically_ it is.
I’m not saying every game needs to be made in someone’s garage and take 12 years to make, but it sounds like this game was completely funded by Kepler and parts of the game were outsourced to other companies. Sandfall is made up of experienced developers from places like Ubisoft. Kinda feels like Brad Pitt and Tom Cruise made their own movie with funding from a lesser known subdivision of Warner Bros, outsourced SFX to 300 animators, and called it indie because they filmed it with 10 people.
I do think Clair Obscur is a fantastic game and deserves to be Game of the Year (aside from the AI use). Sandfall and Kepler did a great job with a reported budget of $10M(!) and I especially appreciate what Kepler is doing to support the gaming industry.
I guess I see the point of the award to inspire people to believe they shouldn’t give up on their dreams by recognizing small teams making games outside of the traditional industry. I just don’t feel like Sandfall qualifies.
In the end, it’s not my award and they can give it to whoever they want!
I agree with your take. The definition of an “indie” is very vague and subjective, but given the budget, resources, and circumstances of E33’s development, it falls outside what seems to be the “spirit of the award”.
Blue Prince should have gotten the award to begin with.
Just a note: it seems these were just in-production assets. Possibly placeholders?
Reminds me of the old days, when developers put copyrighted assets in as placeholders all the time. Occasionally they made it into the final release and caused trouble, but it was fairly common practice.
The only takeaway is that the Indie Game Awards’ rule is overly restrictive. Whoops, one of your contracted artists used a GenAI model to generate a music playlist to set the mood while he was working on your game; you’re disqualified, and the fact that you didn’t come forward with this information immediately makes you a liar. Obviously absurd. If they’re going to take a strong anti-AI stance, it should be more realistic. At some point, maybe even already, every single competitor should be disqualified but isn’t aware or forthcoming about it, so what’s the rule actually doing except rewarding dishonesty?
The GenAI asset was in the final release. It wasn’t that a subcontractor used GenAI to create a music playlist to listen to while they worked. That’s a very different thing.
They replaced the art later, but shouldn’t the bar be high like this? Otherwise, the caution won’t be there. It also could be abused, like games only getting adjusted post-launch if a certain measure of success hits. Plus, the final product is not the only part that matters in the was-AI-used discussion; it is also about the process. If AI is the product of stolen human artwork being fed into a machine, and that machine is then used during creation, then AI has been used in the process that led to the final product, no less than the concept art that may never be seen in game but was important in steering the ship.
Maybe someone can share their thoughts, though. I’m still formulating mine, and this is where I am at the moment.
There is no use of Gen AI in an indie game that should be tolerated. Period.
That’s just not going to happen.
Nearly any game with more than a few people involved is going to have someone use Cursor code completion, or use one for reference or something. They could pull in libraries with a little AI code in them, or use an Adobe filter they didn’t realize is technically GenAI, or commission an artist who uses a tiny bit in their workflow.
In any game, not just indies.
If we’re banning games over how they make concept art… I’m not sure how you expect to enforce that. How could you possibly audit that?
Are you putting coding tools in this bucket?
I feel like this is virtue signaling more than actually addressing a real problem with Clair Obscur.
Welcome to the internet. No one knows each other, no one considers context, no one reads past the headline, everyone makes snap judgements based on half understood heuristics, and then rushes to the comments to grandstand. A job that could be trivially done by AI, and almost certainly is, but instead we’ll all pretend like we’re the last bastion of human sanity.
Yeah.
Maybe a technicality, too. The rule said “no AI,” and E33 used AI, though in hindsight making such a hard restriction with the intent of filtering out slop games was probably unwise.
Like that the story is bifurcated and that the combat in the late game is parry or die?
Get good or lower the difficulty and stop crying.
What are the odds that a mechanic introduced to you in the first combat is a required component of the game? Crazy.
I think you should probably give up on gaming. Doesn’t seem like your scene: it’s for people that have the ability to process information and learn from it.
I've never pirated a game, but if developers are going to use pirated content to make a game, they can't be mad when we pirate their game.
Seems a few “artists” disagree with you.
"make a dirt texture"
Clair Obscur: Expedition 33 launched with what some suspected to be AI-generated textures that, as it clarified to El País, were then replaced with custom assets in a swift patch five days after release.
Fuck using Gen AI to replace human-made art, and fair enough for pulling the award, but I do think it’s worth making it clear exactly how much of the art is/was AI. And the answer is, very little at launch and none currently.
AI wasn't used to "replace human-made art", though.
To me it sounds like the team needed generic textures in big batches, and instead of spending precious designer time on hand crafting them, AI was utilised to allow the designers to focus on actual art they enjoy. I'm a software engineer, not a designer, but if I were given the option to write 8000 classes that are almost the same, or write 5 classes that will take the same effort as the 8000, but actually require using my creative skills... I'd choose the latter, and offload the 8000 boilerplates to AI.
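To make the boilerplate analogy concrete, here's a toy sketch (all names hypothetical, not from any real codebase) of the kind of near-identical classes being described, generated from a single template instead of hand-written thousands of times:

```python
# Toy illustration: many almost-identical "data holder" classes can be
# stamped out from one template, freeing the engineer for creative work.
def make_record_class(name, fields):
    """Build a simple data-holder class exposing the given fields."""
    def __init__(self, **kwargs):
        for field in fields:
            setattr(self, field, kwargs.get(field))
    # type(name, bases, namespace) creates the class dynamically.
    return type(name, (), {"__init__": __init__, "_fields": tuple(fields)})

# One call per class replaces a hand-written boilerplate file each.
ItemRecord = make_record_class("ItemRecord", ["id", "weight", "value"])
item = ItemRecord(id=7, weight=1.5, value=30)
print(item.id, item._fields)  # → 7 ('id', 'weight', 'value')
```

Whether the repetition is offloaded to a generator like this or to an AI assistant, the point is the same: the tedious near-duplicates aren't where the creative effort belongs.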
The fact that it was replaced with human made art so quickly suggests that the AI generated ones were meant to be placeholders only anyway.
That’s exactly the takeaway I got from it as well.
It seems most likely that those were placeholders that were supposed to be replaced before release but were missed. Once they realized that some were missing, they got them replaced and pushed the update.
GenAI being used for placeholder stuff is arguably the perfect use case, especially for small studios without massive art teams.
So instead of buying the textures they didn't want to create, they paid for AI to generate derived versions from stolen art??
What's the point? Just give the artists the money directly.
This feels like unnecessary absolutism and fear mongering. I am by no means an AI lover, but people shouldn’t let the worst implementations of something cloud their judgment.
I feel the question should be “Does this project use AI responsibly?” not “Was AI used?”
Maybe what we should be advocating for is transparency with these decisions?
Asking whether the project uses AI responsibly means you either need to define “responsibly” in a way that people can apply objectively (noting that everyone will have opinions about whether it’s a good definition), or you leave it undefined and the answer basically means nothing.
Unless the model that they used was trained entirely on artwork that was public domain, creative commons, licensed, or owned, then it's basically certain that it wasn't used responsibly.
You cannot make something on a foundation of someone else's exploitation and be considered responsible, ethical, original or independent.
They used AI for placeholder art.
It could’ve been literally Mickey Mouse downloaded from google images and still be fine. None of it was supposed to be shipped.
Big AAA studios have a person or team whose only job is to make sure that doesn’t happen. Smaller studios don’t.
So instead of "Did you pirate this game?", the question should be "Did you pirate this game, responsibly?"
Well, I just pirate games. Fuck responsibility.
Horrid article.
From the IGA’s statement:
[…] the use of gen AI art in production […] does disqualify Clair Obscur: Expedition 33 from its nomination. While the assets in question were patched out and it is a wonderful game, it does go against the regulations we have in place.
I think you have missed the actual issue here. The issue is not whether or not the game currently contains AI assets, the issue is whether AI was used during development. Quoting the article (emphasis mine):
“The Indie Game Awards have a hard stance on the use of gen AI throughout the nomination process and during the ceremony itself,” the statement reads. “When it was submitted for consideration, representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33.
“In light of Sandfall Interactive confirming the use of gen AI on the day of the Indie Game Awards 2025 premiere, this does disqualify Clair Obscur: Expedition 33 from its nomination.”
The actual problem is that simply using generative AI during development disqualifies a game from being nominated, and Sandfall Interactive lied and said they did not use gen AI.
So any game whose developer has used a recent version of VSCode will be disqualified in the future? VSCode has a GenAI autocomplete turned on by default.
One single question about an API to ChatGPT and your game is out.
Use Photoshops generative features for a marketing asset: out.
You get how insane the rule is?
You can only qualify if you write your game in vanilla vim with no extensions and graphics must be drawn in an old version of Gimp? 😆
The issue is not that the game was disqualified. If the rules clearly and unequivocally state that at no point can generative AI be used (and also clearly state what, in the spectrum from algorithms -> machine learning -> chatbot slop, they consider to be unacceptable, which I don’t know if they did or not, but that’s not the point), then there is no controversy, and I’m not criticising that.
The issue is that the article completely disregards mitigating facts that counter the narrative. There are no credible sources linked in the article save for one that was grossly misrepresented. Critically, we don’t know what Sandfall actually said before the nomination or after, or how the decision to disqualify was made, only the second-hand account in the FAQ. The article presents circumstances in a biased way, leading the reader to interpret it with the assumption that there are AI-generated assets currently in the game. It is, frankly, sloppy journalism.
was it an indie in the first place?
I mean, in the sense that they are self-published… yes, they are indie.
I thought they had a publisher
I heartily approve.
People are tearing each other up in the comments, but does anybody know what the textures were?
Just production assets from what I understand. None of it is in the game proper.
If this is the case then I don't care.
Placeholders and production assets are just random low-effort garbage somebody puts together, meant only for people internally to see how things work. No artist lost a paycheck or credit.
I don't see any difference between this and when Batman: Arkham Asylum was released with placeholders. Skyrim also had placeholder textures at release, as did Halo 2, Fallout: New Vegas, CoD MW2, and probably hundreds more. The only difference is that those were done by coders in Paint or something instead of generative AI.
As much as I hate to admit it, non-flagrant AI use will likely become generally accepted. The truth is that there’s a lot of content in games these days that sometimes just isn’t that important to dedicate man hours to it (Ex: Generic brick texture #431). The fact that this slipped through the cracks is proof enough.
However, slacking to the point that the end product looks obviously AI generated is just bad art. It’s pretty much akin to delegating to some shady third-party studio that works for pennies and spits out generic, low-quality stuff. This will continue to be frowned upon.
Ethics and copyright are, of course, different questions entirely. (In my opinion, most AI providers are committing blatant copyright infringement by using machines that crunch down copyrighted data and resell it back to you.) But it seems like Silicon Valley’s marketing and public relations teams managed to figure out the copyright one at this point.
Trained on stolen art of people who actually spent time making that brick texture?
Games are an art form; AI shouldn't be used at all.
You are all over this thread repeating variations of the same comment which, despite wildly different responses from voters, mostly show you do not at all understand how image model training and generation work.
This sort of absolutism is dead. Do you think they should be disqualified if they Google something and the answer is in Google’s AI summary?
If it’s trained on licensed material, would that be acceptable?
lol dumb disqualification. Game is overrated anyway
Insightful commentary, I’m glad you shared your thoughts with us.
I mean they got to give the other games a chance at winning something this year 😁
I hope this wasn’t used as an excuse to disqualify the game so they wouldn’t have to give it an award… Cause if it turns out it was, that would look really bad…
MITM0@lemmy.world 1 hour ago
It’s an AA game if anything.