After GOTY pull, Clair Obscur devs draw line in sand over AI: 'Everything will be made by humans by us'
-
If there was an AI that licensed every bit of art/code/etc that it trained on, then I think I would be fine if they used it. BUT, I'd never think their final product was ever more than just a madlib of other people's work, cobbled together for cheap commercial consumption. My time is worth something, and I'm not spending a minute of it on AI-generated crap when I could be spending it on the product of a true author, artist, coder, craftsman. They deserve my dollar, not the AI company and their AI-using middle man who produced shit with it.

I mean, they mostly used it for textures, right? Those are often generated from a combination of noise and photography; it's not like they were building the game out of lego bricks of other people's art. I don't see how it's significantly different from sampling in music, it's just some background detail to enhance the message of the art. Obviously modern AI is a nightmare machine that cannot be justified, but I could imagine valid artistic uses for the hypothetical AI you described.
-
Uh... Sorry but no, LLMs are definitely fast enough. It works just like autocomplete, except sometimes it's a genius that pulls the next few lines you were about to write out of the ether, and sometimes it makes up a library to do something you never asked for. Mostly it works about as well as code completion software, but it'll name variables much better.

I believe you that genAI is being used for code suggestions, but I wouldn't call it a genius. This is anecdotal, but over the last couple of years I've noticed Visual Studio's autocomplete went from suggesting exactly what I wanted more often than not to just giving me hot garbage today. Like even when I'm 3-4 lines into a very obvious repeating pattern, it'll suggest some nonsense variation on it that's completely useless. Or just straight up making up variable and function names that don't exist, or suggesting inputs to function calls that don't match the signature. Really basic stuff that any kind of rules-based system should have no problem with.
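To make the complaint above concrete, here is a minimal, self-contained sketch of the kind of "obvious repeating pattern" being described; every name and value is invented for illustration, nothing here comes from Visual Studio itself:

```python
# A minimal, invented example of the "obvious repeating pattern" described above.
# Nothing here is taken from Visual Studio; the names are made up for illustration.

row = {"width": "640", "height": "480", "depth": "32"}
config = {}

# A human (or a simple rules-based completer) extends the pattern mechanically:
config["width"] = int(row["width"])
config["height"] = int(row["height"])
config["depth"] = int(row["depth"])

# The complaint is that an LLM-based suggester, shown the first two lines, will
# sometimes propose a broken variation instead, e.g. a key or function that
# doesn't exist in this code: config["dept"] = float(row["depht"])

print(config)  # {'width': 640, 'height': 480, 'depth': 32}
```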
-
This post did not contain any content.
-
Just to point out, LLMs are genAI. Lots of code editors provide code suggestions similar to autocorrect/text suggestions using AI. Strictly speaking, I doubt any game is made without AI. Not to say it can't be deliberately avoided, but given the lack of opposition to GPT and LLMs, I don't see it being considered for avoidance in the same way as art. So awards with constraints on "any AI usage in development" probably disqualify most modern games.
-
This post did not contain any content.

I'm not surprised they ultimately felt like GenAI isn't useful to what they're trying to do. Game dev has known about this type of generation for a while now (see https://en.wikipedia.org/wiki/Model_synthesis and the sources linked at the bottom) and it takes a lot of human effort to curate both the training data and the model weights to end up with anything that feels new _and_ meaningful.

If I shuffle a deck of 52 cards, there is [a high chance](https://math.stackexchange.com/questions/671/when-you-randomly-shuffle-a-deck-of-cards-what-is-the-probability-that-it-is-a) of obtaining a deck order that has never occurred before in human history. Big whoop. GenAI is closer to [sexy dice](https://www.scribd.com/document/605580945/Sexy-Dice-Game-Printables) anyways - the "intelligent work" was making sure the dice faces always make sense when put together and you don't end up rolling "blow suck" or "lips thigh".

It's very impressive that we're able to scale this type of apparatus up to plausibly generate meaningful paragraphs, conversations, and programs. It's ridiculous what it cost us to get it this far, and just like sexy dice and card shuffling I fail to see it as capable of replacing human thought or ingenuity, let alone expressing what's "in my head". Not until we can bolt it onto a body that can feel pain and hunger and joy, and we can already make human babies much more efficiently than what it takes to train an LLM from scratch (and _they_ have personhood and thus rights in most societies around the world).

Even the people worried about "AI self-improving" to the point it "escapes our control" don't seem to be able to demonstrate that today's AI can do much more than slow us down in the long run; [this study](https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/) was published over 5 months ago, and they don't seem to have found much [since then](https://metr.org/research/).
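For anyone who wants the scale behind the card-shuffling aside, here is a rough back-of-the-envelope check in Python. The "shuffles ever performed" figure is a deliberately generous, made-up upper bound, not a real statistic; only the order of magnitude matters:

```python
import math

# Number of distinct orderings of a 52-card deck.
orderings = math.factorial(52)
print(f"52! ≈ {orderings:.3e}")  # ≈ 8.066e+67

# Deliberately generous, invented upper bound on shuffles ever performed:
# 10 billion people, 100,000 shuffles each.
shuffles_ever = 10**10 * 10**5

# Even then, the fraction of possible orderings anyone could have seen is tiny,
# so a well-shuffled deck is almost certainly in an order no one has ever seen.
print(f"fraction ever seen ≤ {shuffles_ever / orderings:.1e}")  # ≈ 1.2e-53
```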
-
VSCode autocomplete is literal AI now; they don't have IntelliSense. Others will follow, making "no AI" software really difficult to prove.
-
Just to point out, LLMs are genAI. Lots of code editors provide code suggestions similar to autocorrect/text suggestions using AI. Strictly speaking, I doubt any game is made without AI. Not to say it can't be deliberately avoided, but given the lack of opposition to GPT and LLMs, I don't see it being considered for avoidance in the same way as art. So awards with constraints on "any AI usage in development" probably disqualify most modern games.

"AI" does not exist. There is no such thing. It does not exist at present. Full stop. Genetic algorithms and complex algorithms are neat. They're useful tools. They're not "AI", they're algorithms. It's fine. Large X models and diffusion models are bullshit machines. They were invented as bullshit machines. They are scams. Their evangelists are scammers. They do not 'generate' except in the way a spark gap jammer 'generates'. They're noise machines. You do not know what the fuck you're talking about. You are a cultist. Go back to sexting with ELIZA.
-
I believe you that genAI is being used for code suggestions, but I wouldn't call it a genius. This is anecdotal, but over the last couple of years I've noticed Visual Studio's autocomplete went from suggesting exactly what I wanted more often than not to just giving me hot garbage today. Like even when I'm 3-4 lines into a very obvious repeating pattern, it'll suggest some nonsense variation on it that's completely useless. Or just straight up making up variable and function names that don't exist, or suggesting inputs to function calls that don't match the signature. Really basic stuff that any kind of rules-based system should have no problem with.

I wouldn't call it a genius either, it's just all over the place. Sometimes it's scary good and predicts your next move, most of the time it's just okay, sometimes it's annoyingly bad.
-
I mean, they mostly used it for textures, right? Those are often generated from a combination of noise and photography; it's not like they were building the game out of lego bricks of other people's art. I don't see how it's significantly different from sampling in music, it's just some background detail to enhance the message of the art. Obviously modern AI is a nightmare machine that cannot be justified, but I could imagine valid artistic uses for the hypothetical AI you described.

When you sample in music, you get the original artist's permission or you get fucking sued. If the AI used were trained on a licensed library catalogue, then sure. Media companies historically would buy sample licenses to use for their sound effects in movies, video games, etc., so AI could essentially just do that, but put the encyclopedia of samples in a blender of training to modulate that shit to make something somewhat "new" to be used. Original artists get royalties, users get something customized without having to hire sound engineers to make those adjustments, and consumers get good products.
-
When you sample in music, you get the original artist's permission or you get fucking sued. If the AI used were trained on a licensed library catalogue, then sure. Media companies historically would buy sample licenses to use for their sound effects in movies, video games, etc., so AI could essentially just do that, but put the encyclopedia of samples in a blender of training to modulate that shit to make something somewhat "new" to be used. Original artists get royalties, users get something customized without having to hire sound engineers to make those adjustments, and consumers get good products.
-
Bullshit. "Everyone" in your comment was a clear "appeal to popularity" argument, made to _other_ anyone not on the bandwagon. It deserves to be called out for what it is.

Oh, someone did read a Wikipedia article! No, it was not "appeal to popularity," it was meant to show that you'd be hard pressed to find a product where at least a small part wasn't generated with AI, meaning singling this company out because they admitted it is bullshit.
-
Well, you see, the "everyone" you are referring to are the same stupid masses that already don't deserve respect on the macro level. The same stupid masses voting in political officials like Donald Trump.

Conflating multiple things together, are we? But hey, everyone who doesn't share your exact world view on all issues is a Nazi!
-
Yes, it is indeed embarrassing that you insist on defending AI. My sister insisted on using an AI this year to generate our secret santa. She didn't get a gift. Ahahahaha. AI is strictly for stupid people that can't do good work on their own. No reasonably intelligent person is using AI to make a draft that they can then correct, because it will always be more practical to just make it yourself.

I mean, one of us is "defending" use of a technology that helps people, the other is publicly bashing their sister, implying she's stupid (which might or might not be true, not the point). I'll leave the conclusion to other readers, I have no more to say to a person like you.
-
I'm not going to fault you for that - but do you think you should receive an award for the work you didn't do? In the case of this particular game, perhaps the bulk of the creative work was done by humans. But if the GOTY committee isn't confident with drawing a line between what work is OK to offload onto an AI and what work isn't, then I think it's fair for them to say this year that any generative AI use is a disqualifier. You and I can say with ease that an implementation of a basic swap function (c=a, a=b, b=c; sketched below) doesn't require any creative work and has been done to death, so there's no shame in copy-pasting something from stackoverflow or chatgpt into your own code to save time. But it's harder to gauge that for more complex things. Especially with art - where would you draw the line? Reference material? Concept art? Background textures or 3d models of basic props (random objects in the scene like chairs, trees, etc)? I don't think there's a clear answer for that. You might have an answer you think is correct, and I might have one as well, but I think it will be difficult and time consuming to achieve consensus in the game development community. So, the most efficient answer for now is to have any generative AI be a disqualifier.

I generally don't write code to get an award, it's just a tool to implement a business requirement. Sure, the committee can decide that absolutely no AI can be used; that doesn't mean I can't call it out as stupid because I disagree with it. They're within their rights to do that, and I can still not like it. It's not like I'm sending death threats or anything, I just stated my opinion.

I wouldn't draw the line at all. If you just tell AI "give me a texture for this and that" it will look like shit most of the time. You either have to be lucky or edit it anyway. Let alone if you want consistency across multiple textures/models/whatever. So, if the end result is good, I don't particularly care. I never claimed my answer is the One True Answer, and I obviously think my answer is correct, as does everyone else who has an opinion. And I'm pretty sure the consensus can only be one of two things: either the AI companies crash because it's financially unsustainable to provide the services, or every game company uses it to generate various amounts of code/models/textures etc.

I don't think having AI be disqualified is the most efficient answer at all, nor do I think it's an efficiency issue at all. The companies don't care about that. They pay analysts to tell them which of these decisions will bring them more money. For now the analysts decided to play it safe and disqualify it. One day one young analyst with something to prove will say "well, everyone's disqualifying it, why don't we try a different way" and slowly everyone's gonna allow it. IMO, anyway.
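For completeness, here is the "basic swap" from the quoted comment, written out as a minimal runnable sketch, with Python's idiomatic one-liner noted for contrast:

```python
# The three-assignment swap referenced above (c = a, a = b, b = c), written out.
def swap(a, b):
    c = a   # stash the first value
    a = b   # overwrite the first slot with the second value
    b = c   # move the stashed value into the second slot
    return a, b

print(swap(1, 2))  # (2, 1)

# In Python the idiomatic form is simply: a, b = b, a
```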
-
Conflating multiple things together, are we? But hey, everyone who doesn't share your exact world view on all issues is a Nazi!
-
I mean, one of us is "defending" use of a technology that helps people, the other is publicly bashing their sister, implying she's stupid (which might or might not be true, not the point). I'll leave the conclusion to other readers, I have no more to say to a person like you.

More so, I'm concerned for my sister, but yes, I did "publicly" bash her. On lemmy. Lol. The absurdity of your moral outrage is astounding. My sister is not above criticism because I care about her. But don't worry, you didn't raise any good points and you failed to deflect away from my poignant anecdote. The readers will indeed side against you, because you have failed to produce a sound argument.
-
Dude, go touch grass, please. This is embarrassing.
-
Oh, someone did read a Wikipedia article! No, it was not "appeal to popularity," it was meant to show that you'd be hard pressed to find a product where at least a small part wasn't generated with AI, meaning singling this company out because they admitted it is bullshit.