After GOTY pull, Clair Obscur devs draw line in sand over AI: 'Everything will be made by humans by us'
-
> Well, you see, the "everyone" you are referring to are the same stupid masses that already don't deserve respect on the macro level. The same stupid masses voting in political officials like Donald Trump.

Conflating multiple things together, are we? But hey, everyone who doesn't share your exact world view on all issues is a Nazi!
-
> Yes, it is indeed embarrassing that you insist on defending AI. My sister insisted on using an AI this year to generate our Secret Santa. She didn't get a gift. Ahahahaha. AI is strictly for stupid people that can't do good work on their own. No reasonably intelligent person is using AI to make a draft that they can then correct, because it will always be more practical to just make it yourself.

I mean, one of us is "defending" use of a technology that helps people, the other is publicly bashing their sister, implying she's stupid (which might or might not be true, not the point). I'll leave the conclusion to other readers, I have no more to say to a person like you.
-
> I'm not going to fault you for that - but do you think you should receive an award for the work you didn't do? In the case of this particular game, perhaps the bulk of the creative work was done by humans. But if the GOTY committee isn't confident drawing a line between what work is OK to offload onto an AI and what work isn't, then I think it's fair for them to say that, this year, any generative AI use is a disqualifier. You and I can say with ease that an implementation of a basic swap function (c=a, a=b, b=c) doesn't require any creative work and has been done to death, so there's no shame in copy-pasting something from Stack Overflow or ChatGPT into your own code to save time. But it's harder to gauge that for more complex things. Especially with art - where would you draw the line? Reference material? Concept art? Background textures or 3D models of basic props (random objects in the scene like chairs, trees, etc.)? I don't think there's a clear answer for that. You might have an answer you think is correct, and I might have one as well, but I think it will be difficult and time-consuming to achieve consensus in the game development community. So, the most efficient answer for now is to have any generative AI be a disqualifier.

I generally don't write code to get an award; it's just a tool to implement a business requirement. Sure, the committee can decide that absolutely no AI can be used; that doesn't mean I can't call it out as stupid because I disagree with it. They're within their rights to do that, and I'm free to not like it. It's not like I'm sending death threats or anything, I just stated my opinion.

I wouldn't draw the line at all. If you just tell AI "give me a texture for this and that" it will look like shit most of the time. You either have to be lucky or edit it anyway, let alone if you want consistency across multiple textures/models/whatever. So, if the end result is good, I don't particularly care. I never claimed my answer is the One True Answer, and I obviously think my answer is correct, as does everyone else who has an opinion. And I'm pretty sure the consensus can only be one of two things: either the AI companies crash because it's financially unsustainable to provide the services, or every game company uses it to generate various amounts of code/models/textures, etc.
I don't think having AI be disqualified is the most efficient answer at all, nor do I think it's an efficiency issue at all. The companies don't care about that. They pay analysts to tell them which of these decisions will bring them more money. For now the analysts decided to play it safe and disqualify it. One day one young analyst with something to prove will say "well, everyone's disqualifying it, why don't we try a different way" and slowly everyone's gonna allow it. IMO, anyway.
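For readers who want the concrete picture, the "basic swap" referenced a couple of comments up really is just three assignments. A minimal TypeScript sketch (function and variable names are illustrative, not from the thread):

```typescript
// The trivial swap referenced above: c=a, a=b, b=c.
// Names are illustrative only.
function swap(a: number, b: number): [number, number] {
  const c = a; // c = a
  a = b;       // a = b
  b = c;       // b = c
  return [a, b];
}

const [x, y] = swap(1, 2); // x === 2, y === 1

// Idiomatic TypeScript usually skips the temporary entirely:
// [a, b] = [b, a];
```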
-
> I mean, one of us is "defending" use of a technology that helps people, the other is publicly bashing their sister, implying she's stupid (which might or might not be true, not the point). I'll leave the conclusion to other readers, I have no more to say to a person like you.

More so, I'm concerned for my sister, but yes, I did "publicly" bash her. On Lemmy. Lol. The absurdity of your moral outrage is astounding. My sister is not above criticism just because I care about her. But don't worry, you didn't raise any good points and you failed to deflect away from my poignant anecdote. The readers will indeed side against you, because you have failed to produce a sound argument.
-
Dude, go touch grass, please. This is embarrassing.
-
Oh, someone did read a Wikipedia article! No, it was not "appeal to popularity"; it was meant to show that you'd be hard-pressed to find a product where at least a small part wasn't generated with AI, meaning singling this company out because they admitted it is bullshit.
-
Yeah, the AI slop hell that gives us terrible slop like Expedition 33. Shudder at the thought.
-
> I wouldn't call it a genius either, it's just all over the place. Sometimes it's scary good and predicts your next move, most of the time it's just okay, sometimes it's annoyingly bad.

My last job was on a fairly large TypeScript codebase (a few hundred kLOC) which we started some time before LLMs were a thing. While we weren't into academic engineering patterns and buzzwords, we were very particular about maintaining consistent patterns across the codebase. The output of Copilot, even with early models that were far from today's standards, was often scarily accurate. It was far from genius, but I'm still chasing that high to this day; to me it really indicated that we had made this codebase readable and actionable even for a new hire.
-
But I was responding to a specific comment with enough context to understand what I mean. If something's unclear about it, you can ask about it in a non-passive-aggressive way and maybe I can help you out.
-
> When you sample in music, you get the original artist's permission or you get fucking sued. If the AI used were trained on a licensed library catalogue, then sure. Media companies historically would buy sample licenses to use for their sound effects in movies, video games, etc., so AI could essentially just do that, but put the encyclopedia of samples in a blender of training to modulate that shit to make something somewhat "new" to be used. Original artists get royalties, users get something customized without having to hire sound engineers to make those adjustments, and consumers get good products.

> When you sample in music, you get the original artist's permission or you get sued

Not if it's short enough and is used in an otherwise entirely new work/remix.
-
You sound like a sane person. Here you go, a lie as you wished.
-
So, correcting your nonsense is "ad hominem" now, huh? Newsflash: If you're wrong and someone corrects you, it's not ad hominem. And what the hell is the nonsense about moving the goalpost? Do you read what you type or do you just tell an AI to write your comments for you?