After GOTY pull, Clair Obscur devs draw line in sand over AI: 'Everything will be made by humans, by us'
-
> "When AI first came out in 2022, we'd already started on the game. It was just a new tool, we tried it, and we didn't like it at all. It felt wrong."

I'm willing to give them the benefit of the doubt on this one. I'm pretty hardcore anti-AI these days, but when it was just hitting the masses and it was the shiny new toy, I was ignorant about the specifics and tried it out here and there. So this specifically resonates with me.

> Broche then drew a line in the sand. He mused that it would be hard to predict how AI might be used in the gaming industry in the future, and declared, "But everything will be made by humans, by us."

I hope they stick to their word on this, but only time will tell in that regard.
-
> transformative use or transformation is a type of fair use that builds on a copyrighted work in a *different manner or for a different purpose from the original*, and thus does not infringe its holder's copyright.

You can use a book to train an AI model, but you can't sell a translation just because you used AI to translate it. These are two different things. Collage is transformative, and it uses copyrighted pictures to make completely new works of art. It's the same principle.

Not a single line in your comment offers anything showing that machine generation, which is not at all human creative work, falls under fair use.
-
AI companies are the biggest data aggregators though, and they indiscriminately scrape literally everything. I am personally completely against copyright and patent law specifically. But sometimes, like in this case, they can be necessary tools. There are probably better ways to protect against AI, but none that are recognized in our current framework of how society functions. AI companies are literally stealing everything ever posted online, because they couldn't exist without all the data, and then selling it back to people in the form of tools while destroying the environment in the process with increasingly gigantic and power-hungry data centers. While also destroying the tech consumer market in the process by buying up components, or straight up component producers, and taking them off the consumer market.

By data aggregators, I strictly mean websites like Reddit, Shutterstock, DeviantArt, etc. Giving them the keys would bring up the cost of building a state-of-the-art model so much that any open sourcing would be literally impossible. These models already cost in the low millions to develop. Take video generation, for instance: almost all the data is owned by YouTube and Hollywood. Google wanted to charge $300 a month to use it, but instead we have free models that can run on high-end consumer hardware.

Scraping has been accepted for a long time and making it illegal would be disastrous. It would make the entry price for any kind of computer vision software or search engine incredibly high, not just gen AI. I'd love to have laws that forced everything made with public data to be open source, but that is *not* what copyright companies, AI companies and the media are pushing for. They don't want to help artists, they want to help themselves. They want to be able to dictate the price of entry, which suits them and the big AI companies as well. I'm all for laws to regulate data centers and manufacturing, but again, that's not what is being pushed for. Most anti-AI peeps seem to be helping the enemy a lot more than they realize.
-
Not a single line in your comment offers anything showing that machine generation, which is not at all human creative work, falls under fair use.
-
It uses the content in a different way for a different purpose. The part I highlighted above applies to it? Do you expect copyright laws to mention every single type of transformative work acceptable? You are being purposely ignorant.

> Do you expect copyright laws to mention every single type of transformative work acceptable? You are being purposely ignorant.

I asked nicely for a quote showing that machine generation is also covered, which you couldn't provide, and now you feel the need to lash out. And yes, I absolutely expect machine generation to be mentioned explicitly, for the simple fact that right now anything machine generated is not copyrightable at all. A computer isn't smart, a computer isn't creative. Its output doesn't pass the [threshold of originality](https://en.wikipedia.org/wiki/Threshold_of_originality), and as such there is no creative transformation happening, as there is with reinterpretations of songs. What is copyrightable are the works that served as the training set, therefore there absolutely has to be an explicit mention somewhere that machine-generated works do not simply pass the original copyright into the generated work, just like how a human writes source code and the compiled executable is still the human author's work.
-
> Nature is healing.

Nah, they're lying. They'll just cover their tracks better in the future.
-
Do human artists usually get consent before training on content freely available on the Internet? There are plenty of reasons to hate on AI, but this reason is just being pissed that a silicon brain did it instead of a carbon one.

Humans aren't machines, dummy
-
Right, because computers don't use silicon. But Gen AI is modeled after the way the brain works, so maybe **you** need to learn how it works before arguing against a comparison.

> But Gen AI is modeled after the way the brain works,

This planet is doomed
-
> Do you expect copyright laws to mention every single type of transformative work acceptable? You are being purposely ignorant.

I asked nicely for a quote showing that machine generation is also covered, which you couldn't provide, and now you feel the need to lash out. And yes, I absolutely expect machine generation to be mentioned explicitly, for the simple fact that right now anything machine generated is not copyrightable at all. A computer isn't smart, a computer isn't creative. Its output doesn't pass the [threshold of originality](https://en.wikipedia.org/wiki/Threshold_of_originality), and as such there is no creative transformation happening, as there is with reinterpretations of songs. What is copyrightable are the works that served as the training set, therefore there absolutely has to be an explicit mention somewhere that machine-generated works do not simply pass the original copyright into the generated work, just like how a human writes source code and the compiled executable is still the human author's work.

> In the Office’s view, training a generative AI foundation model on a large and diverse dataset will often be transformative. The process converts a massive collection of training examples into a statistical model that can generate a wide range of outputs across a diverse array of new situations. It is hard to compare individual works in the training data—for example, copies of The Big Sleep in various languages—with a resulting language model capable of translating emails, correcting grammar, or answering natural language questions about 20th-century literature, without perceiving a transformation.

https://www.copyright.gov/ai/Copyright-and-Artificial-Intelligence-Part-3-Generative-AI-Training-Report-Pre-Publication-Version.pdf

You can read the whole doc. The part above is cherry-picked. I haven't read through the whole thing, but at a glance the doc basically explains how it depends. If the model is trained specifically to output one piece of content, it wouldn't be acceptable. The waters are muddy, but holy fuck does taking the copyright juggernauts' side sound bloody stupid.
-
Alrighty, so generative AI works by giving it training data, which it transforms, and then it generates something based on a prompt and how that prompt relates to the training data it has. That's not functionally different from how commissioned human artists work. They train on publicly available works, their brain transforms and stores that data and uses it to generate a work based on a prompt. They even often directly use a reference work to generate their own without permission from the original artist. Like I said, there are tons of valid criticisms against Gen AI, but this criticism just boils down to "AI bad because it's not a human exploiting others' work."

GenAI is a glorified Markov Chain. Nothing more. It is a stochastic parrot. It does not think, it is not capable of creating novel new works, and it is incapable of the emotion necessary to be expressive. All it can do is ingest content and replicate it. This is not the same as a human seeing someone’s work and being inspired by it to create something uniquely their own in response.
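For anyone who hasn't met the term: a Markov chain text generator literally just records which words follow which and resamples them. Below is a minimal, self-contained Python sketch of that idea (the toy corpus, the order-1 chain, and the helper names `build_chain`/`generate` are purely illustrative assumptions, not anything from the thread); whether transformer-based models are fairly described this way is exactly what is being argued here.

```python
# Toy word-level Markov chain generator, to make the "glorified Markov Chain"
# comparison concrete. Order-1 chain over a made-up corpus; real generative
# models are vastly larger and condition on much longer context.
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=10):
    """Walk the chain, sampling each next word from what followed the last one."""
    word, output = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: the word was never followed by anything
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus)
print(generate(chain, "the"))  # e.g. "the cat sat on the rug" -- recombined from the corpus
```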
-
The anti-AI crowd is getting crazy. Everyone uses it during development. It's a tool, for fuck's sake. What's next? Banning designers from using Photoshop because using it is faster and thus taking jobs from multiple artists who would have to be employed otherwise?
-
GenAI is a glorified Markov Chain. Nothing more. It is a stochastic parrot. It does not think, it is not capable of creating novel new works, and it is incapable of the emotion necessary to be expressive. All it can do is ingest content and replicate it. This is not the same as a human seeing someone’s work and being inspired by it to create something uniquely their own in response.

I never claimed that Gen AI has consciousness, or that what they produce has emotions behind it, so I'm not sure why you're focusing on that. I'm specifically talking about the argument that AI is bad because it trains on copyrighted material without consent from the artist, which is functionally no different than humans doing the exact same thing. This isn't me defending AI, this is me saying this one specific argument against it is stupid.
-
I never claimed that Gen AI has consciousness, or that what they produce has emotions behind it, so I'm not sure why you're focusing on that. I'm specifically talking about the argument that AI is bad because it trains on copyrighted material without consent from the artist, which is functionally no different than humans doing the exact same thing. This isn't me defending AI, this is me saying this one specific argument against it is stupid.
-
My entire post was a rebuttal of the “functionally no different than humans doing the same thing”. Humans take inspiration and use it to express themselves uniquely, genAI just steals and replicates. They are in no way “doing the exact same thing”.

So your entire argument is semantics. Gen AI does more than just replicating existing works. You're not going to get the same result with the same prompt; each result will be unique. And I'd argue that the person writing the prompt is the one providing the inspiration to get the software to express what's in their head.
-
You are 1000% correct. I've been yelled at/criticized a few times by people who clearly can't differentiate between the types of "ai".
-
You can always tell the people with no artistic talent because they don't understand how AI is different than digital art software like Photoshop. And they seem to think that artists should just accept having their life's work stolen and vomited up as slop. Fuck anyone who thinks like this. They think they are entitled to my creativity without doing any of the work. "Everyone is doing it." The absolute degeneration of morality in this era is mind-boggling. Have no morals, seek only profit. The fact that so many people cannot take a stand for integrity because of perceived pragmatism is sickening. I hope anyone that thinks like this gets the AI-slop-filled hell they deserve. And I hope their careers are the next to be axed and replaced by the plagiarism machines.

The worst are those people that think they are artists because they typed in a prompt. It's delusional!
-
Just to point out, LLMs are genAI. Lots of code editors provide code suggestions similar to autocorrect/text suggestions using AI. Strictly speaking, I doubt any game is made without AI. Not to say it can't be deliberately avoided, but given the lack of opposition to GPT and LLMs, I don't see it being considered for avoidance in the same way as art. So awards with constraints on "any AI usage in development" probably disqualify most modern games.
-
“Everyone uses it” is just such a dumb argument. I don’t use it, I’ve never committed any code written by genAI. My colleagues don’t use it. Many, many people choose not to use it.

I didn't mean it in the literal sense, but if it makes you happy, we can pretend that whenever someone says "everyone" they mean it literally.
-
An unethically developed tool that's burning the planet faster with the ultimate goal of starving the working class out of society. Inb4 alarmism lol tell me the fucking lie if you can.

Dude, go touch grass, please. This is embarrassing.
-
Not everyone, and it probably multiplies review time 10-fold. Makes maintenance horrible. It doesn't save time, just moves it, and makes devs dumber and unable to justify the coding choices the AI generates.

I mean, it's a tool. You can use a hammer to smash someone's skull in or you can use it to put a nail in a wall. If you see it used like that, it's shitty developers; the AI is not to blame. Don't get me wrong, I do have coworkers who use it like this and it sucks. One literally told me to next time just tell Copilot directly what to fix when I'm doing a review. But overall it helps if you know how, and most importantly when, to use it.