Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
I hate that the newest Unreal Tournament just kinda... Disappeared. I mean, it's still playable I think, just not online and aside from a week or so after it launched, I ain't ever heard anyone talking about it. It was *okay...* Balance was not quite there, and it only had 2 maps when I last played it. But it had potential.
-
> This post did not contain any content.

Yeah. I'm as tired of the argument that pretty much anything goes as far as free speech goes as I am of the "everything is a slippery slope when we make laws to keep people from doing harmful shit" one. I mean, what's the required damage before people put a stop to inciteful speech and objectively harmful lies? Germany had to kill a few million people before they decided that maybe displaying Nazi symbols and speech wasn't a good idea. So we have a platform being used to make CSAM. There should be immediate action to end the means to do so, but these tools make up all kinds of reasons why we can't do that…economic, censorship, whatever.
-
Which they then talk about and point out that victims are absolutely present in this case... If this is still too hard to understand, I will simplify the sentence. They are saying: "The important thing to talk about is whether there is a victim or not."
-
Tim EpicFail.
-
> Absolutely insane take. The reason Grok can generate CP is because it was trained on it. Musk should be arrested just for owning that shit.

We all live in a two-tier justice system. One tier is for the capital class. Generally, as long as they don't commit crimes against the government or against others in the capital class, these offenders get the slap-on-the-wrist justice system. The Government had enough evidence, between witnesses and documentary evidence from the Epstein files, to at least open investigations and charge some of the people. The only people to be arrested and charged were Epstein and Maxwell. It took a long time before either of them faced any serious consequences for their actions. Everyone else gets the go-fuck-yourself justice system.
-
This post did not contain any content.
-
> Making porn of actual people without their consent regardless of age is not a thought crime. For children, that's obviously fucked up. For adults it's directly impacting their reputation. It's not a victimless crime. But generating images of adults that don't exist? Or even clearly drawn images that aren't even realistic? I've seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky. Like let's take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone's fucked up fantasy. Yet lots of people want to make that into a thought crime. I've always thought that if there isn't speech out there that makes you feel icky or gross then you don't really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.

Drawings are one conversation I won't get into. GenAI is vastly different though. Those models are known to sometimes regurgitate people or things from their dataset, (mostly) unaltered. Like how you can get Copilot to spit out valid secrets that people accidentally committed by typing `NPM_KEY=`. You can't have any guarantee that if you ask it to generate a picture of a person, that person does not actually exist.
-
> Which they then talk about and point out that victims are absolutely present in this case... If this is still too hard to understand, I will simplify the sentence. They are saying: "The important thing to talk about is whether there is a victim or not."

It doesn't matter if there's a victim or not. It's the depiction of CSA that is illegal. So no, talking about whether or not there's a victim is not the most important part. It doesn't matter if you draw it by hand with crayons. If it's depicting CSA, it's illegal.
-
> And you think it's short on images of fully naked women?

I'm saying it can't combine clothed children and naked adults to make naked children. It doesn't know what "naked" means. It can't imagine what something might look like. It can only make naked children if it has been trained on them directly.
-
> It doesn't matter if there's a victim or not. It's the depiction of CSA that is illegal. So no, talking about whether or not there's a victim is not the most important part. It doesn't matter if you draw it by hand with crayons. If it's depicting CSA, it's illegal.

Nobody was talking about the "legality". We are talking about morals. And morally there is a major difference.
-
> This post did not contain any content.

The only "charitable" take I can give this is that he's been fighting Apple and Google over store fees and the like, and that he feels like if he says that Apple/Google can do this, then they should be able to restrict EGS as well. I don't know why CSAM AI material is the hill you'd make this point with, though.
-
> I'm saying it can't combine clothed children and naked adults to make naked children. It doesn't know what "naked" means. It can't imagine what something might look like. It can only make naked children if it has been trained on them directly.

Incorrect.
-
> Dude, you're just wrong. There seems to be a huge disconnect with you between what the law is and what you want the law to be. You are not allowed to take an image of someone, photoshop them naked, and distribute it. Period. You are also not allowed to depict child sexual abuse. It doesn't matter if it's not real. It's the depiction of CSA taking place that is illegal.

'This too is a crime, but it's a different crime.' *'Why are you defending this?'* Wrong.
-
Threats are a crime, but they're a different crime than the act itself. Everyone piling on understands that it's kinda fuckin' important to distinguish this crime, specifically, because it's the worst thing imaginable. They just also want to use the same word for shit that did not happen. Both things can be super fucking illegal - but they will never be the same thing.
-
> Yes, and they've been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI. Anna's Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they're using it to teach AI to copy music. (Note: Spotify streams at lower quality than other sources currently available, so AA will offer nothing new if/when they ever do release these torrents.) So, yes, that is exactly what they're doing. They are training their models on all the data, not just all the legal data.

It's big fucking news when those datasets contain, like, three JPEGs. Because even one such JPEG is an event where the FBI shows up and blasts the entire hard drive into shrapnel. Y'all insisting there's gotta be some clearly-labeled archive with a shitload of the most illegal images imaginable, in order for the robot that combines concepts to combine the concept of "child" and the concept of "naked," are not taking yourselves seriously. You're just shuffling cards to bolster a kneejerk feeling.
-
> That doesn't stop everyone from directly calling elon musk a pedophile for creating a CP generating machine.

And nor should it.
-
> No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

True enough - but fortunately, there's approximately zero such images readily available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.
-
> "I take child abuse seriously but also think it's fine to generate nude pictures of real life children." Idk man. It's a weird fuckin thing to admit to.

Show me where anyone said that. Circle it in red.