Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
> And you think it's short on images of fully naked women?

I'm saying it can't combine clothed children and naked adults to make naked children. It doesn't know what "naked" means. It can't imagine what something might look like. It can only make naked children if it has been trained on them directly.
-
> It doesn't matter if there's a victim or not. It's the depiction of CSA that is illegal. So no, talking about whether or not there's a victim is not the most important part. It doesn't matter if you draw it by hand with crayons. If it's depicting CSA, it's illegal.

Nobody was talking about the "legality". We are talking about morals. And morally there is a major difference.
-
> This post did not contain any content.

The only “charitable” take I can give this is that he’s been fighting Apple and Google over store fees and the like, and that he feels like if he says Apple/Google can do this, then they should be able to restrict EGS as well. I don’t know why CSAM AI material is the hill you’d make this point on, though.
-
> I'm saying it can't combine clothed children and naked adults to make naked children. It doesn't know what "naked" means. It can't imagine what something might look like. It can only make naked children if it has been trained on them directly.

Incorrect.
-
> Dude, you're just wrong. There seems to be a huge disconnect with you between what the law is and what you want the law to be. You are not allowed to take an image of someone, photoshop them naked, and distribute it. Period. You are also not allowed to depict child sexual abuse. It doesn't matter if it's not real. It's the depiction of CSA taking place that is illegal.

'This too is a crime, but it's a different crime.' *'Why are you defending this?'* Wrong.
-
Threats are a crime, but they're a different crime than the act itself. Everyone piling on understands that it's kinda fuckin' important to distinguish this crime, specifically, because it's the worst thing imaginable. They just also want to use the same word for shit that did not happen. Both things can be super fucking illegal - but they will never be the same thing.
-
> Yes and they've been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI. Anna's Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they're using it to teach AI to copy music. (Note: Spotify uses lower quality than other music currently available, so AA will offer nothing new if/when they ever do release these torrents.) So, yes, that is exactly what they're doing. They are training their models on all the data, not just all the legal data.

It's big fucking news when those datasets contain, like, three JPEGs. Because even one such JPEG is an event where the FBI shows up and blasts the entire hard drive into shrapnel. Y'all insisting there's gotta be some clearly-labeled archive with a shitload of the most illegal images imaginable, in order for the robot that combines concepts to combine the concept of "child" and the concept of "naked," are not taking yourselves seriously. You're just shuffling cards to bolster a kneejerk feeling.
-
> That doesn't stop everyone from directly calling Elon Musk a pedophile for creating a CP-generating machine.

And nor should it.
-
> No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

True enough - but fortunately, there's approximately zero such images readily available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.
-
> ‘I take child abuse seriously but also think it’s fine to generate nude pictures of real life children.’ Idk man. It’s a weird fuckin thing to admit to.

Show me where anyone said that. Circle it in red.
-
> Do you think that generated depictions of the sexual abuse of children are ok in any context?

How often do I have to say *this is still a crime* before y'all stop having a different argument inside your heads?
-
> This post did not contain any content.

This is almost as sus as the specific preferred-age-range terminology for pedophiles that comes up now and again in the most uncomfortable of scenarios.
-
> Please send me pictures of your mom so that I may draw her naked and post it on the internet.

Do you understand that's a different thing than telling me you've fucked her?
-
> I already did the “what words mean” thing earlier.
>
> - involves a child
> - is sexual
> - is abusive (i.e., not art)
> - is material
>
> That’s literally every word of CSAM, and it fits.
>
> > We need a term to specifically refer to actual photographs of actual child abuse?
>
> Why? You’ve made a whole lot of claims it should be your way but you’ve provided no sources nor any justification as to why we need to delineate between real and AI.

Are you honestly asking me why child molestation is worse than rendering an image? This term was already developed to distinguish evidence of criminal events. I should fucking hope everyone here understands why preventing or punishing such events is a leading goal, but apparently that's asking too much, if y'all really do not believe there's a difference between pasting someone's head onto a magazine centerfold... versus sexually assaulting them.

I am fucking bewildered by this lack of consensus on the topic of *child rape.* Really thought it was a gimme, for everyone to go, yeah, this thing over here is bad, but obviously it's not as bad as *child rape.* Didn't expect to fire up the computer and have Lemmings sincerely ask me, why are crimes that happened worse than crimes that didn't?
-
> This post did not contain any content.

Zionazi oligarchist supremacism controlling media/speech, promoting hate and genocide, is reason to zero out his finances and media control. That the bipartisan establishment loves all of this means this performative whining over image generation tools, which can be used to fake offense, is the permitted pathetic discourse the establishment masquerades as democracy.
-
> This post did not contain any content.

I'm no fan of banning this or that particular platform (it's like trying to get rid of cheeseburgers by banning McDonald's; the burgers are still available from all the other burger chains, and everyone who used the one will just switch to the others), but this is a hilariously wrong way to get to the right answer.
-
> I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated. Not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

> I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime

I'm sorry to break it to you, but this has been illegal for a long time, and it doesn't need to have anything to do with CSAM. For instance, drawing certain copyrighted material in certain contexts can be illegal.

To go even further, numbers and maths can be illegal in the right circumstances. For instance, it may be illegal where you live to break the encryption of a certain file, depending on the file and encryption in question (e.g. DRM on copyrighted material). "Breaking the encryption of a file" essentially translates to "doing maths on a number" when you boil it down. That's how you can end up with the concept of [illegal numbers](https://en.wikipedia.org/wiki/Illegal_number).
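To make the "decryption is just maths on a number" point concrete, here's a minimal sketch - a toy XOR cipher in plain Python, not any real DRM scheme, with made-up values purely for illustration:

```python
# Toy illustration: a file's bytes are just one big integer, and
# "decrypting" it is pure arithmetic on that integer.
# This is a trivial XOR cipher, NOT a real DRM system.

def decrypt(ciphertext_int: int, key_int: int) -> int:
    # XOR is the arithmetic; which numbers you're allowed to apply it
    # to is the whole "illegal number" debate.
    return ciphertext_int ^ key_int

plaintext = int.from_bytes(b"hello", "big")  # the "file", as a number
key = 0x5F3759DF                             # the "key", as a number
ciphertext = plaintext ^ key                 # "encrypt"

recovered = decrypt(ciphertext, key)
assert recovered.to_bytes(5, "big") == b"hello"
```

Nothing here is anything but integer arithmetic, which is exactly why "this number is illegal to compute" sits so uneasily with people.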
-
It helps that Tim Sweeney seems to always be wrong about everything.
-
> Dude, you're just wrong. There seems to be a huge disconnect with you between what the law is and what you want the law to be. You are not allowed to take an image of someone, photoshop them naked, and distribute it. Period. You are also not allowed to depict child sexual abuse. It doesn't matter if it's not real. It's the depiction of CSA taking place that is illegal.

Depictions could somehow be twice as illegal as the real event, and they still wouldn't be the same thing. It literally did not take place.