Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
-
There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
-
> AI CSAM was generated from real CSAM. AI being able to accurately undress kids is a real issue in multiple ways.

AI can draw Shrek on the moon. Do you think it needed real images of that?
-
I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
-
A threat of murder is a crime without being the same thing as murder. Meditate on this.
-
And what if neither happened?
-
> I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

Sure, I think it's weird to really care about loli or furry or any other *niche*, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can't have effective safeguards against that harm, it makes sense to restrict it legally.
-
> It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*

When all her friends and family see that image, she is definitely involved. And it's definitely abuse.
-
AI can draw Shrek on the moon. Do you think it needed real images of that?
-
> It used real images of Shrek and the moon to do that. It didn't "invent" or "imagine" either. The child porn it's generating is based on literal child porn, if not itself just actual child porn.

You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
-
Does a depiction of her corpse mean she's dead?
-
> You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

It literally can't combine unrelated concepts though. Not too long ago there was the issue where one (DALL-E?) couldn't make a picture of a full glass of wine, because every glass of wine it had been trained on was half full, because that's generally how we prefer to photograph wine. It has no concept of "full" the way actual intelligences do, so it couldn't connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.
-
This post did not contain any content.
-
> It literally can't combine unrelated concepts though. Not too long ago there was the issue where one (DALL-E?) couldn't make a picture of a full glass of wine, because every glass of wine it had been trained on was half full, because that's generally how we prefer to photograph wine. It has no concept of "full" the way actual intelligences do, so it couldn't connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

And you think it's short on images of fully naked women?
-
> This post did not contain any content.

Guy atomically made of shit takes has another shit take, colour me surprised.
-
> Does a depiction of her corpse mean she's dead?

False equivalence.
-
> False equivalence.

The central goddamn point. *Depicting* things happening does not mean they *actually happened.* The entire point of the term CSAM is to describe crimes which literally occurred. It is material... from the sexual abuse... of children. Do y'all not understand why it's kinda fuckin' important to have a term for that specific concept?
-
> The central goddamn point. *Depicting* things happening does not mean they *actually happened.* The entire point of the term CSAM is to describe crimes which literally occurred. It is material... from the sexual abuse... of children. Do y'all not understand why it's kinda fuckin' important to have a term for that specific concept?

Do you think that generated depictions of the sexual abuse of children are ok in any context?
-
You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?