Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
> CSAM is material.... from the sexual abuse... of a child. Fiction does not count.

You’re the only one using that definition.
-
CSAM is material.... from the sexual abuse... of a child. Fiction does not count.
-
Your likeness depicted on the moon does not mean you went to the moon.
-
What, taking *child abuse* seriously?
-
> This post did not contain any content.

HAHAHAHA Just stop the corn bro... Is everything the lowest common denominator... beatin meat.... I don't eat when I sh!+... Like time and place.... Tim you a straight arse ladder pulling enshittificating boomer reprobate. This pud skims the surface like the scum that he is. One day the master race will get all the gold coins and build a circus that no one can afford. Fingers crossed!!! Nukes_2026
-
> You’re the only one using that definition.

My definition is from *what words mean.* We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won't use the same label to refer to drawings?
-
> Your likeness, modified to be naked and being fucked, printed out and stapled to a tree in your neighborhood is OK then?

A threat of murder is a crime without being the same thing as murder. Meditate on this.
-
> Isn't it abuse if I take a picture of a girl, let Grok remove the clothes, and post this online?

It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
-
Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
-
There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
-
> AI CSAM was generated from real CSAM. AI being able to accurately undress kids is a real issue in multiple ways.

AI can draw Shrek on the moon. Do you think it needed real images of that?
-
I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
-
A threat of murder is a crime without being the same thing as murder. Meditate on this.
-
And what if neither happened?
-
> I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

Sure, I think it's weird to really care about loli or furry or any other *niche*, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can't have effective safeguards against that harm, it makes sense to restrict it legally.
-
> It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*

When all her friends and family see that image, she is definitely involved. And it's definitely abuse.