Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
> RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

Dude, you’re the only one who uses that strict definition. Go nuts with your crusade of prescriptivism, but I’m pretty sure it’s a lost cause.
-
> CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.

CSAM is material.... from the sexual abuse... of a child. Fiction does not count.
-
There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean 'shit what looks like it could be from the abuse of some child I guess.' It means, state's evidence of actual crimes.
-
> It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what's happened. These kids did not consent to have their likeness sexualised.

Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
-
> CSAM is material.... from the sexual abuse... of a child. Fiction does not count.

You’re the only one using that definition.
-
Your likeness depicted on the moon does not mean you went to the moon.
-
What, taking *child abuse* seriously?
-
> This post did not contain any content.

HAHAHAHA Just stop the corn bro... IS everything the lowest common denominator... beatin meat.... I don't eat when I sh!+... Like time and place.... Tim you a straight arse ladder pulling enshittificating boomer reprobate. This pud skims the surface like the scum that he is. One day the master race will get all the gold coins and build a circus that no one can afford. Finger crossed!!! Nukes_2026
-
> You’re the only one using that definition.

My definition is from *what words mean.* We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won't use the same label to refer to drawings?
-
> Your likeness, modified to show you naked being fucked, printed out and stapled to a tree in your neighborhood, is ok then?

A threat of murder is a crime without being the same thing as murder. Meditate on this.
-
> Isn't it abuse if I take a picture of a girl, let grok remove the clothes and post this online?

It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
-
Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
-
There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
-
> AI CSAM was generated from real CSAM. AI being able to accurately undress kids is a real issue in multiple ways.

AI can draw Shrek on the moon. Do you think it needed real images of that?