Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
I hate everything Musk is associated with, and I have never and will never spend a dime in the Epic Games Store. That said, I'm willing to stretch pretty far to say this quote might be more about his favoured type of reprehensible speech than CSAM:

> All major AIs have documented instances of going off the rails; all major AI companies make their best efforts to combat this; none are perfect. Politicians demanding gatekeepers selectively crush the one that's their political opponent's company is basic crony capitalism.

He wouldn't have a leg to stand on if Musk had ignored or laughed off the whole thing instead of stupidly threatening the users posting the images, but I guess that's the least we can expect these days instead of actual accountability.
-
Imagine where Epic would be if they had just censored Tim Sweeney's Twitter account. It's like he's *hell-bent* on driving people away.
-
> If you can be effectively censored by the banning of a site flooded with CSAM, that's very much your problem and nobody else's.

Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
-
inb4 "In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech"There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
-
I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes from any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper-illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term built from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that to talk about shit that does not qualify.