Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
Sweeney sounds like a pedo.
-
Man, that title gives me a stroke trying to decipher it... it almost reads like Tim Sweeney wants Twitter banned, but clearly that's not the case.
-
Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
-
I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.
-
It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
-
So he's saying CSAM is free/protected speech? Got it. Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he's done. And for what?

Epic is in legal battles to get onto other companies' platforms (Google's Android and Apple's iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I'm not saying he's 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case, because he's saying he could put CSAM on his own platform (i.e. Fortnite, a game aimed at children and teenagers) and he'd be against you censoring his "free" and "protected" speech. I just see no good outcomes for what Sweeney is fighting for here.

To address the political opponents angle: no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).
-
What, taking *child abuse* seriously?
-
> You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

Yes, and they've been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI. Anna's Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they're using it to teach AI to copy music. (Note: Spotify uses lower quality than other music currently available, so AA will offer nothing new if/when they ever do release these torrents.)

So, yes, that is exactly what they're doing. They are training their models on all the data, not just all the legal data.
-
> My definition is from *what words mean.* We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won't use the same label to refer to drawings?

I already did the "what words mean" thing earlier:

- involves a child
- is sexual
- is abusive (i.e. not art)
- is material

That's literally every word of CSAM, and it fits.

> We need a term to specifically refer to actual photographs of actual child abuse?

Why? You've made a whole lot of claims that it should be your way, but you've provided no sources nor any justification as to why we need to delineate between real and AI.
-
IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

Non-consensual porn victimises the person being depicted, because it violates that person's rights over their own body, including its image. Plus it's ripe material for harassment. This is still true if the porn in question is machine-generated and the sexual acts being depicted did not happen, like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

And it applies to children *and adults*. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus victimising the children in question.

Now, someone else mentioned Bart's dick appears in the Simpsons movie. The key difference is that Bart is not a child; ***it*** is not even a person to begin with, ***it*** is a fictional character. There's no victim.
-
Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.