Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
This post did not contain any content.
-
Man, that title gives me a stroke trying to decipher it... it almost reads like Tim Sweeney wants Twitter banned, but clearly that's not the case.
-
Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
-
I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.
-
It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
-
So he's saying CSAM is free/protected speech? Got it. Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he's done. And for what?

Epic is in legal battles to get onto other companies' platforms (Google's Android and Apple's iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I'm not saying he's 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case, because he's saying he could put CSAM on his own platform (i.e. Fortnite, a game aimed at children and teenagers) and he'd be against you censoring his "free" and "protected" speech. I just see no good outcomes for what Sweeney is fighting for here.

To address the political opponents angle: no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).
-
What, taking *child abuse* seriously?
-
> You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

Yes, and they've been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI. Anna's Archive, a site associated with AI training, recently scraped some 99.9% of Spotify's songs. They say that at some point they will make torrents so the common people can download it, but for now they're using it to teach AI to copy music. (Note: Spotify streams lower-quality audio than what's already available elsewhere, so AA will offer nothing new if/when they ever do release those torrents.) So, yes, that is exactly what they're doing. They are training their models on all the data, not just all the legal data.
-
> My definition is from *what words mean.* We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won't use the same label to refer to drawings?

I already did the "what words mean" thing earlier.

- involves a child
- is sexual
- is abusive (i.e., not art)
- is material

That's literally every word of CSAM, and it fits.

> We need a term to specifically refer to actual photographs of actual child abuse

Why? You've made a whole lot of claims that it should be your way, but you've provided no sources nor any justification as to why we need to delineate between real and AI.
-
IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

Non-consensual porn victimises the person being depicted, because it violates the person's rights over their own body, including its image. Plus it's ripe material for harassment. This is still true if the porn in question is machine-generated and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

And it applies to children *and adults*. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus victimising the children in question.

Now, someone else mentioned Bart's dick appears in the Simpsons movie. The key difference is that Bart is not a child, ***it*** is not even a person to begin with, ***it*** is a fictional character. There's no victim.
-
Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
-
That is a lot of text for someone that couldn't even be bothered to read the first paragraph of the article.

> Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

There ARE victims, lots of them.
-
> And what if neither happened?

Dude, you're just wrong. There seems to be a huge disconnect with you between what the law is and what you want the law to be. You are not allowed to take an image of someone, photoshop them naked, and distribute it. Period. You are also not allowed to depict child sexual abuse. It doesn't matter if it's not real. It's the depiction of CSA taking place that is illegal.
-
> Sure, I think it's weird to really care about loli or furry or any other *niche*, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can't have effective safeguards against that harm, it makes sense to restrict it legally.

Making porn of actual people without their consent, regardless of age, is not a thought crime. For children, that's obviously fucked up. For adults, it directly impacts their reputation. It's not a victimless crime.

But generating images of adults that don't exist? Or even clearly drawn images that aren't even realistic? I've seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky. Let's take bestiality, for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone's fucked-up fantasy. Yet lots of people want to make that into a thought crime.

I've always thought that if there isn't speech out there that makes you feel icky or gross, then you don't really have free speech at all. Keeping free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.
-
> That is a lot of text for someone that couldn't even be bothered to read the first paragraph of the article.

That is a lot of text for someone that couldn't even be bothered to read a comment properly.