Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
-
Did Covid-19 make everyone lose their minds? This isn't even about being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026... for deepfake porn, including of minors!!???? From the Fortnite company guy???
-
> Did Covid-19 make everyone lose their minds?

Every day further convinces me we all died of COVID, and this is The Bad Place.
-
This post did not contain any content.
-
IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

Non-consensual porn victimises the person being depicted, because it violates the person's rights over their own body — including its image. Plus it's ripe material for harassment. This is still true if the porn in question is machine-generated and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

And it applies to children *and adults*. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus victimising the children in question.

Now, someone else mentioned Bart's dick appears in the Simpsons movie. The key difference is that Bart is not a child, ***it*** is not even a person to begin with, ***it*** is a fictional character. There's no victim.
-
What exactly have I lied about? I've never once tried to even insinuate that what Grok is doing is OK. Nor that it should be. What I've said is that it doesn't even matter if there is an actual real person being victimized or not. It's still illegal. No matter how you look at it. It's illegal. Fictional or not.

Your example of Bart in the Simpsons movie is so far out of place I hardly know where to begin. It's NOT because he's fictional. Because fictional depictions of naked children in sexually compromised situations ARE illegal.

Though I am glad you don't have a dog. It would be real awkward for the dog to always be the smartest being in the house.
-
Fuck! I misread you. Yes, you're right, *Tim Sweeney* is actually supporting CSAM. Sorry for the misunderstanding, undeserved crankiness, and defensiveness; I thought you were claiming I was the one doing it.
-
If you seriously think that there is no moral difference between someone being raped and them not being raped, then maybe you should be in prison for all our safety.
-
> Yes, it certainly comes across as you arguing for the opposite

No, it does not. Stop being a liar.
-
That's not what I said. How are you this stupid? I said I think they are both equally morally reprehensible. They both belong in the very bottom of Dante's Inferno.
-
I saw someone suggesting mass downloading and uninstalling and downloading again all of the free games they've claimed over the years.
-
That's not even what gatekeeping means. Unless he's trying to stand up for the universal right to participate in the child porn fandom.
-
> I don't think there's a moral difference between depicting "victimless" CSAM and CSAM containing a real person. I think they're both, morally, equally awful.

You called them "equally awful", so yes, that is what you said.
-
That's a lot of interaction for a boycott, and I'm sure they would just ban your IP at some point. Of course there's always ways around that, but how much effort do you want to put into this boycott? The biggest impact you could have on them would be for everyone to go over to Steam OS, which I don't believe they support. It would be hilarious if they were forced to add support in order to stay relevant. I don't think anything else would have much of an effect, because like I said, their target demographic is kids, who don't really pay attention to this stuff.
-
I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
-
Yes. They are both the worst of the worst. I place both in the very bottom of Dante's Inferno. Or do you still struggle to understand what that means?
-
Making porn of actual people without their consent, regardless of age, is not a thought crime. For children, that's obviously fucked up. For adults it's directly impacting their reputation. It's not a victimless crime.

But generating images of adults that don't exist? Or even clearly drawn images that aren't even realistic? I've seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky. Like let's take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone's fucked up fantasy. Yet lots of people want to make that into a thought crime.

I've always thought that if there isn't speech out there that makes you feel icky or gross then you don't really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.
-
I do understand, I just think that you are very weird for thinking of them as equal.
-
> Making porn of actual people without their consent regardless of age is not a thought crime. For children, that's obviously fucked up. For adults it's directly impacting their reputation. It's not a victimless crime.

That is also drawing a certain arrangement of lines and colours.
-
I bet he's kind of right; here in the UK we just lost a whole bunch of rights and privacies online under the guise of "protect the kids", but it's kind of weird to be piping up against it when there are actually protections needed.
-
This isn't really a change, though, I'm pretty sure. People have been able to make photo-realistic depictions a lot longer than AI has existed, and those have rightfully been held to be illegal in most places, because the confusion they cause makes it harder to stop the real thing.