Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'

Uncategorized
games
218 Posts 110 Posters 130 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • retrogoblet79@eviltoast.org
    This post did not contain any content.
    reksas@sopuli.xyz
    #82
    sweeney sounds like a pedo
    • retrogoblet79@eviltoast.org
      This post did not contain any content.
      Guest
      #83
      Isn't this the "won't somebody please think of the children" party?
      • retrogoblet79@eviltoast.org
        This post did not contain any content.
        Guest
        #84
        I wonder if his name showed up in the Epstein Files
        • Guest
          Isn't this the "won't somebody please think of the children" party?
          Guest
          #85
          Oh, they're thinking of 'em all right!
          • Guest
            Man that title gives me a stroke trying to decipher it.. it almost reads like Tim Sweeney wants Twitter banned but clearly that’s not the case..
            Guest
            #86
            It's a pretty clear title... he loves child porn.
            • Guest
              Literally this meme again ![](https://media.piefed.ca/posts/gF/dP/gFdP084Q14ddxyF.jpeg)
              Guest
              #87
              It's called being so effective at marketing, and spending so much money on it, that people believe you don't do anything wrong.
              • mindbleach@sh.itjust.works
                Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
                Guest
                #88
                Please send me pictures of your mom so that I may draw her naked and post it on the internet.
                • mindbleach@sh.itjust.works
                  I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.
                  Guest
                  #89
                  That is not what CSAM means.
                  • mindbleach@sh.itjust.works
                    It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
                    Guest
                    #90
                    Abuse doesn't have to be physical, you stupid fucking piece of shit. See? My words didn't even touch you.
                    • retrogoblet79@eviltoast.org
                      This post did not contain any content.
                      Guest
                      #91
                      So he's saying CSAM is free/protected speech? Got it. Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he's done. And for what?
                      Epic is in legal battles to get onto other companies' platforms (Google's Android and Apple's iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I'm not saying he's 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case, because he's saying he could put CSAM on his own platform (i.e. Fortnite, a game aimed at children and teenagers) and he'd be against you censoring his "free" and "protected" speech. I just see no good outcomes for what Sweeney is fighting for here.
                      To address the political opponents angle, no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).
                      • mindbleach@sh.itjust.works
                        What, taking *child abuse* seriously?
                        Guest
                        #92
                        ‘I take child abuse seriously but also think it’s fine to generate nude pictures of real life children.’ Idk man. It’s a weird fuckin thing to admit to.
                        • mindbleach@sh.itjust.works
                          You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
                          Guest
                          #93
                          Yes and they've been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI. Anna's Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they're using it to teach AI to copy music. (Note: Spotify uses lower quality than other music currently available, so AA will offer nothing new if/when they ever do release these torrents.) So, yes, that is exactly what they're doing. They are training their models on all the data, not just all the legal data.
                          • mindbleach@sh.itjust.works
                            My definition is from *what words mean.* We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won't use the same label to refer to drawings?
                            deranger@sh.itjust.works
                            #94
                            I already did the “what words mean” thing earlier:
                            - involves a child
                            - is sexual
                            - is abusive (ie, not art)
                            - is material
                            That's literally every word of CSAM, and it fits.
                            > We need a term to specifically refer to actual photographs of actual child abuse?
                            Why? You've made a whole lot of claims it should be your way but you've provided no sources nor any justification as to why we need to delineate between real and AI.
                            • retrogoblet79@eviltoast.org
                              This post did not contain any content.
                              Lvxferre [he/him]
                              #95
                              IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim.
                              Non-consensual porn victimises the person being depicted, because it violates the person's rights over their own body — including its image. Plus it's ripe material for harassment. This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.
                              And it applies to children *and adults*. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will be always non-consensual, thus victimising the children in question.
                              Now, someone else mentioned Bart's dick appears in the Simpsons movie. The key difference is that Bart is not a child, ***it*** is not even a person to begin with, ***it*** is a fictional character. There's no victim.
                              • mindbleach@sh.itjust.works
                                Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
                                Guest
                                #96
                                Is it a sexualized depiction of a minor? Then it's csam. Fuck all y'all pedo apologists.
                                • retrogoblet79@eviltoast.org
                                  This post did not contain any content.
                                  Guest
                                  #97
                                  Tim Epic sucks, and has always sucked.
                                  • Lvxferre [he/him]
                                    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it's fine to change them as you need. The real thing to talk about is the presence or absence of a victim. Non-consensual porn victimises the person being depicted, because it violates the person's rights over their own body — including its image. Plus it's ripe material for harassment. This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing. And it applies to children *and adults*. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will be always non-consensual, thus victimising the children in question. Now, someone else mentioned Bart's dick appears in the Simpsons movie. The key difference is that Bart is not a child, ***it*** is not even a person to begin with, ***it*** is a fictional character. There's no victim.
                                    Guest
                                    #98
                                    That is a lot of text for someone that couldn't even be bothered to read the first paragraph of the article.
                                    > Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
                                    There ARE victims, lots of them.
                                    • mindbleach@sh.itjust.works
                                      Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
                                      Guest
                                      #99
                                      Deepfakes are illegal. You're defending deepfake cp now?
                                      • mindbleach@sh.itjust.works
                                        And what if neither happened?
                                        Guest
                                        #100
                                        Dude, you're just wrong. There seems to be a huge disconnect with you between what the law is and what you want the law to be. You are not allowed to take an image of someone, photoshop them naked, and distribute it. Period. You are also not allowed to depict child sexual abuse. It doesn't matter if it's not real. It's the depiction of CSA taking place that is illegal.
                                        • shani66@ani.social
                                          Sure, i think it's weird to really care about loli or furry or any other *niche*, but ai generating material of actual children (and unwilling people besides) is actually harmful. If they can't have effective safeguards against that harm it makes sense to restrict it legally.
                                          Guest
                                          #101
                                          Making porn of actual people without their consent regardless of age is not a thought crime. For children, that's obviously fucked up. For adults it's directly impacting their reputation. It's not a victimless crime.
                                          But generating images of adults that don't exist? Or even clearly drawn images that aren't even realistic? I've seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky. Like let's take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone's fucked up fantasy. Yet lots of people want to make that into a thought crime.
                                          I've always thought that if there isn't speech out there that makes you feel icky or gross then you don't really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.
