Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'

Uncategorized · Tags: games · 218 Posts · 110 Posters · 132 Views
  • #11 · Guest, in reply to retrogoblet79@eviltoast.org (OP):
    Are they *removed* or just this deep up their own assholes?
  • #12 · Guest, in reply to retrogoblet79@eviltoast.org (OP):
    Yet another CEO who's super into child porn, huh?
  • #13 · brucethemoose@lemmy.world, in reply to retrogoblet79@eviltoast.org (OP):
    Imagine where Epic would be if they had just censored Tim Sweeney’s Twitter account. It’s like he’s *hell bent* on driving people away.
  • #14 · Guest, in reply to brucethemoose@lemmy.world:
    > Imagine where Epic would be if they had just censored Tim Sweeney’s Twitter account. It’s like he’s *hell bent* on driving people away.
    Not just asshole. Nonce asshole.
  • #15 · Guest, in reply to Guest:
    > I'll keep in mind Tim thinks child porn is just politics.
    It is when one side of the political palette is "against" it but keeps supporting people who think CSAM is a-okay, while the other side finds it abhorrent regardless of who's pushing it.
  • #16 · mindbleach@sh.itjust.works, in reply to Guest:
    > If you can be effectively censored by the banning of a site flooded with CSAM, that's very much your problem and nobody else's.
    Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
  • #17 · Guest, in reply to retrogoblet79@eviltoast.org (OP):
    This is SO SURPRISING that in a Country run by JEFFREY EPSTEIN'S BEST FRIEND, who's using YOUR TAX DOLLARS to Protect Pedophiles and Child Sex Traffickers, people would call the CREATION OF CHILD PORN A PROTECTED ACT!
  • #18 · mindbleach@sh.itjust.works, in reply to Guest:
    > inb4 "In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech"
    There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
  • #19 · mindbleach@sh.itjust.works, in reply to retrogoblet79@eviltoast.org (OP):
    I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.
  • #20 · Guest, in reply to retrogoblet79@eviltoast.org (OP):
    Tim Sweeney is into child porn
  • #21 · Guest, in reply to mindbleach@sh.itjust.works:
    > There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. […]
    Dude, just stop jerking off to kids whether they’re cartoons or not.
  • #22 · mindbleach@sh.itjust.works, in reply to Guest:
    > Dude, just stop jerking off to kids whether they’re cartoons or not.
    'If you care about child abuse please stop conflating it with cartoons.' *'Pedo.'* Fuck off.
  • #23 · Rimu, in reply to mindbleach@sh.itjust.works:
    > I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. […]
    Strange hill to die on, man.
  • #24 · mindbleach@sh.itjust.works, in reply to Rimu:
    > Strange hill to die on, man.
    What, taking *child abuse* seriously?
  • #25 · deranger@sh.itjust.works, in reply to mindbleach@sh.itjust.works:
    > There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. […]
    Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child. It’s sexual. It’s abusive. It’s material. It’s CSAM, dude. These _are_ the images you report to the FBI. Your narrow definition is not _the_ definition. We don’t need to make a separate term, because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.
  • #26 · Guest, in reply to mindbleach@sh.itjust.works:
    > Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
    It's too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
  • #27 · mindbleach@sh.itjust.works, in reply to deranger@sh.itjust.works:
    > Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child. It’s sexual. It’s abusive. It’s material. It’s CSAM, dude. […]
    There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean 'shit what looks like it could be from the abuse of some child I guess.' It means state's evidence of actual crimes.
  • #28 · Guest, in reply to retrogoblet79@eviltoast.org (OP):
    I wonder which AI companies he’s invested in.
  • #29 · mindbleach@sh.itjust.works, in reply to Guest:
    > It's too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
    You can insist every frame of Bart Simpson's dick in The Simpsons Movie should be *as illegal as* photographic evidence of child rape, but that does not make them the same thing. The *entire point* of the term CSAM is that it's the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
  • #30 · skullgrid@lemmy.world, in reply to retrogoblet79@eviltoast.org (OP):
    Bro should have stopped after UE3 and fucked off forever
