Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'

Uncategorized · Tags: games · 218 Posts · 110 Posters · 132 Views
  • mindbleach@sh.itjust.works wrote (#24), in reply to Rimu:
    > Strange hill to die on, man.
    What, taking *child abuse* seriously?
  • deranger@sh.itjust.works wrote (#25), in reply to mindbleach@sh.itjust.works:
    > There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
    Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child. It’s sexual. It’s abusive. It’s material. It’s CSAM, dude. These _are_ the images you report to the FBI. Your narrow definition is not _the_ definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.
  • Guest wrote (#26), in reply to mindbleach@sh.itjust.works:
    > Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
    It's too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
  • mindbleach@sh.itjust.works wrote (#27), in reply to deranger@sh.itjust.works:
    > Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child. It’s sexual. It’s abusive. It’s material. It’s CSAM, dude. These _are_ the images you report to the FBI. Your narrow definition is not _the_ definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.
    There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean 'shit what looks like it could be from the abuse of some child, I guess.' It means state's evidence of actual crimes.
  • Guest wrote (#28), in reply to retrogoblet79@eviltoast.org:
    > This post did not contain any content.
    I wonder which AI companies he’s invested in.
  • mindbleach@sh.itjust.works wrote (#29), in reply to Guest:
    > It's too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
    You can insist every frame of Bart Simpson's dick in The Simpsons Movie should be *as illegal as* photographic evidence of child rape, but that does not make them the same thing. The *entire point* of the term CSAM is that it's the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
  • skullgrid@lemmy.world wrote (#30), in reply to retrogoblet79@eviltoast.org:
    > This post did not contain any content.
    Bro should have stopped after UE3 and fucked off forever.
  • deranger@sh.itjust.works wrote (#31), in reply to mindbleach@sh.itjust.works:
    > There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean 'shit what looks like it could be from the abuse of some child, I guess.' It means state's evidence of actual crimes.
    CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.
  • deranger@sh.itjust.works wrote (#32), in reply to mindbleach@sh.itjust.works:
    > You can insist every frame of Bart Simpson's dick in The Simpsons Movie should be *as illegal as* photographic evidence of child rape, but that does not make them the same thing. The *entire point* of the term CSAM is that it's the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
    > The entire point of the term CSAM is that it's the actual real evidence of child rape.
    No it isn’t.
  • deranger@sh.itjust.works wrote (#33), in reply to mindbleach@sh.itjust.works:
    > You can insist every frame of Bart Simpson's dick in The Simpsons Movie should be *as illegal as* photographic evidence of child rape, but that does not make them the same thing. The *entire point* of the term CSAM is that it's the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
    > The *entire point* of the term CSAM is that it's the actual real evidence of child rape.
    You are completely wrong. https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/
    “Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.”
  • Guest wrote (#34), in reply to mindbleach@sh.itjust.works:
    > You can insist every frame of Bart Simpson's dick in The Simpsons Movie should be *as illegal as* photographic evidence of child rape, but that does not make them the same thing. The *entire point* of the term CSAM is that it's the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
    > Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or **depicts** persons under the designated age of majority. [...]
    > Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and **computer-generated images** that appear to involve them.
    (Emphasis mine) https://en.wikipedia.org/wiki/Child_pornography
  • mindbleach@sh.itjust.works wrote (#35), in reply to Guest:
    > Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or **depicts** persons under the designated age of majority. [...]
    > Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and **computer-generated images** that appear to involve them.
    'These several things are illegal, including the real thing and several made-up things.'
    Please stop misusing the term that explicitly refers to the real thing.
    'No.'
  • mindbleach@sh.itjust.works wrote (#36), in reply to deranger@sh.itjust.works:
    > The *entire point* of the term CSAM is that it's the actual real evidence of child rape.
    > You are completely wrong. https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/ “Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.”
    RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.
  • Guest wrote (#37), in reply to mindbleach@sh.itjust.works:
    > I am going to be an absolute crank about this: CSAM means photographic evidence of child rape. If that event did not happen, say something else. The *entire point* of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are *real,* they're not CSAM. Say something else. Otherwise we'll have to invent some even less ambiguous term from *evidence of child abuse,* and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.
    From the article:
    > The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as "evidence of child sexual abuse" that "includes both real and synthetic content, such as images created with artificial intelligence tools."
  • mindbleach@sh.itjust.works wrote (#38), in reply to Guest:
    > From the article:
    > The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as "evidence of child sexual abuse" that "includes both real and synthetic content, such as images created with artificial intelligence tools."
    They're wrong. As evidenced by *what those words mean.*
  • deranger@sh.itjust.works wrote (#39), in reply to mindbleach@sh.itjust.works:
    > RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.
    Dude, you’re the only one who uses that strict definition. Go nuts with your course of prescriptivism, but I’m pretty sure it’s a lost cause.
  • mindbleach@sh.itjust.works wrote (#40), in reply to deranger@sh.itjust.works:
    > CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.
    CSAM is material... from the sexual abuse... of a child. Fiction does not count.
  • Guest wrote (#41), in reply to mindbleach@sh.itjust.works:
    > There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean 'shit what looks like it could be from the abuse of some child, I guess.' It means state's evidence of actual crimes.
    It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what's happened. These kids did not consent to have their likeness sexualised.
  • mindbleach@sh.itjust.works wrote (#42), in reply to Guest:
    > It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what's happened. These kids did not consent to have their likeness sexualised.
    Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
  • Guest wrote (#43), in reply to mindbleach@sh.itjust.works:
    > Nothing done to your *likeness* is a thing that happened *to you.* Do you people not understand reality is different from fiction?
    My likeness posted for the world to see in a way I did not consent to is a thing done to me.
