Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'

• In reply to Guest:
  > Isn't it abuse if I take a picture of a girl, let Grok remove the clothes and post this online?
  mindbleach@sh.itjust.works wrote (#53):
  It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
• In reply to mindbleach@sh.itjust.works:
  > Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
  Guest wrote (#54):
  > The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as "evidence of child sexual abuse" that "includes both real and synthetic content."
  Were you too busy fapping to read the article?
• In reply to mindbleach@sh.itjust.works:
  > Nothing made-up is CSAM. That is the entire point of the term "CSAM." It's like calling a horror movie murder.
  Guest wrote (#55):
  AI CSAM was generated from real CSAM. AI being able to accurately undress kids is a real issue in multiple ways.
• In reply to Guest:
  > I'll keep in mind Tim thinks child porn is just politics.
  Guest wrote (#56):
  I mean the capitalists are the ones calling the shots, since the imperial core is no democracy. This is their battle; we are their dildos.
• In reply to mindbleach@sh.itjust.works:
  > There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
  Guest wrote (#57):
  > The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as "evidence of child sexual abuse" that "includes both real and synthetic content."
• In reply to Guest:
  > AI CSAM was generated from real CSAM. AI being able to accurately undress kids is a real issue in multiple ways.
  mindbleach@sh.itjust.works wrote (#58):
  AI can draw Shrek on the moon. Do you think it needed real images of that?
• In reply to Guest:
  > It's too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
  Guest wrote (#59):
  I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
• In reply to mindbleach@sh.itjust.works:
  > A threat of murder is a crime without being the same thing as murder. Meditate on this.
  Guest wrote (#60):
  And abuse is a different word than rape. Maybe meditate on that.
• In reply to Guest:
  > And abuse is a different word than rape. Maybe meditate on that.
  mindbleach@sh.itjust.works wrote (#61):
  And what if neither happened?
• In reply to Guest:
  > I get this and I don't disagree, but I also hate that AI fully brought back thought crimes as a thing. I don't have a better approach or idea, but I really don't like that simply drawing a certain arrangement of lines and colors is now a crime. I've also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated. I'm not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
  shani66@ani.social wrote (#62):
  Sure, I think it's weird to really care about loli or furry or any other *niche*, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can't have effective safeguards against that harm, it makes sense to restrict it legally.
• In reply to mindbleach@sh.itjust.works:
  > It's a crime, but it's not the same crime as taking the actual clothes off the actual girl. She was not physically abused. She was not even *involved.*
  Maestro wrote (#63):
  When all her friends and family see that image, she is definitely involved. And it's definitely abuse.
• In reply to mindbleach@sh.itjust.works:
  > AI can draw Shrek on the moon. Do you think it needed real images of that?
  Guest wrote (#64):
  It used real images of Shrek and the moon to do that. It didn't "invent" or "imagine" either. The child porn it's generating is based on literal child porn, if not itself just actual child porn.
• In reply to Guest:
  > It used real images of Shrek and the moon to do that. It didn't "invent" or "imagine" either. The child porn it's generating is based on literal child porn, if not itself just actual child porn.
  mindbleach@sh.itjust.works wrote (#65):
  You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
• mindbleach@sh.itjust.works wrote (#66):
  Does a depiction of her corpse mean she's dead?
• In reply to mindbleach@sh.itjust.works:
  > You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
  stray@pawb.social wrote (#67):
  It literally can't combine unrelated concepts though. Not too long ago there was the issue where one (DALL-E?) couldn't make a picture of a full glass of wine, because every glass of wine it had been trained on was half full, because that's generally how we prefer to photograph wine. It has no concept of "full" the way actual intelligences do, so it couldn't connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.
• In reply to retrogoblet79@eviltoast.org (the original post had no text body):
  Guest wrote (#68):
  >steam
  >does nothing
  >wins
• In reply to stray@pawb.social:
  > It literally can't combine unrelated concepts though. Not too long ago there was the issue where one (DALL-E?) couldn't make a picture of a full glass of wine, because every glass of wine it had been trained on was half full, because that's generally how we prefer to photograph wine. It has no concept of "full" the way actual intelligences do, so it couldn't connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.
  mindbleach@sh.itjust.works wrote (#69):
  And you think it's short on images of fully naked women?
• In reply to retrogoblet79@eviltoast.org (the original post had no text body):
  cancermancer@sh.itjust.works wrote (#70):
  Guy atomically made of shit takes has another shit take, colour me surprised.
• In reply to mindbleach@sh.itjust.works:
  > Does a depiction of her corpse mean she's dead?
  cancermancer@sh.itjust.works wrote (#71):
  False equivalence.
• In reply to cancermancer@sh.itjust.works:
  > False equivalence.
  mindbleach@sh.itjust.works wrote (#72):
  The central goddamn point. *Depicting* things happening does not mean they *actually happened.* The entire point of the term CSAM is to describe crimes which literally occurred. It is material... from the sexual abuse... of children. Do y'all not understand why it's kinda fuckin' important to have a term for that specific concept?
