
I find it interesting that there's loads of people who made a core part of their identity campaigning against trans women being in women's spaces and how it impacts women, who have gone completely silent about Grok being used to undress and brutalise women.

  • Guest (@syllopsium):

    @Seruko @cstross @futurebird @GossiTheDog It isn't. Call me cynical, but the only reason X is still standing is

    1) CSAM is not its primary purpose
    2) It's large and has lots of money
    3) well known figures are on it

    If it was a small server hosted in the UK it would already have been taken down.

    It doesn't provide a fig leaf, it's just that the law has not (yet) caught up with end users using it to generate illegal content.

    I'd also note that if CSAM can be located the first action is typically to seize all the computers.

    Expect phase 1 - if X doesn't fix this, it'll be added to the Great British Firewall which currently blocks CSAM and a few proscribed groups. I don't see the government backing down.

    phase 2 - police investigation. Anyone that can be identified, *especially* those stupid enough to try creating unwanted images of MPs, will be prosecuted.

    Phase 3 - shine a light on people using VPNs to get around geoblocking or age verification for entirely legitimate content. Great.. :(.

    myrmepropagandist (#201):

    @syllopsium @Seruko @cstross @GossiTheDog

    The law needs to catch up on privacy rights. Especially in the US.

  • myrmepropagandist:

      @syllopsium @Seruko @cstross @GossiTheDog

      Is this true for “fictional” CSAM ? Or are there some loopholes?

      I hate arguing about these definitions. We all know a photo of your kid playing at the beach is fine— a drawing of the same is fine— but I expect these people to try to hide behind such technicalities and distinctions.

      Never mind that we are talking about NOT your kid. And you never asked for it.

    Furby (#202):

      @futurebird there's a new British law about deepfakes but it's not in force yet https://www.bbc.co.uk/news/articles/cp9jxvx0zjmo


  • Furby (#203):

        @syllopsium British journalists LOVE twitter.


  • Charlie Stross (#204):

          @futurebird @syllopsium @Seruko @GossiTheDog Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US first amendment style right to free speech here.)


  • Susanna the Artist 🌻 (#205):

            @syllopsium Consider: every single US government department, no matter how small, still has an account because they don’t want to draw attention to themselves by leaving or not having one.


  • Madeleine Morris (#206):

              @cstross It is illegal in the US also.

              United States v. Handley - Wikipedia (en.wikipedia.org)

              Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.

              @futurebird @syllopsium @Seruko @GossiTheDog


  • Guest (#207):

                @futurebird @syllopsium @Seruko @cstross @GossiTheDog In the UK since 1995 it’s an unequivocal yes:

                (7) “Pseudo-photograph” means an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.
                (8) If the impression conveyed by a pseudo-photograph is that the person shown is a child, the pseudo-photograph shall be treated for all purposes of this Act as showing a child and so shall a pseudo-photograph where the predominant impression conveyed is that the person shown is a child notwithstanding that some of the physical characteristics shown are those of an adult.


  • Guest (#208):

                  @futurebird @GossiTheDog @Seruko @cstross @syllopsium (drawings are covered by the Coroners and Justice Act 2009, which has lower penalties but it’s still a criminal offence)


  • Guest (#209):

                    @futurebird @Seruko @cstross @GossiTheDog as far as CSAM goes, no, there's no loophole. Thank goodness. It is a Strict Liability offence as Charlie notes, which means you can't go 'oops, how did that get there?' - if it's on your system, you get arrested.

                    There's not only, quite correctly, no defense under UK law for predatory and non-consensual material such as CSAM, but also increasingly no defense for some more extreme activities, even between fully consenting adult participants. If you're into heavy-duty SM or strangulation, it would be a very good idea not to have any seizable material, and where possible to limit your indulgence.

  • Kevin Beaumont:

                      I find it interesting that there's loads of people who made a core part of their identity campaigning against trans women being in women's spaces and how it impacts women, who have gone completely silent about Grok being used to undress and brutalise women.

    Guest (#210):

                      @GossiTheDog the answer is simple: there aren't any women on the internet, duh! Just us geeks and dweebs. They're all busy at… er… the ice cream parlor and salon?


  • Ben Lubar (any pronouns) (#211):

                        @cstross @futurebird @syllopsium @GossiTheDog these aren't drawings, though. these are images generated by an algorithm that learns from existing images. which means there is definitely a significant amount of CSAM in the training set so it could "learn" how to "draw" sexualized children. and if the training set is just "any image posted on X", that means they're storing a significant amount of actual CSAM on their servers and not doing anything about it.

                        the only reason Twitter/X hasn't been raided by feds and forcibly shut down is that the CEO is friends with the president.


  • Guest (#212):

                          @ben @cstross @futurebird @GossiTheDog I would be very wary of dealing in absolute statements like that, when such output could possibly be derived by depicting people as naked and extrapolating from there.

                          That's a hypothetical to be dealt with by law enforcement. What isn't hypothetical is that new, illegal content has been generated.

                          An interesting, related question to yours: if illegal content has been generated, let's say from entirely innocent training data via extrapolation, is the generated content then fed back into the training set? Because at that point the training set *is* polluted, and how is it going to be cleaned?

  • Guest (@Steve):
                            @GossiTheDog@cyberplace.social It's fascinating that payment processors and app stores happily bullied Tumblr over _female presenting nipples_ and have kicked adult game creators off of Steam, but have been completely silent on CSAM and misogyny generated on X
    Guest (#213):

                            @Steve @GossiTheDog that's what autocracy looks like


  • Ben Lubar (any pronouns) (#214):

                              @syllopsium @cstross @futurebird @GossiTheDog With the way machine learning works, you can't remove something that's already been fed into it. You can only restore a backup from before it was fed in, or if that's not possible, start over from completely random weights and be more careful with your selection of training data.

  • Kevin Beaumont:

                                Before we get to the staff members at cyber companies, the Financial Times has the staff at X and xAI. https://www.ft.com/content/ad94db4c-95a0-4c65-bd8d-3b43e1251091

    Guest (#215):

                                @GossiTheDog

                                The job interview:
                                "I see on this circus poster that you've recently been a sex-abuse-production clown. Where do you feel you'd fit into our company?"

  • Erik Ableson:

                                  @Steve @GossiTheDog @cthulku Also: are VISA & MC still advertising on X?

    Cassandrich (#216):

                                  @erik @Steve @GossiTheDog @cthulku Or processing payments for bluechecks (and now premium CSAM generation services) on it? 🤔


  • myrmepropagandist (#217):

                                    @cstross

                                    Could US law cover the *distribution* of such material?

                                    Right now we have a popular “top ten” phone app with this garbage. Worse than if it were broadcast on public airwaves or put up on a poster in the public square.

                                    This is a worst case scenario from bad internet debates about porn, gore and obscenity laws come to life.

                                    And I feel like the worst creeps I’ve ever known are whispering “actually it’s called ephebophilia” as if that were a serious argument.


  • myrmepropagandist (#218):

                                      @ben @syllopsium @cstross @GossiTheDog

                                      Delete grok

  • Guest (#219):

                                        @Steve @ErickaSimone @GossiTheDog They are angry at women making money off their own bodies. Men creating nudes of women without their consent doesn’t bring any material aid to women, so it’s ok

  • Jonathan Kamens 86 47 (#220):

                                          @Steve @GossiTheDog Sure, if by "fascinating" you mean "entirely unsurprising and completely expected."
                                          This is patriarchy + complicity toward fascism in action.

