Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

I find it interesting that there's loads of people who made a core part of their identity campaigning against trans women being in women's spaces and how it impacts women, who have gone completely silent about Grok being used to undress and brutalise women.

252 posts, 158 posters
  • #192 (Guest), in reply to Kevin Beaumont's opening post:

    @GossiTheDog Interesting that the far right, who claim they want to "protect women and children", are silent too.
  • Guest wrote:

    @GossiTheDog@cyberplace.social It's fascinating that payment processors and app stores happily bullied Tumblr over _female presenting nipples_ and have kicked adult game creators off of Steam, but have been completely silent on CSAM and misogyny generated on X

    #193 (Guest) replied:

    @Steve @GossiTheDog They punch down
  • Guest wrote:

    @Steve @GossiTheDog what payment processors does X use? Seems like something they would need to be pushed on.

    #194 (Guest) replied:

    @ikuturso @Steve @GossiTheDog Sex workers are heavily censored by Visa and MC. Not just visuals, but speech. There are entire lists of words and subjects that a phone sex operator cannot talk about, or they won't have payment processing anymore (and I've worked for companies that got shut down over it). But men undressing and brutalizing children is fine.

    Phone sex ops in 2008 (not sure if it has changed) couldn't talk about witchcraft as pretend, because what if she was really using it?
  • Guest wrote:

    @cstross @futurebird @GossiTheDog how is the LLC giving these people a fig leaf for overt criminal, morally reprehensible, actions?

    #195 (Guest) replied:

    @Seruko @cstross @futurebird @GossiTheDog It isn't. Call me cynical, but the only reason X is still standing is:

    1) CSAM is not its primary purpose
    2) It's large and has lots of money
    3) Well-known figures are on it

    If it was a small server hosted in the UK it would already have been taken down.

    It doesn't provide a fig leaf; it's just that the law has not (yet) caught up with end users using it to generate illegal content.

    I'd also note that if CSAM can be located, the first action is typically to seize all the computers.

    Expect Phase 1 - if X doesn't fix this, it'll be added to the Great British Firewall, which currently blocks CSAM and a few proscribed groups. I don't see the government backing down.

    Phase 2 - police investigation. Anyone that can be identified, *especially* those stupid enough to try creating unwanted images of MPs, will be prosecuted.

    Phase 3 - shine a light on people using VPNs to get around geoblocking or age verification for entirely legitimate content. Great... :(
  • #196 (Henryk Plötz), in reply to the Guest post quoted in #193:

    @Steve @GossiTheDog *Especially* since Elon has made it clear that the CSAM generator is now a for-profit feature. Blocking Grok/X payments would be appropriate.
  • Kevin Beaumont wrote:

    The UK government says the move by X to limit Grok to paid users is “insulting” and basically monetising abuse, and they would support a ban of X in the UK if recommended by the regulator. They’ve asked the regulator for recommendations in days. https://www.bbc.co.uk/news/articles/c99kn52nx9do

    #197 (Guest) replied:

    @GossiTheDog
    What a shame the UK government is unable to make up its own mind.
  • Kevin Beaumont wrote:

    Some Grok users, mostly men, began to demand to see bruising on the bodies of the women, and for blood to be added to the images. Requests to show women tied up and gagged were instantly granted.

    Linked: “‘Add blood, forced smile’: how Grok’s nudification tool went viral” (the Guardian, www.theguardian.com). “The ‘put her in a bikini’ trend rapidly evolved into hundreds of thousands of requests to strip clothes from photos of women, horrifying those targeted.”

    #198 (Guest) replied:

    @GossiTheDog

    WTAF??😮
  • Guest wrote:

    @GossiTheDog why, if “all party leaders commented”, did the BBC only quote the Prime Minister (fair enough) and Farage, rather than, say, the Conservative Party leader? Or anyone else?

    Is the BBC biased towards giving more exposure to Reform than it deserves? From here, it kinda looks like it. Why? Who is pushing the BBC to do this?

    #199 (Guest) replied:

    @GentlemanTech @GossiTheDog Yes.

    Linked: “Reform, with 0.7% of all MPs, featured in 25% of BBC's recent 10pm news bulletins” (The London Economic, www.thelondoneconomic.com). “Astonishingly, a party with 18 times as many MPs as Reform has received one-third less coverage on BBC News at 10.”
  • #200 (myrmepropagandist), in reply to Guest (#195):

    @syllopsium @Seruko @cstross @GossiTheDog

    Is this true for “fictional” CSAM? Or are there some loopholes?

    I hate arguing about these definitions. We all know a photo of your kid playing at the beach is fine, a drawing of the same is fine, but I expect these people to try to hide behind such technicalities and distinctions.

    Never mind that we are talking about NOT your kid. And you never asked for it.
  • myrmepropagandist shared this topic
  • #201 (myrmepropagandist), in reply to Guest (#195):

    @syllopsium @Seruko @cstross @GossiTheDog

    The law needs to catch up on privacy rights. Especially in the US.
  • #202 (Furby), in reply to myrmepropagandist (#200):

    @futurebird there's a new British law about deepfakes but it's not in force yet: https://www.bbc.co.uk/news/articles/cp9jxvx0zjmo
  • #203 (Furby), in reply to Guest (#195):

    @syllopsium British journalists LOVE Twitter.
  • #204 (Charlie Stross), in reply to myrmepropagandist (#200):

    @futurebird @syllopsium @Seruko @GossiTheDog Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US First Amendment-style right to free speech here.)
  • #205 (Susanna the Artist 🌻), in reply to Guest (#195):

    @syllopsium Consider: every single US government department, no matter how small, still has an account because they don’t want to draw attention to themselves by leaving or not having one.
  • #206 (Madeleine Morris), in reply to Charlie Stross (#204):

    @cstross It is illegal in the US also.

    Linked: United States v. Handley (en.wikipedia.org)

    Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.

    @futurebird @syllopsium @Seruko @GossiTheDog
  • #207 (Guest), in reply to myrmepropagandist (#200):

    @futurebird @syllopsium @Seruko @cstross @GossiTheDog In the UK since 1995 it’s an unequivocal yes:

    (7) “Pseudo-photograph” means an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.
    (8) If the impression conveyed by a pseudo-photograph is that the person shown is a child, the pseudo-photograph shall be treated for all purposes of this Act as showing a child and so shall a pseudo-photograph where the predominant impression conveyed is that the person shown is a child notwithstanding that some of the physical characteristics shown are those of an adult.
  • #208 (Guest), in reply to Guest (#207):

    @futurebird @GossiTheDog @Seruko @cstross @syllopsium (drawings are covered by the Coroners and Justice Act 2009, which has lower penalties but it’s still a criminal offence)
  • #209 (Guest), in reply to myrmepropagandist (#200):

    @futurebird @Seruko @cstross @GossiTheDog as far as CSAM goes, no, there's no loophole. Thank goodness. It is a strict liability offence, as Charlie notes, which means you can't go 'oops, how did that get there?' - if it's on your system, you get arrested.

    There's not only, quite correctly, no defense under UK law for predatory and non-consensual material such as CSAM, but also increasingly no defense for some more extreme activities, even between fully consenting adult participants. If you're into heavy-duty SM or strangulation, it would be a very good idea not to have any seizable material, and where possible to limit your indulgence.
  • #210 (Guest), in reply to Kevin Beaumont's opening post:

    @GossiTheDog the answer is simple: there aren't any women on the internet, duh! Just us geeks and dweebs. They’re all busy at… er… the ice cream parlor and salon?
  • #211 (Ben Lubar (any pronouns)), in reply to Charlie Stross (#204):

    @cstross @futurebird @syllopsium @GossiTheDog These aren't drawings, though. These are images generated by an algorithm that learns from existing images, which means there is definitely a significant amount of CSAM in the training set so it could "learn" how to "draw" sexualized children. And if the training set is just "any image posted on X", that means they're storing a significant amount of actual CSAM on their servers and not doing anything about it.

    The only reason Twitter/X hasn't been raided by feds and forcibly shut down is that the CEO is friends with the president.
