I find it interesting that there's loads of people who made a core part of their identity campaigning against trans women being in women's spaces and how it impacts women, who have gone completely silent about Grok being used to undress and brutalise women.
@GossiTheDog Interesting that the far right, who claim they want to "protect women and children", are silent too.
-
@GossiTheDog@cyberplace.social It's fascinating that payment processors and app stores happily bullied Tumblr over _female presenting nipples_ and have kicked adult game creators off of Steam, but have been completely silent on CSAM and misogyny generated on X
-
@Steve @GossiTheDog what payment processors does X use? Seems like something they would need to be pushed on.
@ikuturso @Steve @GossiTheDog Sex workers are heavily censored by Visa and MC. Not just visuals, but speech. There are entire lists of words and subjects that a phone sex operator cannot talk about, or they won't have payment processing anymore (and I've worked for companies that got shut down over it). But men undressing and brutalizing children is fine.
Phone sex operators in 2008 (not sure if it has changed) couldn't talk about witchcraft even as pretend, because what if she was really using it?
-
@cstross @futurebird @GossiTheDog how is the LLC giving these people a fig leaf for overtly criminal, morally reprehensible actions?
@Seruko @cstross @futurebird @GossiTheDog It isn't. Call me cynical, but the only reason X is still standing is:
1) CSAM is not its primary purpose
2) It's large and has lots of money
3) Well-known figures are on it
If it were a small server hosted in the UK it would already have been taken down.
It doesn't provide a fig leaf; it's just that the law has not (yet) caught up with end users using it to generate illegal content.
I'd also note that if CSAM can be located, the first action is typically to seize all the computers.
Expect Phase 1 - if X doesn't fix this, it'll be added to the Great British Firewall, which currently blocks CSAM and a few proscribed groups. I don't see the government backing down.
Phase 2 - police investigation. Anyone who can be identified, *especially* those stupid enough to try creating unwanted images of MPs, will be prosecuted.
Phase 3 - shine a light on people using VPNs to get around geoblocking or age verification for entirely legitimate content. Great... :(
-
@Steve @GossiTheDog *Especially* since Elon has made it clear that the CSAM generator is now a for-profit feature. Blocking grok/X payments would be appropriate.
-
The UK government says the move by X to limit Grok to paid users is “insulting” and basically monetises abuse, and that it would support a ban of X in the UK if recommended by the regulator. It has asked the regulator for recommendations within days. https://www.bbc.co.uk/news/articles/c99kn52nx9do
@GossiTheDog
What a shame the UK gvmt is unable to make up its own mind -
Some Grok users, mostly men, began to demand to see bruising on the bodies of the women, and for blood to be added to the images. Requests to show women tied up and gagged were instantly granted.
‘Add blood, forced smile’: how Grok’s nudification tool went viral
The ‘put her in a bikini’ trend rapidly evolved into hundreds of thousands of requests to strip clothes from photos of women, horrifying those targeted
the Guardian (www.theguardian.com)
WTAF??
-
@GossiTheDog why, if “all party leaders commented”, did the BBC quote only the Prime Minister (fair enough) and Farage, rather than, say, the Conservative Party leader? Or anyone else?
Is the BBC biased toward giving Reform more exposure than it deserves? From here, it kinda looks like it. Why? Who is pushing the BBC to do this?
@GentlemanTech @GossiTheDog Yes.
Reform, with 0.7% of all MPs, featured in 25% of BBC's recent 10pm news bulletins
Astonishingly, a party with 18 times as many MPs as Reform has received one-third less coverage on BBC News at 10.
The London Economic (www.thelondoneconomic.com)
-
@syllopsium @Seruko @cstross @GossiTheDog
Is this true for “fictional” CSAM? Or are there some loopholes?
I hate arguing about these definitions. We all know a photo of your kid playing at the beach is fine— a drawing of the same is fine— but I expect these people to try to hide behind such technicalities and distinctions.
Never mind that we are talking about NOT your kid. And you never asked for it.
-
@syllopsium @Seruko @cstross @GossiTheDog
The law needs to catch up on privacy rights. Especially in the US.
-
@futurebird there's a new British law about deepfakes but it's not in force yet https://www.bbc.co.uk/news/articles/cp9jxvx0zjmo
-
@syllopsium British journalists LOVE twitter.
-
@futurebird @syllopsium @Seruko @GossiTheDog Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US First Amendment-style right to free speech here.)
-
@syllopsium Consider: every single US government department, no matter how small, still has an account because they don’t want to draw attention to themselves by leaving or not having one.
-
@cstross It is illegal in the US also.
Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.
-
@futurebird @syllopsium @Seruko @cstross @GossiTheDog In the UK since 1995 it’s an unequivocal yes:
(7) “Pseudo-photograph” means an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.
(8) If the impression conveyed by a pseudo-photograph is that the person shown is a child, the pseudo-photograph shall be treated for all purposes of this Act as showing a child and so shall a pseudo-photograph where the predominant impression conveyed is that the person shown is a child notwithstanding that some of the physical characteristics shown are those of an adult.
-
@futurebird @GossiTheDog @Seruko @cstross @syllopsium (drawings are covered by the Coroners and Justice Act 2009, which has lower penalties but it's still a criminal offence)
-
@futurebird @Seruko @cstross @GossiTheDog as far as CSAM goes, no, there's no loophole. Thank goodness. It is a strict liability offence, as Charlie notes, which means you can't go 'oops, how did that get there?' - if it's on your system, you get arrested.
There's not only, quite correctly, no defence under UK law for predatory and non-consensual material such as CSAM, but also increasingly no defence for some more extreme activities even between fully consenting adult participants. If you're into heavy-duty SM or strangulation, it would be a very good idea not to have any seizable material, and where possible to limit your indulgence.
-
@GossiTheDog the answer is simple: there aren't any women on the internet, duh! Just us geeks and dweebs. They're all busy at… er… the ice cream parlor and salon?
-
@cstross @futurebird @syllopsium @GossiTheDog These aren't drawings, though. These are images generated by an algorithm that learns from existing images, which means there is definitely a significant amount of CSAM in the training set for it to have "learned" how to "draw" sexualized children. And if the training set is just "any image posted on X", that means they're storing a significant amount of actual CSAM on their servers and not doing anything about it.
The only reason Twitter/X hasn't been raided by the feds and forcibly shut down is that the CEO is friends with the president.