I find it interesting that there's loads of people who made a core part of their identity campaigning against trans women being in women's spaces and how it impacts women, who have gone completely silent about Grok being used to undress and brutalise women.
-
@Seruko @cstross @futurebird @GossiTheDog It isn't. Call me cynical, but the only reason X is still standing is
1) CSAM is not its primary purpose
2) It's large and has lots of money
3) well-known figures are on it
If it were a small server hosted in the UK, it would already have been taken down.
It doesn't provide a fig leaf, it's just that the law has not (yet) caught up with end users using it to generate illegal content.
I'd also note that if CSAM can be located the first action is typically to seize all the computers.
Expect Phase 1 - if X doesn't fix this, it'll be added to the Great British Firewall, which currently blocks CSAM and a few proscribed groups. I don't see the government backing down.
Phase 2 - police investigation. Anyone who can be identified, *especially* those stupid enough to try creating unwanted images of MPs, will be prosecuted.
Phase 3 - shine a light on people using VPNs to get around geoblocking or age verification for entirely legitimate content. Great... :(
@syllopsium @Seruko @cstross @GossiTheDog
The law needs to catch up on privacy rights. Especially in the US.
-
@syllopsium @Seruko @cstross @GossiTheDog
Is this true for “fictional” CSAM? Or are there some loopholes?
I hate arguing about these definitions. We all know a photo of your kid playing at the beach is fine— a drawing of the same is fine— but I expect these people to try to hide behind such technicalities and distinctions.
Never mind that we are talking about NOT your kid. And you never asked for it.
@futurebird there's a new British law about deepfakes but it's not in force yet https://www.bbc.co.uk/news/articles/cp9jxvx0zjmo
-
@syllopsium British journalists LOVE twitter.
-
@futurebird @syllopsium @Seruko @GossiTheDog Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US first amendment style right to free speech here.)
-
@syllopsium Consider: every single US government department, no matter how small, still has an account because they don’t want to draw attention to themselves by leaving or not having one.
-
@cstross It is illegal in the US also.
Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.
-
@futurebird @syllopsium @Seruko @cstross @GossiTheDog In the UK since 1995 it’s an unequivocal yes:
(7) “Pseudo-photograph” means an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.
(8) If the impression conveyed by a pseudo-photograph is that the person shown is a child, the pseudo-photograph shall be treated for all purposes of this Act as showing a child and so shall a pseudo-photograph where the predominant impression conveyed is that the person shown is a child notwithstanding that some of the physical characteristics shown are those of an adult.
-
@futurebird @GossiTheDog @Seruko @cstross @syllopsium (drawings are covered by the Coroners and Justice Act 2009, which has lower penalties, but it's still a criminal offence)
-
@futurebird @Seruko @cstross @GossiTheDog as far as CSAM goes, no, there's no loophole. Thank goodness. It is a strict liability offence, as Charlie notes, which means you can't go 'oops, how did that get there?' - if it's on your system, you get arrested.
There's not only, quite correctly, no defence under UK law for predatory and non-consensual material such as CSAM, but increasingly also no defence for some more extreme activities even between fully consenting adult participants. If you're into heavy-duty SM or strangulation, it would be a very good idea not to have any seizable material, and where possible to limit your indulgence.
-
@GossiTheDog the answer is simple - there aren't any women on the internet, duh! Just us geeks and dweebs. They're all busy at… er… the ice cream parlor and salon?
-
@cstross @futurebird @syllopsium @GossiTheDog these aren't drawings, though. these are images generated by an algorithm that learns from existing images. which means there is definitely a significant amount of CSAM in the training set so it could "learn" how to "draw" sexualized children. and if the training set is just "any image posted on X", that means they're storing a significant amount of actual CSAM on their servers and not doing anything about it.
the only reason Twitter/X hasn't been raided by feds and forcibly shut down is that the CEO is friends with the president.
-
@ben @cstross @futurebird @GossiTheDog I would be very wary of dealing in absolute statements like that, when such images could possibly be derived by depicting people as naked and extrapolating from there.
That's a hypothetical to be dealt with by law enforcement. What isn't hypothetical is that new, illegal content has been generated.
An interesting, related question to yours: if illegal content has been generated, let's say from entirely innocent training data via extrapolation, is the generated content then fed back into the training set? Because at that point the training set *is* polluted, and how is it going to be cleaned?
-
@GossiTheDog@cyberplace.social It's fascinating that payment processors and app stores happily bullied Tumblr over _female-presenting nipples_ and have kicked adult game creators off of Steam, but have been completely silent on CSAM and misogyny generated on X.
-
@syllopsium @cstross @futurebird @GossiTheDog With the way machine learning works, you can't remove something that's already been fed into it. You can only restore a backup from before it was fed in, or if that's not possible, start over from completely random weights and be more careful with your selection of training data.
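(A minimal sketch of the two options described in the post above, assuming a generic PyTorch-style training loop; the model, the checkpoint filename and the "flagged" metadata field are hypothetical placeholders, not anything X or xAI is known to use.)

import os
import torch
import torch.nn as nn

def make_model() -> nn.Module:
    return nn.Linear(16, 2)  # stand-in for a real model

def train(model, dataset, epochs=3):
    # Ordinary gradient training: every example's influence ends up smeared
    # across the weights and cannot be surgically deleted afterwards.
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in dataset:
            opt.zero_grad()
            loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
            loss.backward()
            opt.step()

# Toy data: (features, label, metadata). "flagged" stands in for whatever
# moderation signal marks material that must never be trained on.
raw_data = [(torch.randn(16), torch.tensor(i % 2), {"flagged": i % 5 == 0})
            for i in range(50)]
clean_data = [(x, y) for x, y, meta in raw_data if not meta["flagged"]]

model = make_model()  # freshly initialised (random) weights
if os.path.exists("pre_ingestion.ckpt"):  # hypothetical backup taken before the bad data went in
    # Option 1: roll back to the checkpoint saved before ingestion.
    model.load_state_dict(torch.load("pre_ingestion.ckpt"))
# Option 2 (no backup): keep the fresh random weights above and retrain from scratch.

# Either way, further training uses only the vetted data.
train(model, clean_data)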
-
Before we get to the staff members at cyber companies, the Financial Times has the staff at X and xAI. https://www.ft.com/content/ad94db4c-95a0-4c65-bd8d-3b43e1251091
The job interview:
"I see on this circus poster that you've recently been a sex-abuse-production clown. Where do you feel you'd fit into our company?" -
@Steve @GossiTheDog @cthulku Also: are VISA & MC still advertising on X?
@erik @Steve @GossiTheDog @cthulku Or processing payments for bluechecks (and now premium CSAM generation services) on it?

-
@futurebird @syllopsium @Seruko @GossiTheDog Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US first amendment style right to free speech here.)
Could US law cover the *distribution* of such material?
Right now we have a popular “top ten” phone app with this garbage. Worse than if it were broadcast on public airwaves or put up on a poster in the public square.
This is a worst-case scenario from bad internet debates about porn, gore, and obscenity laws, come to life.
And I feel like the worst creeps I've ever known are whispering “actually it's called ephebophilia” as if that were a serious argument.
-
@ben @syllopsium @cstross @GossiTheDog
Delete grok
-
@Steve @ErickaSimone @GossiTheDog They are angry at women making money off their own bodies. Men creating nudes of women without their consent doesn’t bring any material aid to women, so it’s ok
-
@Steve @GossiTheDog Sure, if by "fascinating" you mean "entirely unsurprising and completely expected."
This is patriarchy + complicity toward fascism in action.