This is a massive problem, and the main reason I haven't tried to self host a #Fediverse instance on my own again.

The first question I ask is "how do I keep #CSAM off the instance" and nobody can answer that question.

https://www.theverge.com/2023/7/24/23806093/mastodon-csam-study-decentralized-network
@realcaseyrollins This is entirely the fault of the IWF and Microsoft, who create "exclusive" proprietary CSAM prevention software and then only license big tech companies to use it.
@realcaseyrollins I forwarded this email chain to Gargron.
what are the odds they lobbied for the rules about hash sharing that they grievously have to follow
@realcaseyrollins Where is the New York Times article you mentioned? I can't follow this thread anymore.
@realcaseyrollins All of the emails I've sent so far today.
@realcaseyrollins if they want more fuel they can reach out to me at graf@poast.org and i'll tell them all about NCMEC
@realcaseyrollins Wait, I forgot: what happened with NCMEC? I thought they were one of the good ones. But they don't have a prevention mechanism, just a CyberTip API for after-the-fact.
The data protection rights of file hashes of child porn must be respected ~ IWF
So many acronyms. Why do people write like robots whose job is to prevent any work from happening for the very people they work for?
Just forward the response to the verge so they can follow up on their article with how NCMEC and friends are actively impeding fixing the issue.
@realcaseyrollins I sent The Verge 2 emails already, including this entire chain, yes.
> free public access is not a responsible option

the gigantic corp with infinite money prioritizes having more money over making available a tool that might actually help the world be better, cool
yooo i had the same problem with the IWF!!! my final conclusion came down to this:

"The IWF takes a holier-than-thou stance when it comes to assisting smaller website operators in the universal mission to prevent the spread of child abuse material, possibly so they can operate an extortion racket or waste UK public funds."

https://infrablog.lain.la/iwf-nogo
i wonder how this will work on mastodon, considering that most people would probably shut this thing down on 90% of the instances if this ever gets implemented somehow

probably only instance admins with 5k+ users will get peer-pressured into enabling it by default
im talking about PhotoDNA ofc
there's nothing to enable; it's entirely closed source behind a very large closed door (Microsoft)

reversing it is possible (i have the DLL in question somewhere) but nobody's done it yet, probably because you'd get raided to shit once they found out - and you may be undermining its effectiveness by doing so (although security by obscurity does nothing imo, so this point is questionable at best)
@realcaseyrollins I did my little bit of activism for today.
@realcaseyrollins By the way, assuming there's no policy change for the IWF and co, the technical solution to this problem is probably AI.
* start an organization to fight child abuse media
* organization gets big/influential
* organization is now a jobs program/institution
* organization needs child abuse to stay alive
@Moon@shitposter.club @7666@comp.lain.la @alex@gleasonator.com @realcaseyrollins@social.freetalklive.com

So, the iron law of bureaucracy strikes again.

:akko_tired:

https://www.jerrypournelle.com/reports/jerryp/iron.html
Remember when you said that everyone who uses Pleroma was benefiting from human trafficking, and you said that about Lain specifically as well? Do you own a mirror, retard?
Man I sure wish there was better non-proprietary CSAM prevention software...
@realcaseyrollins How could you train such an AI without violating 2637864093 laws?
@realcaseyrollins 1. Train it on nudity
2. Train it on children

Perform each training separately from the other. If the image scores high in both, flag it.
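A minimal sketch of that two-model flagging logic, assuming two hypothetical, separately trained classifiers that each return a probability in [0, 1]; the names `nudity_model` and `minor_model` and the thresholds are illustrative, not a real API:

```python
# Hypothetical sketch of the two-classifier idea above.
# `nudity_model` and `minor_model` stand in for separately trained
# classifiers; neither name refers to a real library.

NUDITY_THRESHOLD = 0.9
MINOR_THRESHOLD = 0.9

def flag_image(image_bytes: bytes, nudity_model, minor_model) -> bool:
    """Flag an image only when BOTH independent models score high."""
    p_nudity = nudity_model.predict(image_bytes)  # assumed to return [0, 1]
    p_minor = minor_model.predict(image_bytes)    # assumed to return [0, 1]
    return p_nudity >= NUDITY_THRESHOLD and p_minor >= MINOR_THRESHOLD
```

The point of keeping the two training sets disjoint is that neither model would ever be trained on abuse material itself; only the combined score at inference time does the flagging.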
@realcaseyrollins This is by design. If you're not in their special club, they can shut you down at a moment's notice by uploading CP to your server and claiming that you are "hosting" it. Even if your moderators delete it in a minute, that's still enough for an article in Vice (ironic name) or Huffpo about how your service is a "haven for CSAM".
this. It's not about protecting kids, it's about controlling the rest of us.
This infuriates me to no end. Everyone should be able to integrate CSAM prevention tech into their website if they want to, but like you said, it's not about protecting the kids; it's about having a method to shut you down and silence you if you aren't in the club
This does not scale to the whole Fediverse. Or barely even to a single node.
@realcaseyrollins How to start your own Fediverse server 101!

1. Get a VPS at a hosting provider like DigitalOcean. It will only cost you about $10 per month!

2. Pay £1,000+ to the IWF

3. ???
@realcaseyrollins > In the case of the hash list, while it is not possible to reverse engineer a hash, it is possible for bad actors to develop a crawler or other products that could use the hash set to find criminal images which they would otherwise not have knowledge or sight of.

Are they retarded? Hashing the data requires downloading the data, so they would have already obtained a copy of the CSAM. The only "utility" would be locating the CSAM in the data that they have previously scraped. And at that point they already have it...
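To make that concrete: a hash check can only run over bytes the crawler already holds. A minimal sketch, assuming a plain SHA-256 list (PhotoDNA is a proprietary perceptual hash, so this is illustrative only):

```python
import hashlib

# Illustrative hash set; a real list would use perceptual hashes, not SHA-256.
known_hashes = {"<hex digest from the hypothetical list>"}

def matches_list(image_bytes: bytes) -> bool:
    # The digest cannot be computed without first possessing the bytes,
    # so any "crawler" has already downloaded the image before it can
    # learn whether the list matches it.
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes
```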
@realcaseyrollins 💯

If it exposed all the CSAM on the internet, somebody would do something about it, and that's actually a bigger threat to them than CSAM itself.
question: where did they get the source images to make the hashes, or the hashes themselves?
@realcaseyrollins They work directly with law enforcement.
@alex so... there's public money and time in this project...
@alex ok... picture this:

The OpenCSAM Consortium.
I was thinking something along these lines as well. A consortium of fedi administrators would pay the license, go through the painful regulatory shit, and then build an AP-compatible tool that instances can use to cross-check their media upload hashes.
could also work like a distributed query system where you get feedback from different hash tables hosted by different people (e.g. no single central authority, plus the ability to add objectionable material and tag the objection)

There's no reason not to have several proposals...

... but paying sources (LE records/assembly time) seems reasonable: but fuck giving billware money for a necessary public service built on publicly funded data.
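A sketch of that distributed-query idea, with invented endpoints and an invented response shape (a JSON object with `match` and `objection` fields); none of this is an existing protocol:

```python
import hashlib
import json
import urllib.request

# Invented hash-table hosts; the URLs and the {"match": ..., "objection": ...}
# response format are assumptions for illustration, not a real service.
HASH_HOSTS = [
    "https://hashes.example-a.org/query",
    "https://hashes.example-b.net/query",
]

def query_all_hosts(image_bytes: bytes) -> list[dict]:
    """Ask every configured host about one image; collect tagged objections."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    answers = []
    for host in HASH_HOSTS:
        try:
            with urllib.request.urlopen(f"{host}?hash={digest}", timeout=5) as resp:
                answers.append(json.loads(resp.read()))
        except OSError:
            continue  # one host being down should not block moderation
    return answers
```

Each admin could then apply local policy per objection tag, which is what removes the need for a single central authority.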
@realcaseyrollins This IWF guy is saying that the hashes themselves are illegal to possess. If true, that would be very stupid and should be challenged in court, but it would be an obvious problem for open sourcing the dataset (or for any other entity open sourcing a dataset, for that matter).
It is apparently considered even MORE identifying than fully "anonymous data".

I think that is very wrong. Unless their database is not a hash database at all, but a facial recognition database (maybe it is), the hash itself is a "digital fingerprint" of _that specific image_, and not of the person involved.
@realcaseyrollins They think you're going to go around the entire internet computing hashes of every image in existence until you find matches. They're basically acknowledging that the CSAM exists online, and trying to cover it up.

But why does it exist? Is their organization inefficient at removing it?

Because if it were to be exposed in the very way they fear, someone would clean that shit up almost immediately. What they fear is someone having to do something to clean up child porn.
@realcaseyrollins On the contrary, imagine they open source the hash database. Every website in the world, for free, can prevent CSAM by adding a simple library to their upload form. Then CSAM essentially ceases to exist, because no website operator wants that liability, and no user can upload it!
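A sketch of how small that upload-form hook could be, assuming an open hash list shipped as a text file (the file name is hypothetical, and a real list would use a perceptual hash rather than SHA-256):

```python
import hashlib

# Hypothetical open hash list, one hex digest per line.
with open("open_csam_hashes.txt") as f:
    BLOCKLIST = {line.strip() for line in f if line.strip()}

def accept_upload(file_bytes: bytes) -> bool:
    """Reject a matching file before it is ever written to storage."""
    return hashlib.sha256(file_bytes).hexdigest() not in BLOCKLIST
```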
I actually kinda wonder why nobody has made a #FOSS alternative. I mean, anybody can make a hash...
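For what it's worth, a toy perceptual hash really is a few lines. A minimal "average hash" sketch using Pillow (it survives resizing and re-encoding in a way a cryptographic hash does not, though it's nowhere near PhotoDNA's robustness):

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Classic 'aHash': grayscale, shrink to size x size, threshold at the mean.

    Visually similar images produce hashes with a small Hamming distance.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")
```

The hard part was never the hashing; it's that the authoritative list of hashes is locked up.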
i can deal with that, once i'm in the productivity zone
I wouldn't, that'd have to be some heinous stuff
if it was just for that purpose, i'd do it, if i can ensure i don't get in legal trouble for it
@alex It's one area where the government is actually in a position to be uniquely useful. They can deal with validating and publishing the hash list due to the legal concerns, all everyone else has to do is just check against it.

Unfortunately it looks like a case of misaligned incentives though. The FBI runs an effective system for preventing CSAM and avoids 100,000 incidents? Their budget gets slashed because CSAM has become less of a problem. Arrest 100 people for downloading CSAM because it got uploaded to various sites? Promotions and awards all around.
@realcaseyrollins Here's the full correspondence so far as a PDF.
@realcaseyrollins If you can craft collisions, that's even more proof the information is not identifiable.