Yesterday, Mastodon was abuzz regarding a strange new scraper that seemed to be pulling people’s profiles and content streams into a platform designed around monetization. Dubbed Content Nation, the site’s combination of strange design, stock images, and focus on getting paid for posts raised more than a few eyebrows. Indeed, the site visually resembles something akin to a domain parking page, with an eye-watering visual layout and strange mix of posts that don’t seem to fit anywhere.
- Holy stock art, Batman!
- I can’t make heads or tails of this.
Some long-standing admins poked and prodded at it before declaring that Content Nation was, indeed, an effort to scrape Fediverse content for profit. It surfaced Unlisted posts in search results, seemingly ignored blocks, and deleted material appeared to return to the site almost instantly. All a person needed to do to verify this was type in their own user handle and watch their posts and profile show up seemingly out of nowhere.
A lot of people were angry, and readily pulled out the torches and pitchforks. The sad truth, though, was that this wasn’t a malicious scraper trying to crawl the network and make money off of people’s posts. It was some guy’s hobby project to build a service similar to micro.blog.
What is Content Nation?
Content Nation is, essentially, a feed reader for a small publishing community that just happened to be experimenting with ActivityPub. It's a project developed by Sascha Nitsch, a backend developer and management consultant from Germany. Sascha is a relative outsider to the Fediverse who heard about the network, loved the idea behind it, and tried to integrate his site into it.
The site is, understandably, somewhat jarring in its appearance, because Sascha is primarily a backend developer, not a frontend designer. He was more interested in building out a robust list of features prior to doing any visual design work, because the platform was still taking shape. As a one-man operation, this kind of approach made the most sense to him.
“The site was and is free,” Sascha wrote, “no ads, no cookies nor tracking. I did not make any money with the federation; it’s a service for users on the platform. And it was never intended to be to make money with those content.”
How did the Fediverse React?
Several people came forward to point out to Sascha that his platform interoperates very, very poorly with Mastodon, and that he did not do sufficient research prior to launching his service. Until recently, Content Nation didn't set a proper User-Agent on its requests, which made it easy to mistake for a scraper.
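For readers unfamiliar with the problem: Fediverse servers generally identify their software in the User-Agent header of every fetch, so admins reviewing their logs can tell a known platform apart from an anonymous crawler. Below is a minimal sketch of what that looks like; the service name, contact URL, and function are placeholders, not Content Nation's actual code, and it omits the HTTP-signed fetches that many Mastodon servers additionally require.

```python
# Minimal sketch: fetching a remote ActivityPub object while identifying the
# software in the User-Agent header. The names and URLs here are placeholders,
# not Content Nation's real code. (Many servers also require signed fetches,
# which this example leaves out for brevity.)
import requests

USER_AGENT = "ExamplePub/0.1 (+https://social.example/about)"  # hypothetical identifier

def fetch_object(url: str) -> dict:
    """Fetch a remote ActivityPub object, announcing who is asking."""
    response = requests.get(
        url,
        headers={
            "Accept": "application/activity+json",
            "User-Agent": USER_AGENT,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```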
Compounding things further, people didn't realize that the site implemented Webfinger in its search function, allowing anyone to load remote content by putting an address into a search field. People would go to Content Nation, search for themselves, and inadvertently kick off a fetch request, leading them to believe they had just been scraped. In reality, this is how 99% of Fediverse servers operate by default.
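To illustrate what actually happens under the hood: typing a handle like @user@example.social into that search box triggers a standard Webfinger lookup, and the resulting actor URL is then fetched and cached locally. Here is a rough sketch of that flow; the handle, domain, and function name are illustrative assumptions only.

```python
# Rough sketch of the "search for a handle" flow described above: a Webfinger
# lookup resolves @user@domain to an actor URL, which the server then fetches.
# The handle, domain, and function name here are illustrative only.
import requests

def resolve_handle(handle: str) -> str | None:
    """Resolve an @user@domain handle to its ActivityPub actor URL via Webfinger."""
    user, _, domain = handle.lstrip("@").partition("@")
    response = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        headers={"Accept": "application/jrd+json"},
        timeout=10,
    )
    response.raise_for_status()
    for link in response.json().get("links", []):
        if link.get("rel") == "self" and link.get("type") == "application/activity+json":
            return link.get("href")  # this is the profile the server fetches next
    return None
```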
When users sent GDPR takedown requests, Sascha would comply, but the system had no mechanism in place to block anything. Those same users were distraught to search for themselves later, only to find their own data on the site all over again.
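A hypothetical sketch of the missing piece follows: a persistent blocklist consulted before any remote fetch, so that an account removed under a GDPR request can't simply be pulled back in by the next search. None of this reflects Content Nation's actual code; the entries and function name are placeholders.

```python
# Hypothetical sketch of the safeguard Content Nation lacked: remember who
# asked to be removed, and refuse to fetch or cache them again. In practice
# the lists would live in a database rather than in in-memory sets.
from urllib.parse import urlparse

BLOCKED_ACTORS: set[str] = {"https://example.social/users/alice"}  # placeholder entries
BLOCKED_DOMAINS: set[str] = {"blocked.example"}

def may_fetch(actor_url: str) -> bool:
    """Return False for actors or whole domains that asked to be removed."""
    domain = urlparse(actor_url).hostname or ""
    return actor_url not in BLOCKED_ACTORS and domain not in BLOCKED_DOMAINS
```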
The High Barrier of Entry for Fediverse Development
The shortcomings described above paint a picture: Sascha was building a free ActivityPub library for his project. While he managed to get the basic concepts down, there were still a lot of missing pieces that are essential for participating in the modern Fediverse. Unfortunately, a lot of those resources simply aren't readily available.
Here’s the thing: if you were to take the ActivityPub specification from the W3C, and implement it as specified, you’d end up with something that wouldn’t correctly talk to any service in use today. Mastodon, and platforms designed to talk to it, have a dozen or so behaviors that are not actually in the spec at all: Webfinger, SharedInbox, Privacy Scopes, and Opt-Out for Search are just a few of them.
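To give a flavor of what those unwritten conventions look like in practice, here is a hedged illustration of an actor document carrying a Mastodon-style sharedInbox endpoint and a discoverability flag, along with the addressing trick that encodes privacy scopes. All field values are placeholders, and exact semantics vary between implementations.

```python
# Illustrative only: conventions Mastodon-compatible servers expect to find,
# none of which come from the core ActivityPub spec. Values are placeholders.
actor = {
    "@context": [
        "https://www.w3.org/ns/activitystreams",
        "https://w3id.org/security/v1",
    ],
    "id": "https://example.social/users/alice",
    "type": "Person",
    "preferredUsername": "alice",  # matched against the Webfinger acct: resource
    "inbox": "https://example.social/users/alice/inbox",
    "endpoints": {
        # Mastodon delivers to one shared inbox per server rather than per-user inboxes
        "sharedInbox": "https://example.social/inbox",
    },
    "discoverable": False,  # opt out of directories and discovery features
}

# Privacy scopes aren't a dedicated field at all; they're encoded in addressing.
# An "Unlisted" post, for example, moves the Public collection from "to" to "cc".
unlisted_addressing = {
    "to": ["https://example.social/users/alice/followers"],
    "cc": ["https://www.w3.org/ns/activitystreams#Public"],
}
```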
Many of these things are almost completely undocumented, and can only be learned through lengthy conversations with the people who have already built them. Even Mastodon's own specs say very little. Despite all of this, the majority of people dismissed Content Nation as simply a malicious attempt to slurp up their public and private content for profit. Even when Sascha tried to defend himself, he was ridiculed and mocked.
From Bad to Worse
Aside from simply blocking the domain and moving on, community members decided to have a little bit of extra fun, attempting to "make the crawler crash", sending angry emails to the service operator, and more. After some study of how the site worked, one person had the malicious idea to send a remote post containing child pornography to the site, before getting someone else to report Content Nation for Child Sexual Abuse Material.
To be clear: someone searched out known illegal material, loaded that remote content onto Content Nation locally, and then raised a red flag for someone else to file a report. Given that the server's jurisdiction is Germany, this could have been catastrophic: Germany's laws regarding CSAM stipulate a minimum one-year prison term for possession of this kind of material. Just look at the recent case of a teacher who found out that a pornographic video of one of her students was circulating. When she tried to turn in the evidence to the police, she was arrested.
It’s a case that causes people to shake their heads: a teacher wanted to help a student whose intimate video was circulating at school, and now has to answer in court for, among other things, distributing child pornography. Following a complaint from the public prosecutor’s office, the Koblenz regional court overturned the decision of the Montabaur district court not to open main proceedings in this case. “The regional court, like the public prosecutor, considers the behavior of the accused to be criminal in principle,” said senior public prosecutor Thomas Büttinghaus. The educator is currently facing at least a year in prison – and with it the loss of her civil servant status.
Sascha’s life could have been turned upside down for absolutely nothing. Say what you will about how his website looked, or how his platform functioned: none of these things warranted such a disgusting level of abuse. Somebody basically tried to send a fledgling platform developer to prison, because they didn’t like what he was doing. A series of assumptions and misunderstandings escalated to this point.
Why is this Important?
Over the years, Mastodon’s user culture has become incredibly insular and hostile towards outsiders. Despite repeated claims of “People are just nicer here!” and “Everyone is just so welcoming!”, often those preaching about privacy and consent are the first to harass anyone doing something they don’t like. Reactions have extended to doxxing, death threats, DDoS attacks, and apparently, distribution of CSAM. Just the other week, Mastodon users were harassing a guy who built a protocol bridge that hadn’t even been enabled yet.
Neither of these things is a first occurrence, either. People in the past have tried to build tooling for the Fediverse, from search engines to disposable account services for testing to indexes of verified accounts. People like Wil Wheaton were harassed off the network for not knowing the nuances of who was on a blocklist they shared. Some Lemmy instances have been flooded with CSAM as part of a community retaliation effort from other instances.
Mastodon’s user community have also long looked down their noses at other platforms such as Pleroma, due to a combination of platform rivalry, cultural clashes, personal squabbles, and an “us vs them” mentality. It wasn’t so long ago that simply using Pleroma was considered a valid reason for blocking someone on sight, because good people only used Mastodon.
Source: FediDB.org
Mastodon still makes up the majority of the Fediverse at this point, and acts as a de facto standard for ActivityPub. Many parts of the Mastodon community still threaten to block, doxx, or harass people simply because they expressed a thought or opinion that stands in contrast to what the hive mind demands.
Even Damon has, at one point, received death threats from total strangers over his perspective on FediPact and Threads, simply because other people didn't agree with it. He's told me on several occasions that the Fediverse doesn't feel like it was made for people like him, and that a good portion of that is due to Mastodon's user culture.
Whatever this thing is, it's not sustainable. A big aspect of Mastodon's norms centers on a type of Puritanical culture that, half the time, isn't even consistent with itself. We can't advocate for a space and say that it's so much better than everywhere else when so many people are subjected to this.
The Aftermath
A report was filed with Content Nation's host, Hetzner, over the detected presence of CSAM. However, Sascha's platform was only set up to cache remote content for an hour before purging it. The best conclusion we can draw, at the moment, is that someone deliberately set Content Nation up.
“I’m not sure it even was CSAM,” Sascha writes in a private chat, “I never saw the pictures, as they had already been deleted. The data was already removed from the cache, and the original server was down, so it wasn’t refreshed [on Content Nation].”
“My flat could have been raided, and I would not have an electronic device left to write this,” he added.
As of this writing, Content Nation has turned off all Fediverse integrations, and after this experience, Sascha wants nothing more to do with the network. He has effectively been bullied off of it.
How can we avoid this happening again?
Throughout researching this article and situation, I've come to think there are several things that really, really need to change for the better. The modern Fediverse operates on a long list of internal knowledge that isn't really written down anywhere. Neither the ActivityPub spec nor Mastodon's documentation explains how to implement these special pieces so that people writing new servers can be good actors.
As it stands today, no singular piece of Fediverse software includes instructions to load a “worst of the worst” blocklist when setting up an instance, or to put a Webfinger search form behind a login page. What seems like common sense to some people is literally a new concept to others.
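As a concrete example of that second point, here is a hypothetical sketch of putting remote-handle search behind a login, using Flask purely for illustration; the route, helper function, and session handling are all assumptions rather than a recommendation of any particular framework.

```python
# Hypothetical sketch of gating remote lookups behind a login, so a public
# search form can't be used to pull arbitrary remote profiles onto the site.
# Flask, the route, and the helper function are illustrative assumptions.
from flask import Flask, abort, jsonify, request, session

app = Flask(__name__)
app.secret_key = "change-me"  # placeholder

@app.route("/search")
def search():
    query = request.args.get("q", "")
    if "@" in query and "user_id" not in session:
        # Anonymous visitor trying a remote handle lookup: refuse politely.
        abort(403, description="Log in to search for remote accounts.")
    return jsonify(results=local_search_only(query))  # local-only placeholder search

def local_search_only(query: str) -> list[str]:
    """Placeholder that searches only locally stored, public content."""
    return []
```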
Culturally, we need to accept that most people coming into the community for the first time are operating with a lack of prior knowledge. We can’t simply cross our arms and say “You should have known better”, and socially punish people, when in fact there was no way for them to learn about it.
https://wedistribute.org/2024/03/contentnation-mastodons-toxicity/
#CSAM #Harassment #TrustSafety