Items tagged with: TrustSafety
Mastodon Engineering has announced a new network-wide effort to create a framework for service providers offering a variety of capabilities.
https://wedistribute.org/2024/09/fediverse-discovery-providers/
IFTAS, the Trust & Safety organization dedicated to the Fediverse, published a guide on the EU’s Digital Services Act earlier this week. The guidance is primarily geared towards instance admins and moderation teams in the Fediverse, and spells out policy obligations for compliance.
“If your server has member accounts in the EU, or is publicly viewable in the EU, your service is most likely impacted by this regulation, even if you are not based or hosted in the EU.” — IFTAS Blog
Although this introduces new considerations for people who run Fediverse instances, there are a number of positive outcomes here:
- Most Fediverse servers are relatively small communities, and likely fall under the designation of “Small and Micro Enterprises”: if your server doesn’t employ 50 or more people and doesn’t have an annual turnover of over 10 million Euros, you’re probably in this category, and your obligations are far fewer than those of Very Large Online Platforms.
- Services are not considered liable for content unless they have been notified of its presence; liability only begins if the content isn’t taken down after notification.
- Platforms are not penalized for proactively searching for and removing illegal content; doing voluntary moderation doesn’t create liability on its own.
Big Takeaways
Most of what’s listed in the EU DSA’s requirements comes across as simple, common-sense, already-adopted practice. The guide published by IFTAS suggests best practices, and even includes templates covering the bare-minimum requirements for admins to use.
The biggest changes admins need to be aware of mostly relate to accountability: servers need to assign an EU representative for compliance, report malicious content to local authorities, and provide a notice mechanism (such as email) to inform suspended or banned users of the action taken, as well as the reason for it.
On the matter of being required to assign an EU-based representative for compliance, IFTAS states the following:
If your service is accessible to users in the EU but you don’t have an establishment there, you’re required to designate either a legal or natural person as your legal representative within one of the Member States where you offer services. Your legal representative will act on your behalf for interactions with the EU’s Member States’ authorities, regarding compliance with and enforcement of the DSA. You must notify the Digital Services Coordinator in the Member State of your representative’s name, postal address, email address, and telephone number. Ensure this information is public, easily accessible, accurate, and regularly updated. Most legal representative services do not offer an affordable option for donation-driven services.
IFTAS is researching ways to offer this service, but in the meantime, we strongly recommend you ensure you have added a designated contact for authorities to reach you.
This will probably be the biggest hurdle for instance operators in the short term, as many instances are roughly the size of a community group or forum in terms of users. Appointing a legal representative is a new consideration for a lot of admins, and not necessarily one that anybody has given thought to in the past fifteen years.
Regardless, IFTAS has produced a valuable resource with plenty of insight, advice, and examples for server admins. Hopefully it will benefit the Fediverse as many admins attempt to navigate compliance for the very first time.
https://wedistribute.org/2024/04/iftas-dsa-guide/
#Moderation #Regulation #TrustSafety
IFTAS is happy to announce the public availability of our DSA Guide for Decentralized Services – a practical guide for small and micro services that are subject to the EU’s Digital Services Act.
If your server has member accounts in the EU, or is publicly viewable in the EU, your service is most likely impacted by this regulation, even if you are not based or hosted in the EU.
Developed in collaboration with the great people at Tremau, our DSA Guide is designed to help independent social media service providers navigate these complex regulations and achieve compliance with these new rules without compromising the unique qualities of federated, open social networks.
As part of our Needs Assessment activities, we’ve heard a repeated need for help understanding the complex regulatory landscape that decentralized services need to consider, and this DSA Guide is the first of many in our plan to provide clear, actionable guidance to a range of regulations for the community.
As of February 2024, all online services and digital platforms that offer services in the European Union are required to be fully compliant with the DSA.
However, various portions of the DSA are not applicable to “small and micro” services, and this guide will show you clearly which parts apply and which do not.
For administrators of platforms like Mastodon, PeerTube, and Pixelfed, the DSA Guide can help demystify the requirements and offer practical advice on achieving compliance for the over 27,000 independent operators of these and other decentralized social media services who otherwise may not be able to obtain the guidance and advice that larger operations can afford to invest in.
Download the DSA Guide for Decentralized Fediverse Services.
To join the discussion, visit our community chat service at https://matrix.to/#/#space:matrix.iftas.org or stay tuned to join our community portal in the coming weeks!
https://about.iftas.org/2024/04/09/dsa-guide-for-the-fediverse/
#ActivityPub #BetterSocialMedia #DSA #Fediverse
The extraterritorial implications of the Digital Services Act — Laureline Lemoine & Mathias Vermeulen (AWO), DSA Observatory
Yesterday, Mastodon was abuzz over a strange new scraper that seemed to be pulling people’s profiles and content streams into a platform designed around monetization. The site, dubbed Content Nation, raised more than a few eyebrows with its combination of strange design, stock images, and focus on getting paid for posts. Indeed, it visually resembles something akin to a domain parking page, with an eye-watering layout and a strange mix of posts that don’t seem to fit anywhere.
[Screenshots of Content Nation, captioned “Holy stock art, Batman!” and “I can’t make heads or tails of this.”]
Some long-standing admins poked and prodded at it before declaring that Content Nation was, indeed, an effort to scrape Fediverse content for profit. It shared Unlisted posts in search results, seemingly ignored blocks, and deleted material seemed to reappear on the website nearly instantly. All a person needed to do to verify this was enter their own user handle and watch their posts and profile get pulled in out of nowhere.
A lot of people were angry, and readily pulled out the torches and pitchforks. The sad truth, though, was that this wasn’t a malicious scraper trying to crawl the network and make money off of people’s posts. It was some guy’s hobby project to build a service similar to micro.blog.
What is Content Nation?
Content Nation is, essentially, a feed reader for a small publishing community that just happened to be experimenting with ActivityPub. It’s a project developed by Sascha Nitsch, a backend developer and management consultant from Germany. Sascha is a relative outsider to the Fediverse who heard about the network, loved the idea behind it, and tried to integrate his site into it.
The site is, understandably, somewhat jarring in its appearance, because Sascha is primarily a backend developer, not a frontend designer. He was more interested in building out a robust list of features prior to doing any visual design work, because the platform was still taking shape. As a one-man operation, this kind of approach made the most sense to him.
“The site was and is free,” Sascha wrote, “no ads, no cookies nor tracking. I did not make any money with the federation; it’s a service for users on the platform. And it was never intended to be to make money with those content.”
How did the Fediverse React?
Several people came forward to point out to Sascha that his platform interoperates very, very poorly with Mastodon, and that he had not done sufficient research prior to launching his service. Until recently, Content Nation didn’t set a proper User-Agent header on its requests, so it was easy to mistake for a scraper.
Compounding things further, people didn’t realize that the site implemented Webfinger in its search function, allowing remote content to be loaded by entering an address into a search field. People would go to Content Nation, search for themselves, and inadvertently kick off a fetch request, leading them to believe they had just been scraped. In reality, this is how 99% of Fediverse servers operate by default.
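For readers unfamiliar with the mechanism, here is a minimal sketch (in Python, using the `requests` library) of what a handle search does under the hood. The flow — a Webfinger lookup per RFC 7033 followed by an actor fetch — is standard across the network; the User-Agent string, function names, and error handling here are illustrative assumptions, not Content Nation’s actual code.

```python
import requests

UA = {"User-Agent": "example-reader/0.1 (+https://reader.example)"}  # hypothetical identifier

def lookup_handle(handle: str) -> dict:
    """Resolve an @user@domain handle to its ActivityPub actor document."""
    user, domain = handle.lstrip("@").split("@", 1)

    # Step 1: Webfinger (RFC 7033) maps the handle to an actor URL.
    wf = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        headers=UA,
        timeout=10,
    )
    wf.raise_for_status()
    actor_url = next(
        link["href"]
        for link in wf.json().get("links", [])
        if link.get("rel") == "self" and link.get("type") == "application/activity+json"
    )

    # Step 2: fetch the actor document itself.
    actor = requests.get(
        actor_url,
        headers={"Accept": "application/activity+json", **UA},
        timeout=10,
    )
    actor.raise_for_status()
    return actor.json()
```

Typing your own handle into Content Nation’s search box effectively ran this kind of lookup against your own server, which is why profiles seemed to have been “scraped” the instant someone searched for themselves.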
When users sent GDPR takedowns, Sascha would comply, but the system had nothing in place to block anything from coming back. Those same users were distraught to search for themselves again, only to find their own data all over again.
The High Barrier of Entry for Fediverse Development
The shortcomings described above paint a picture: Sascha was building a free ActivityPub library for his project. While he managed to get the basic concepts down, there were still a lot of missing pieces that are essential for participating in the modern Fediverse. Unfortunately, the resources to learn about those pieces are not readily available to anyone.
Here’s the thing: if you were to take the ActivityPub specification from the W3C, and implement it as specified, you’d end up with something that wouldn’t correctly talk to any service in use today. Mastodon, and platforms designed to talk to it, have a dozen or so behaviors that are not actually in the spec at all: Webfinger, SharedInbox, Privacy Scopes, and Opt-Out for Search are just a few of them.
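As one illustration, a well-behaved fetcher is expected to honor per-actor opt-out flags that never appear in the W3C spec. The sketch below assumes the `discoverable` and `indexable` properties Mastodon attaches to actor documents; treating a missing flag as opt-out is a conservative convention many implementers settle on, not something any spec mandates.

```python
def may_index(actor: dict) -> bool:
    """Decide whether an actor's public posts may be indexed for search.

    Neither flag is part of the W3C ActivityPub spec; both are Mastodon
    conventions that other software has adopted. A missing flag is treated
    as opt-out here -- the conservative reading.
    """
    discoverable = actor.get("discoverable", False)  # profile directory / suggestions
    indexable = actor.get("indexable", False)        # opt-in to full-text search of posts
    return bool(discoverable) and bool(indexable)
```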
Many of these conventions are almost completely undocumented, and can only be learned through lengthy conversations with people who have already built them. Even Mastodon’s own documentation says very little. Still, the majority of people dismissed Content Nation as simply a malicious attempt to slurp up their public and private content for profit. Even when Sascha tried to defend himself, he was ridiculed and mocked.
From Bad to Worse
Aside from simply blocking the domain and moving on, community members decided to have a little extra fun, attempting to “make the crawler crash”, sending angry emails to the service operator, and more. After some study of how the site worked, one person had the malicious idea to send a remote post containing child pornography to the site, before getting someone else to report Content Nation for Child Sexual Abuse Material.
To be clear: someone searched for known illegal material, caused that remote content to be loaded onto Content Nation locally, and then waved a red flag for someone else to file a report. Given that the server’s jurisdiction is Germany, this could have been catastrophic: Germany’s laws regarding CSAM stipulate a minimum one-year prison term for possession of this kind of material. Just look at the recent case of a teacher who found out that a pornographic video of one of her students was circulating. When she tried to turn in the evidence to the police, she was arrested.
It’s a case that causes people to shake their heads: A teacher wanted to help a student whose intimate video was circulating at school and now has to answer in court for, among other things, distributing child pornography. Following a complaint from the public prosecutor’s office, the Koblenz regional court overturned the decision of the Montabaur district court not to open main proceedings in this case. “The regional court, like the public prosecutor, considers the behavior of the accused to be criminal in principle,” said senior public prosecutor Thomas Büttinghaus. The educator is currently facing at least a year in prison – and with it the loss of her civil servant status.
Sascha’s life could have been turned upside down for absolutely nothing. Say what you will about how his website looked, or how his platform functioned: none of these things warranted such a disgusting level of abuse. Somebody basically tried to send a fledgling platform developer to prison, because they didn’t like what he was doing. A series of assumptions and misunderstandings escalated to this point.
Why is this Important?
Over the years, Mastodon’s user culture has become incredibly insular and hostile towards outsiders. Despite repeated claims of “People are just nicer here!” and “Everyone is just so welcoming!”, often those preaching about privacy and consent are the first to harass anyone doing something they don’t like. Reactions have extended to doxxing, death threats, DDoS attacks, and apparently, distribution of CSAM. Just the other week, Mastodon users were harassing a guy who built a protocol bridge that hadn’t even been enabled yet.
None of this is a first occurrence, either. People in the past have tried to build tooling for the Fediverse, from search engines to disposable account services for testing to indexes of verified accounts. People like Wil Wheaton were harassed off the network over blocklists they had shared without fully knowing who was on them. Some Lemmy instances have been flooded with CSAM as part of a community retaliation effort from other instances.
Mastodon’s user community has also long looked down its nose at other platforms such as Pleroma, due to a combination of platform rivalry, cultural clashes, personal squabbles, and an “us vs. them” mentality. It wasn’t so long ago that simply using Pleroma was considered a valid reason to block someone on sight, because good people only used Mastodon.
[Chart of Fediverse platform usage. Source: FediDB.org]
Mastodon still makes up the majority of the Fediverse at this point, and acts as a de facto standard for ActivityPub. Many parts of the Mastodon community still threaten to block, doxx, or harass people simply for expressing a thought or opinion that stands in contrast to what the hive mind demands.
Even Damon has, at one point, received death threats from total strangers for holding a perspective on FediPact and Threads that other people didn’t agree with. He’s told me on several occasions that the Fediverse doesn’t feel like it was made for people like him, and a good portion of that is due to Mastodon’s user culture.
Whatever this thing is, it’s not sustainable. A big aspect of Mastodon’s norms centers on a kind of puritanical culture that, half the time, isn’t even consistent with itself. We can’t advocate for a space and say it’s so much better than everywhere else when so many people are subjected to this.
The Aftermath
A report was filed with Content Nation’s host, Hetzner, after CSAM was detected on the site. However, Sascha’s platform was only set up to cache remote content for an hour before purging it. The best conclusion we can draw at the moment is that someone deliberately set Content Nation up to be reported.
“I’m not sure it even was CSAM,” Sascha writes in a private chat, “I never saw the pictures, as they had already been deleted. The data was already removed from the cache, and the original server was down, so it wasn’t refreshed [on Content Nation].”
“My flat could have been raided, and I would not have an electronic device left to write this,” he added.
As of this writing, Content Nation has turned off all Fediverse integrations, and after this experience Sascha wants nothing more to do with the network. He has effectively been bullied off of it.
How can we avoid this happening again?
In researching this article and the situation around it, I’ve come to think there are several things that really, really need to change for the better. The modern Fediverse runs on a long list of internal knowledge that isn’t really written down anywhere. Neither the ActivityPub spec nor Mastodon explains how to implement these special pieces, so that people writing new servers can be good actors.
As it stands today, no single piece of Fediverse software includes instructions to load a “worst of the worst” blocklist when setting up an instance, or to put a Webfinger search form behind a login page. What seems like common sense to some people is literally a new concept to others.
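For what it’s worth, consuming such a blocklist is only a few lines of code once someone tells you it exists. The sketch below is a minimal, assumed example: the CSV layout (a domain column plus optional severity) and the column names are placeholders, since published blocklists vary in format.

```python
import csv
from urllib.parse import urlparse

def load_blocklist(path: str) -> set[str]:
    """Load a domain denylist from a CSV file.

    The column name ("domain" or "#domain") and the overall layout are
    assumptions for this sketch; adapt the parsing to the list you use.
    """
    blocked: set[str] = set()
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            domain = (row.get("domain") or row.get("#domain") or "").strip().lower()
            if domain:
                blocked.add(domain)
    return blocked

def is_blocked(actor_url: str, blocked: set[str]) -> bool:
    """Return True if the actor's host (or a parent domain) is denylisted."""
    host = (urlparse(actor_url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in blocked)
```

Checking inbound fetches and deliveries against a list like this before storing anything is the kind of default that experienced admins take for granted, and that newcomers have no way of knowing about.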
Culturally, we need to accept that most people coming into the community for the first time are operating with a lack of prior knowledge. We can’t simply cross our arms and say “You should have known better”, and socially punish people, when in fact there was no way for them to learn about it.
https://wedistribute.org/2024/03/contentnation-mastodons-toxicity/
#CSAM #Harassment #TrustSafety
Looking like we have a new Fedi scraper at contentnation.net, if you're interested in not being a part of that
Twitter rival Mastodon isn’t safe from online mobs either
The mass reporting and suspension of actor Wil Wheaton prompts the open-source platform to examine its moderation tools. — Megan Farokhmanesh (The Verge)