Safer spaces by design? Federated architectures and alternative socio-technical models for content moderation
Abstract
Users of secure messaging tools, especially in communities attuned to the risks of State-based and other forms of censorship, are becoming increasingly skeptical about delegating their data to centralized platforms endowed with substantial power to filter content and block user profiles. This paper analyses the role that informational architectures and infrastructures play in content moderation processes on federated social media platforms. Alongside privacy by design, the paper asks, is it possible to speak of online "safe(r) spaces by design"? The paper argues that federation can pave the way for novel practices in content moderation governance, merging community organizing, information distribution and alternative techno-social instruments to deal with online harassment, hate speech or disinformation; however, this alternative also presents a number of pitfalls and difficulties that need to be examined to provide a complete picture of what federated models can offer.