This is good … There were SO many illegal CSAM posts in a group that I was in. I had to leave because it was disgusting to even see the words being shown.
Edit: I checked back, and the first notification I got was a Telegram spam bot (luckily not CSAM-oriented this time).
How do you do moderation on an encrypted, decentralised platform?
It is mostly about giving users the tools to do moderation themselves: community managers can effectively apply their own policies, and moderation decisions are easy to publish, so the work can be shared among communities that trust each other's judgment.
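Concretely, the Matrix spec defines moderation policy lists: rooms whose state holds `m.policy.rule.user`, `m.policy.rule.room`, and `m.policy.rule.server` events, each naming an entity (optionally a glob) and a recommendation such as `m.ban`. A community that trusts another's moderation can subscribe to that list and apply its rules automatically. The sketch below shows the idea; the rule data and the `recommendation_for` helper are illustrative, not a real client implementation.

```python
import fnmatch

# Illustrative policy rules, shaped like the m.policy.rule.* state events
# a moderation policy room would contain. The entities here are made up.
policy_rules = [
    {"type": "m.policy.rule.user",
     "content": {"entity": "@spammer:example.org",
                 "recommendation": "m.ban",
                 "reason": "spam"}},
    {"type": "m.policy.rule.server",
     "content": {"entity": "*.badhost.example",
                 "recommendation": "m.ban",
                 "reason": "known abuse source"}},
]

def recommendation_for(user_id, rules):
    """Return the first matching recommendation for a user, if any.

    Entities may be globs, so we match with fnmatch, checking the full
    user ID against user rules and the homeserver part against server rules.
    """
    server = user_id.split(":", 1)[1]
    for rule in rules:
        entity = rule["content"]["entity"]
        if rule["type"] == "m.policy.rule.user" and fnmatch.fnmatch(user_id, entity):
            return rule["content"]["recommendation"]
        if rule["type"] == "m.policy.rule.server" and fnmatch.fnmatch(server, entity):
            return rule["content"]["recommendation"]
    return None
```

A moderation bot (Mjolnir/Draupnir work roughly this way) watches the policy rooms it subscribes to and turns each `m.ban` recommendation into an actual ban in the rooms it protects.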
I need a TLDR bot for these.
Here ya go
Title: Building a Safer Matrix
- Need for Secure Communication: The document emphasizes the growing necessity for secure communication due to increasing security breaches and geopolitical tensions, highlighting the importance of privacy for vulnerable groups.
- Matrix’s Decentralized Nature: Matrix is designed as a decentralized communication protocol, allowing organizations to run their own servers while connecting to a broader network, enhancing user control and privacy.
- Foundation’s Role: The Matrix.org Foundation, established as a non-profit, oversees the protocol’s development and ensures the safety of the public Matrix network, funded by donations from members.
- Addressing Abuse: The Foundation acknowledges the potential for abuse within the platform and has implemented concrete anti-abuse measures, contrasting with many other encrypted messengers that often avoid responsibility.
- Safety Team Operations: The Foundation has a dedicated Safety team that actively investigates and removes harmful content, facing significant mental health challenges due to the nature of the content they handle.
- Safety Tooling: Matrix employs a combination of proactive and reactive strategies to combat online harms, including reporting systems, moderation tools, and engagement with law enforcement and civil society.
- Focus on Child Exploitation: A significant focus of the safety efforts is on combating child sexual exploitation and abuse (CSEA), involving collaborations with various organizations and implementing detection technologies.
- Recent Updates and Improvements: The Foundation has made recent advancements in safety tooling, including account suspension capabilities and measures to prevent the misuse of Matrix as a content distribution network.
- Community Engagement: The document highlights the importance of community-driven initiatives and collaborations with external groups to enhance safety and moderation efforts within the Matrix ecosystem.
- Future Goals: The Foundation aims to improve reporting systems, increase transparency through regular reports, and enhance user engagement in safety practices, while continuing to uphold user privacy and security.