For community platforms, the DSA introduces numerous changes and new obligations to better protect users against harmful behaviour and illegal content. In this blog post, we provide an overview of the most important DSA obligations from the perspective of a community platform.
One of the most significant changes is the obligation to establish user-friendly reporting and redress procedures. Platforms must provide mechanisms for users to easily and effectively report illegal content. This requires platforms to invest in intuitive reporting tools and ensure timely processing of reports.
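To make this concrete, here is a minimal sketch in TypeScript of what the backend of such a reporting (notice) mechanism could look like. The `ContentNotice` shape and the `submitNotice` function are illustrative assumptions, not a prescribed DSA schema.

```typescript
// Illustrative shape of an illegal-content notice; field names are assumptions.
interface ContentNotice {
  contentUrl: string;      // exact location of the allegedly illegal content
  reason: string;          // why the notifier considers it illegal
  notifierEmail?: string;  // contact details (not always mandatory)
  submittedAt: Date;
}

// Minimal intake step: validate the notice, store it, confirm receipt.
function submitNotice(notice: ContentNotice, queue: ContentNotice[]): string {
  if (!notice.contentUrl || !notice.reason) {
    throw new Error("A notice needs the content location and a reason.");
  }
  queue.push(notice);
  // A real system would also send a confirmation of receipt to the notifier.
  return `Notice received for ${notice.contentUrl}`;
}
```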
Hosting services must remove illegal content or block access to it expeditiously once they become aware of it. This requires efficient moderation tools and processes that enable the rapid identification and removal of such content.
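As an illustration, the following sketch shows a takedown step that records the decision alongside the action, which the later obligations (user notification, transparency reporting) rely on. The `ModerationDecision` type and `decideOnContent` function are hypothetical.

```typescript
type ModerationAction = "ACCESS_DISABLED" | "REMOVED" | "NO_ACTION";

interface ModerationDecision {
  contentId: string;
  action: ModerationAction;
  ground?: string;      // e.g. the legal provision or policy the content violates
  decidedAt: Date;
}

// Once a moderator confirms illegality, act and record the decision so it can
// later be explained to the affected user and counted in transparency reports.
function decideOnContent(contentId: string, confirmedIllegal: boolean, ground: string): ModerationDecision {
  return {
    contentId,
    action: confirmedIllegal ? "ACCESS_DISABLED" : "NO_ACTION",
    ground: confirmedIllegal ? ground : undefined,
    decidedAt: new Date(),
  };
}
```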
When content is removed or blocked, the affected users must be informed. This notification must be transparent and comprehensible and must give users the opportunity to contest the decision. This effectively rules out shadow banning, a method that many platforms have been keen to use until now.
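A simple way to picture this is a "statement of reasons" object that is sent to the affected user. The sketch below is illustrative; the `StatementOfReasons` fields are assumptions loosely based on what such a notification typically has to cover (the restriction, the grounds, the facts, and the available redress options).

```typescript
interface StatementOfReasons {
  userId: string;
  contentId: string;
  restriction: string;       // e.g. "content removed" or "visibility restricted"
  grounds: string;           // the legal provision or terms-of-service clause relied on
  facts: string;             // what triggered the decision
  redressOptions: string[];  // how the user can contest the decision
}

// The affected user is told openly what happened, on what grounds, and how to
// appeal; silent restrictions are no longer an option.
function notifyAffectedUser(s: StatementOfReasons): string {
  return [
    `Your content ${s.contentId} was restricted: ${s.restriction}.`,
    `Grounds: ${s.grounds}`,
    `Facts considered: ${s.facts}`,
    `You can contest this decision via: ${s.redressOptions.join(", ")}.`,
  ].join("\n");
}
```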
In addition, users must have access to an effective internal complaint-handling system after a decision has been made. This gives users a direct and easy way to lodge complaints against decisions to remove their content, for example for allegedly violating the platform's terms and conditions.
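One possible, deliberately minimal model of such a complaint-handling flow is sketched below; the `Complaint` type and `resolveComplaint` function are hypothetical.

```typescript
type ComplaintStatus = "OPEN" | "DECISION_REVERSED" | "DECISION_UPHELD";

interface Complaint {
  id: string;
  userId: string;
  contestedDecisionId: string;  // the moderation decision being challenged
  argument: string;             // why the user thinks the decision was wrong
  status: ComplaintStatus;
}

// A reviewer re-examines the contested decision; the outcome is recorded and
// reported back to the user.
function resolveComplaint(complaint: Complaint, reverseDecision: boolean): Complaint {
  return {
    ...complaint,
    status: reverseDecision ? "DECISION_REVERSED" : "DECISION_UPHELD",
  };
}
```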
Very large community platforms are now required to disclose their moderation practices. This includes publishing yearly transparency reports detailing the measures taken against illegal content and harmful behaviour, and submitting individual moderation decisions with their reasons to the DSA Transparency Database, where they are publicly accessible in a timely manner. This transparency allows users to see when and why content was removed from a platform.
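As a rough illustration of the reporting side, the sketch below aggregates stored moderation decisions into the kind of counts a yearly transparency report needs. The `DecisionRecord` shape is an assumption, not the format of the DSA Transparency Database.

```typescript
interface DecisionRecord {
  action: string;  // e.g. "removed", "access disabled"
  ground: string;  // e.g. "illegal content" or "terms of service"
}

// Count decisions per action/ground pair, the kind of aggregate figure a
// yearly transparency report has to contain.
function summariseDecisions(records: DecisionRecord[]): Record<string, number> {
  const summary: Record<string, number> = {};
  for (const r of records) {
    const key = `${r.action} / ${r.ground}`;
    summary[key] = (summary[key] ?? 0) + 1;
  }
  return summary;
}
```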
A trusted flagger is a person or organisation recognised as particularly reliable and competent when it comes to reporting illegal content on the internet. Trusted flaggers have proven experience and expertise in their field and can make accurate and well-founded reports. Their reports must therefore be given priority and processed without undue delay, so that illegal content can be removed or blocked more efficiently. This helps to remove harmful content more quickly and make the online community safer.
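In practice, this can be as simple as giving trusted-flagger reports priority in the moderation queue. The sketch below is one possible, assumed implementation of such an ordering.

```typescript
interface Report {
  id: string;
  contentUrl: string;
  fromTrustedFlagger: boolean;
  receivedAt: number;  // epoch milliseconds
}

// Trusted-flagger reports jump the queue; within each group, oldest first.
function orderReportQueue(reports: Report[]): Report[] {
  return [...reports].sort((a, b) => {
    if (a.fromTrustedFlagger !== b.fromTrustedFlagger) {
      return a.fromTrustedFlagger ? -1 : 1;
    }
    return a.receivedAt - b.receivedAt;
  });
}
```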
Very large online platforms (VLOPs) must assess and mitigate risks, including the spread of illegal content and manipulation by fake accounts. Platforms must develop and implement specific measures to minimise these risks. These include algorithms to detect fake accounts, improved security protocols and advanced moderation technologies.
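As a purely illustrative example of such a measure, the sketch below scores an account on a few assumed signals and flags high scores for human review. The signals and thresholds are invented for illustration and are not taken from the DSA or any real detection system.

```typescript
interface AccountSignals {
  accountAgeDays: number;
  postsPerDay: number;
  accountsOnSameIp: number;  // other accounts observed on the same IP address
  emailVerified: boolean;
}

// Crude illustrative risk score: each suspicious signal adds weight. A high
// score flags the account for human review; it does not remove anything.
function fakeAccountRiskScore(s: AccountSignals): number {
  let score = 0;
  if (s.accountAgeDays < 7) score += 2;
  if (s.postsPerDay > 50) score += 3;
  if (s.accountsOnSameIp > 5) score += 2;
  if (!s.emailVerified) score += 1;
  return score;  // e.g. anything scoring 4 or more goes to a reviewer
}
```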
Article 52 of the DSA sets out the penalties for digital service providers that fail to comply. Member states must provide for fines of up to 6% of a provider's annual worldwide turnover. This substantial penalty allows the national supervisory authorities (the Digital Services Coordinators) and the EU Commission to enforce the DSA effectively.
In order to comply with the new obligations of the DSA, community platforms must take a range of technological and organisational measures.
Many provisions of the DSA will dramatically increase your overall moderation workload: justifying every moderation decision, running an internal complaints system, and handling prioritised reports. To keep this effort to a minimum, it is crucial to reduce hate and disinformation from the outset. Trusted Accounts offers the tools you need to block bots and trolls that spread disinformation and hate while retaining real users on your platform, saving you a substantial amount of moderation work.
The Digital Services Act presents community platforms with new challenges, but also offers the opportunity to strengthen user trust and ensure platform integrity through improved security and transparency measures. Ultimately, embracing DSA guidelines can position your platform as a responsible digital entity that prioritises user safety and regulatory compliance in the evolving digital landscape.