CSO Challenges and Opportunities under the DSA
Civil Society Organisations (CSOs) have long been on the frontlines in the fight against online hate speech. These organisations monitor digital platforms, report incidents of illegal hate, and advocate for stronger policies and enforcement to protect vulnerable communities. Their efforts are essential for holding platform providers accountable and ensuring the digital space remains inclusive for all.
The Digital Services Act (DSA), which became fully applicable in February 2024, marks a significant milestone in European Union legislation. It was introduced to regulate online platforms and address the proliferation of illegal content, including hate speech.
With unprecedented access to social media data under the DSA, CSOs are expected to take on greater responsibilities in ensuring compliance and enforcement. However, many CSOs face significant challenges in adapting to the new regulatory frameworks. Key concerns include a lack of clarity on procedural aspects of enforcement and the need for a more transparent and systematic regulatory approach.
Additionally, a key insight from the first iteration of EOOH revealed that many CSO experts were hesitant to adopt AI-driven technology for hate speech monitoring, making it difficult to process the vast amounts of data from online sources effectively.
To further explore these challenges, EOOH conducted a needs assessment: an in-depth examination of the difficulties CSOs across the European Union face in monitoring and reporting illegal hate speech, particularly within the DSA framework. Using a qualitative research approach, the assessment gathered insights from 30 CSOs across 18 EU member states*. It focused on current practices, challenges, and the technological needs of CSOs in their efforts to tackle hate speech. The findings from this assessment form the basis of the key insights and recommendations presented in this article.
*The country representation was uneven, with a higher concentration of participants from Germany.
Challenges in engaging with new regulations
CSOs had a wide range of experiences in their interactions with Digital Services Coordinators (DSCs), who are pivotal to DSA implementation. While some CSOs have successfully established communication with DSCs, many others remain unaware of their role or are unsure how to engage with them. This inconsistency highlights a significant gap in awareness, which undermines effective engagement with the DSA’s provisions. Greater coordination and outreach are essential to ensure that all CSOs, irrespective of their size or location, can navigate this regulatory landscape effectively.
Trusted flagger: opportunity or challenge?
Among the opportunities introduced by the DSA, one aspect that has attracted the attention of many CSOs is the potential to apply for Trusted Flagger status. This status provides recognised organisations with special privileges, enabling them to report illegal content directly to platforms and expedite its removal. While many CSOs are eager to explore this opportunity, there are notable concerns. The application process is demanding, with strict criteria that some organisations view as overly restrictive. Moreover, the financial and resource commitments required to maintain Trusted Flagger status have led some organisations to hesitate, questioning whether the benefits truly outweigh the costs.
Monitoring gaps and platform priorities
When it comes to monitoring hate speech, most CSOs focus on mainstream social media platforms such as Facebook, Instagram, X (formerly Twitter), TikTok, and YouTube, which, given their large user bases, are frequent targets for harmful content. However, fringe or alternative platforms are often under-monitored due to limited resources, allowing hate speech to spread unchecked.
Reporting confusion and platform inconsistencies
Many CSOs experienced a lack of clarity regarding changes to reporting processes under the DSA, especially since the reforms were introduced in February 2024. Although most organisations continue to use in-app reporting mechanisms, they struggle with complex legal terminology and the increased time and effort needed to comply with new regulations. Additionally, organisations noted a decrease in the removal rates of hate speech content, with platforms showing inconsistent enforcement of takedown and stay-down policies.
Collaboration with law enforcement: a mixed picture
Collaboration between CSOs and Law Enforcement Agencies (LEAs) has also been inconsistent. While some organisations maintain regular, productive partnerships with LEAs—which is crucial in addressing hate speech that escalates into more serious offences—others have limited or no interaction with law enforcement. This inconsistency points to the need for more structured and formalised partnerships to ensure CSOs and LEAs can work together more effectively in tackling online hate.
Technological needs to improve effectiveness
Despite growing awareness of technological solutions, many CSOs still rely heavily on manual processes for monitoring and reporting hate speech. This is not due to a lack of interest—most organisations acknowledge the potential benefits of automating workflows, improving data analysis, and increasing efficiency. In particular, tools like automatic detection systems, enhanced data access, automated reporting, and removal alerts could significantly improve CSOs’ ability to combat online hate speech.
CSOs also emphasised the need for simplified, standardised reporting mechanisms across social media platforms. Currently, navigating the different processes of various platforms slows down reporting workflows and hampers overall efficiency. A more uniform approach would allow CSOs to focus less on administrative tasks and more on their core mission: advocating for policy changes and protecting vulnerable communities from online hate.
To effectively address these issues, stronger collaboration between CSOs, regulatory bodies, technology providers, and social media platforms is essential. By bridging the technological and resource gaps highlighted in this assessment, CSOs will be better positioned to monitor, report, and curb the spread of hate speech online.
The following recommendations outline key steps that can help achieve these goals.
Special reporting access for trusted flaggers
Implement streamlined reporting mechanisms for recognised Trusted Flaggers to enhance the efficiency of hate speech reporting. This could include prioritising Trusted Flagger reports, providing expedited review processes, and ensuring timely feedback on reported content.
Technology awareness and resources
Develop comprehensive programmes to raise awareness among CSOs about available technological solutions for hate speech monitoring. This could include resources, training, and support for adopting these tools, especially for monitoring fringe platforms.
DSA and DSC awareness initiatives
Continue and expand awareness-raising activities such as the "Zooming in on Hate" podcasts, online clinics, and training sessions, ensuring that CSOs are well-informed about the DSA framework and the role of Digital Services Coordinators.
Collaboration initiatives with LEAs
Foster stronger collaboration between CSOs and LEAs through structured programmes and communication channels. This could involve the creation of dedicated task forces, joint training sessions, and clear protocols for sharing information and evidence related to online hate.
Standardisation of reporting across platforms
Advocate for a standardised reporting interface across major social media platforms. This would simplify the reporting process for CSOs, reducing the administrative burden and enabling more efficient hate speech monitoring.
Our needs assessment underscores the significant challenges CSOs face in areas such as reporting mechanisms, resource allocation, and navigating the complexities of the DSA framework, even as awareness of available technological solutions grows.
Overcoming these challenges will require collaboration among CSOs, technology providers, social media platforms, and regulatory bodies. By addressing these critical needs, we can enhance our efforts to combat online hate speech and ensure that CSOs are fully equipped to fulfil their vital role in fostering a safer and more inclusive digital environment.