From Meme to Mob
Join us for a panel discussion exploring how harmful online narratives and moderation challenges contribute to systemic risks within Europe.
What happens when harmful online narratives are no longer isolated incidents — but persistent systems of amplification that travel across borders, languages, and communities?
Across the European Union, obligations to prevent discrimination and combat hate speech are already clearly established. Under the International Convention on the Elimination of All Forms of Racial Discrimination (CERD), to which all EU Member States are parties, States are required to prohibit and address the dissemination of discriminatory narratives, while the UN Guiding Principles on Business and Human Rights place responsibilities on businesses, including Very Large Online Platforms (VLOPs), to address the human rights impacts linked to their systems. Yet where harmful content continues to circulate, spread, and reappear despite repeated reporting efforts, these obligations increasingly come into question.
Join us for a critical discussion exploring how digital platforms have become transnational spaces where narratives move faster than regulation — crossing borders, reshaping discourse, and generating consequences far beyond their original context.
A central part of the discussion will focus on DAHRD’s findings based on cases reviewed through the Appeals Centre Europe mechanism established under the Digital Services Act framework. Through its monitoring and reporting work, DAHRD documented patterns of persistent under-enforcement involving anti-Muslim hate speech in non-EU languages. The report analysed 49 posts submitted through the Appeals Centre Europe process and found that 51% of the original moderation decisions were later overturned following review. Across 28 documented cases, harmful content remained online for a cumulative total of 4,637 days, with posts staying accessible for an average of 166 days before removal, and in some cases up to 429 days.
The discussion will examine the practical implementation of the Digital Services Act and the extent to which existing enforcement and mitigation mechanisms are equipped to respond to multilingual and cross-border harms in practice. While systemic risk obligations are already established under Articles 34 and 35 of the DSA, the findings discussed during the event raise broader questions about how these obligations are operationalised, particularly in relation to minority-language moderation, contextual assessment, and the role of civil society organisations in identifying and documenting harms that platforms and enforcement mechanisms may otherwise fail to detect.
Particular attention will be given to how moderation systems and complaint mechanisms continue to rely on external expertise to interpret culturally and linguistically specific content, raising concerns about unequal protection, implementation gaps, and the accessibility of DSA safeguards for minority and diaspora communities within Europe.
By bringing together experts from civil society, academia, and policymaking, this event will explore how persistent circulation, amplification, and weak enforcement expose broader challenges in the implementation of the Digital Services Act, and what this means for the future of platform accountability, multilingual moderation, and democratic participation in Europe.
Special contribution by Lynn Boylan, Member of the European Parliament (the Left group), focusing on platform accountability, systemic risks, and democratic resilience within the EU digital space.
The panel will feature:
• Thomas Hughes — CEO of Appeals Centre Europe
• Ritumbra Manuvie — Assistant Professor at University College Groningen and Executive Director at DAHRD
• Lynn Boylan — Member of the European Parliament (the Left group)
📍 Press Club Brussels Europe
📅 3 June 2026
🕓 16:00-18:00
Lineup
Thomas Hughes
Dr. Ritumbra Manuvie
Lynn Boylan
Good to know
Highlights
- 2 hours
- In person
Location
Press Club Brussels Europe
95 Rue Froissart
1040 Bruxelles