From Meme to Mob
Overview

Join us for a panel discussion on how hate speech, algorithms, and weak moderation turn online narratives into systemic risks in Europe.

What happens when harmful online narratives are no longer isolated incidents — but persistent systems of amplification that travel across borders, languages, and communities?

Across the European Union, obligations to prevent discrimination and combat hate speech are already clearly established. Under CERD, to which all EU Member States are parties, States are required to prohibit and address the dissemination of discriminatory narratives, while the UN Guiding Principles on Business and Human Rights place a responsibility on businesses, including Very Large Online Platforms (VLOPs), to address the human rights impacts linked to their systems. Yet where harmful content continues to circulate, spread, and reappear despite repeated reporting efforts, these obligations increasingly come into question.

Join us for a timely discussion exploring how digital platforms have become transnational spaces where narratives move faster than regulation — crossing borders, reshaping discourse, and generating consequences far beyond their original context.

Over the past five years, Foundation Diaspora in Action for Human Rights and Democracy (DAHRD) has documented extensive volumes of online content across major platforms. What initially appeared as isolated incidents gradually revealed something more structural: persistent ecosystems of discrimination. Racial, religious, and identity-based narratives targeting Muslims, migrants, refugees, and other minority communities continue to circulate across pages, groups, and networks, often embedded in memes, humour, coded language, and visual formats that remain difficult to detect and moderate, particularly outside English-language contexts.

These dynamics are further intensified through algorithmic amplification and networked dissemination. Harmful content is repeatedly pushed, recycled, normalised, and, in certain instances, appears to benefit from coordinated amplification patterns that significantly expand its reach and visibility.

The discussion will examine how these patterns engage systemic risks within the meaning of the Digital Services Act, including risks to fundamental rights such as equality and non-discrimination, as well as broader risks to civic discourse, democratic resilience, and social cohesion. Particular attention will be given to the persistence and foreseeability of these harms, and to the obligations of platforms under Articles 34 and 35 of the DSA to assess and mitigate such risks effectively.

These narratives do not remain confined to their place of origin; they circulate within diaspora communities across Member States, interact with local political and social contexts, and increasingly contribute to polarisation, mistrust, and fragmentation. In some cases, online tensions have spilled into offline spaces, illustrating how digital harms can reshape public life and communal relations across Europe.

By bringing together experts from civil society, academia, and digital governance, this event will explore how persistent circulation, amplification, and weak enforcement transform online harms into systemic risks — and what this means for the future of platform accountability and democratic participation in Europe.

Special contribution by Lynn Boylan, Member of the European Parliament (Sinn Féin, Ireland), focusing on platform accountability, systemic risks, and democratic resilience within the EU digital space.

The panel will feature:
• Lynn Boylan — Member of the European Parliament (Sinn Féin, Ireland)
• Dr. Ritumbra Manuvie — Assistant Professor at University College Groningen and Executive Director at DAHRD
• Thomas Hughes — CEO of Appeals Centre Europe

📍 Press Club Brussels Europe
📅 3 June 2026
🕓 16:00-18:00

Lineup

Lynn Boylan

Dr. Ritumbra Manuvie

Thomas Hughes

Good to know

Highlights

  • 2 hours
  • In person

Location

Press Club Brussels Europe

95 Rue Froissart

1040 Bruxelles

