KONSTANTINOS KOMAITIS

Children, Rights, and Democratic Resilience in Europe’s Platform Debate

2/9/2026

 
Across Europe, concern about the online environment is intensifying. Digital platforms are increasingly associated with the spread of disinformation, political polarisation, and social harms—particularly for children. In a context marked by geopolitical uncertainty, eroding trust in institutions, and fragmented public debate, it is understandable that governments feel compelled to respond. The issue is no longer whether some form of regulation is needed, but how such measures are conceived, justified, and put into practice.

In recent years, attention has often centred on the power of large technology companies, whose scale and influence over public discourse are unprecedented. While these concerns are well founded, there is a parallel risk that regulation becomes a vehicle for broader state intervention in the digital public sphere. Under the language of democracy, security, or national autonomy, efforts to address genuine problems online may slide into attempts to exert greater control over speech and information flows. The challenge, therefore, lies in developing regulatory frameworks that meaningfully address online harms without undermining the open and pluralistic character of democratic debate.

Spain’s Prime Minister Pedro Sánchez has become one of the most prominent advocates of a hard-line approach. His proposals—including ending anonymity on social media, holding platform executives criminally liable for illegal or “hateful” content, criminalising certain forms of algorithmic amplification, and adopting a “zero-tolerance” enforcement posture—are framed as necessary responses to a digital environment he has described as a “failed state.” The stated objective is to protect democracy and society, particularly the most vulnerable.

These concerns should not be dismissed. Children do face real harms online: exposure to abuse, predatory behaviour, harassment, self-harm content, and manipulative design practices that exploit their attention. Marginalised communities are disproportionately targeted by online hate and coordinated harassment. Platforms have often been slow, inconsistent, or opaque in responding. Governments are right to demand higher standards of care, transparency, and responsibility.

At the same time, regulation that focuses primarily on punitive control risks overshooting its target. When policies emphasise criminal liability, prosecutorial investigations, and broad content restrictions, their effects rarely stop with powerful tech executives. They cascade down to millions of ordinary users—people discussing immigration, foreign policy, public health, religion, or identity. These are precisely the topics that become most sensitive during periods of political uncertainty and social change. In such contexts, the line between combating harm and constraining legitimate democratic disagreement can become dangerously thin.

Ending anonymity, for example, may reduce some forms of abuse, but it also removes a vital layer of protection for whistleblowers, political dissidents, journalists, survivors of violence, LGBTQ+ youth, and members of ethnic or religious minorities. For many, anonymity is not about evading responsibility; it is about participating at all. Any policy that treats anonymity primarily as a problem risks silencing voices that democracy most needs to hear.

Similarly, holding executives personally criminally liable for content decisions may create powerful incentives—but not necessarily the right ones. Faced with the risk of prosecution, platforms are likely to default to over-removal, automated filtering, and risk-averse moderation. 

This risk is not merely theoretical; Europe has seen it before. That is not an argument against regulation, but a reminder that it must be designed carefully. When Germany introduced the Network Enforcement Act (NetzDG), which imposed significant fines for failing to remove illegal content within short timeframes, platforms responded by erring on the side of caution. Numerous lawful posts—including satire, political commentary, and journalistic content—were removed or blocked because platforms prioritised legal risk reduction over contextual judgment. Similar dynamics emerged following the introduction of Article 17 of the EU Copyright Directive, where automated upload filters led to the removal or blocking of lawful material such as memes, parodies, and educational content. These risks were significant enough that the Court of Justice of the European Union intervened to clarify and limit the scope of such filtering obligations, emphasising that any implementation must respect fundamental rights, including freedom of expression and information.

These examples illustrate how heightened liability and unclear standards can incentivise over-removal, automated filtering, and risk-averse moderation. While such approaches may reduce visible controversy, they also suppress lawful speech and disproportionately affect minority voices and political dissent. The resulting chilling effect is difficult to quantify, but its impact on democratic participation and public debate is real and enduring.

Europe’s historical experience makes this tension particularly salient. Countries like Spain know intimately what it means to live under systems where speech is tightly controlled in the name of order, unity, or national interest. That legacy has shaped Europe’s strong commitment to fundamental rights, proportionality, and the understanding that democracy depends not only on security, but on pluralism and open debate.

Yet this concern is not only historical. Over the coming years, several European countries—including Hungary, France, Germany, Italy, Spain, Poland, and Greece—will hold elections that may significantly reshape their political landscapes, as voters weigh competing visions of governance, identity, and democratic norms. Regulatory powers designed today under centrist or liberal administrations may look very different if exercised by future governments with a more exclusionary or authoritarian approach to dissent. Measures introduced in the name of protecting democracy can, under changed political circumstances, become tools for narrowing it. This reality underscores the need to design digital regulation with long-term resilience in mind—grounded in rights, safeguards, and institutional restraint, rather than trust in the intentions of any one government.

This is why the current framing of “digital sovereignty” deserves careful scrutiny. Once understood primarily as a strategy for technological resilience and strategic autonomy, it is increasingly politicised as a justification for assertive state intervention in online discourse. At times, digital sovereignty is presented as inherently at odds with open, transnational communication—despite the fact that democracy itself has always relied on cross-border flows of ideas, information, and innovation.

This tension between state control and open communication is most visible in the debate over child safety, where “digital sovereignty” is frequently invoked as a shield for restrictive policies. But children deserve more than symbolic protection. They need safer digital environments, but they also need to be empowered—to learn, explore, create, and participate. This requires age-appropriate design, meaningful transparency, digital literacy, robust reporting mechanisms, and enforceable duties of care. It does not require turning the internet into a heavily policed space where speech is filtered primarily through fear of punishment.

Crucially, children also grow into citizens. Protecting them should not come at the cost of hollowing out the democratic culture they will inherit. An online environment stripped of contestation, anonymity, and diversity of expression may be calmer—but it will also be poorer, less resilient, and less capable of absorbing social conflict without repression.

The same applies to marginalised communities. Regulation that prioritises order over rights often ends up reinforcing existing power imbalances. Groups that already face discrimination offline are frequently the first to feel the effects of broad speech controls online. A rights-based approach must therefore be central, not incidental, to digital governance.

None of this implies that Europe should be passive or naïve. Platforms must be held to account—but accountability should be precise, transparent, and proportionate. It should focus on systems and incentives rather than individual speech acts; on due process rather than zero-tolerance rhetoric; on empowerment rather than control.

This is not the moment for Europe to “show its teeth” by asserting authority over digital discourse in ways that blur the line between regulation and repression. It is the moment to show confidence: confidence that democratic societies can address harm without sacrificing fundamental freedoms, that children can be protected without being over-shielded, and that innovation and rights can coexist.

The challenge of the digital age is not simply taming Big Tech. It is learning how to govern a pluralistic, networked public sphere without turning fear into a substitute for judgment. Europe’s strength has always been its commitment to balance. That commitment is needed now more than ever.


