EU Child Safety Law Lapses: Tech Giants Warn of "Irresponsible Failure"
A significant legal gap has emerged in the European Union following the expiration of a crucial "carve-out" law that allowed technology companies to scan for child sexual abuse material (CSAM) on their platforms. The law, a temporary derogation from the EU's ePrivacy Directive, expired on April 3, 2026, after the European Parliament declined to vote for its extension, citing privacy concerns.
In a rare joint statement, Google, Meta, Snap, and Microsoft condemned the lapse as an "irresponsible failure," warning that the inability to use automated detection tools will lead to a sharp rise in undetected grooming, sextortion, and the distribution of illegal content. Historical data supports these fears; a similar legislative gap in 2021 resulted in a 58% drop in abuse reports to the National Center for Missing and Exploited Children (NCMEC) over just 18 weeks.
The Privacy vs. Protection Debate
While privacy advocates argue that automated scanning (often referred to as "chat control") risks mass surveillance and compromises data security, child safety experts counter that these tools rely on "hashing" (digital fingerprinting) to match known illegal content, and on machine learning classifiers to flag new material, without storing private user data.
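The hash-matching approach can be illustrated with a minimal sketch. This is not any company's actual implementation: real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this example uses exact cryptographic hashes, and the `KNOWN_HASHES` set stands in for the large fingerprint databases supplied by clearinghouses like NCMEC. The point it demonstrates is the one experts make above: only a fingerprint is compared, and the user's content is never stored.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal files.
# (Real deployments match against millions of entries; this single
# entry is the SHA-256 of b"test", used purely for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint (here, SHA-256) of the file contents."""
    return hashlib.sha256(data).hexdigest()

def is_known_illegal(data: bytes) -> bool:
    """Compare the fingerprint against the database.

    Only the hash crosses this boundary; the content itself is
    neither retained nor transmitted.
    """
    return fingerprint(data) in KNOWN_HASHES

print(is_known_illegal(b"test"))   # True: fingerprint is in the database
print(is_known_illegal(b"other"))  # False: no match, nothing is stored
```

A practical consequence of this design is that exact hashing can only find *known* content; detecting new material or grooming behavior requires the machine learning classifiers mentioned above, which is where most of the privacy debate concentrates.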
What Happens Now?
Legal Uncertainty: Companies are caught in a regulatory limbo; they are now prohibited from proactive scanning but remain liable for removing illegal content under the Digital Services Act (DSA).
Voluntary Efforts: Despite the lapse, major tech firms have pledged to continue voluntary scanning to the extent the law still permits.
Ongoing Negotiations: The EU Parliament states that work on a permanent legal framework is ongoing, though no timeline for a resolution has been provided.
As perpetrators often operate across borders, experts warn that this legislative vacuum in Europe provides a "dark" space for offenders to target minors with reduced risk of detection.
#ChildSafety #OnlineSafety #EU #BigTech #DigitalRights