BRUSSELS (AP) — European Union lawmakers on Tuesday adopted a series of amendments to a draft law to fight online child pornography as they tried to find the right balance between protecting children and protecting privacy.
Under the draft position adopted by the Parliament's Committee on Civil Liberties, Justice and Home Affairs, internet providers will have to assess the risk of their services being used for online child sexual abuse and take measures to mitigate these threats.
But to “avoid generalized monitoring of the internet,” lawmakers proposed excluding end-to-end encrypted material from detection, while making sure time-limited detection orders approved by courts can be used to hunt down illegal material when mitigation actions are not sufficient.
They said they “want mitigation measures to be targeted, proportionate and effective, and providers should be able to decide which ones to use.”
Their position now needs to be endorsed by the whole Parliament before further negotiations involving EU member countries can take place.
Reports of online child sexual abuse in the 27-nation bloc have increased from 23,000 in 2010 to more than 1 million in 2020. A similar increase has been observed globally: reports of child abuse on the internet rose from 1 million to almost 22 million between 2014 and 2020, and more than 65 million images and videos of children being sexually abused were identified.
The European Commission proposed last year to force online platforms operating in the EU to detect, report and remove the material. Voluntary detection is currently the norm and the Commission believes that the system does not adequately protect children since many companies don’t do the identification work.
Digital rights groups had immediately warned that the Commission’s proposal appeared to call for widespread scanning of private communications and would discourage companies from providing end-to-end encryption services, which scramble messages so they’re unreadable by anyone else and are used by chat apps Signal and WhatsApp.
The Computer and Communications Industry Association, a big tech lobbying group, praised the committee’s proposed measures that “narrow scanning obligations, safeguard end-to-end encryption of communications and strengthen more targeted mitigation measures.”
“Indeed, the ‘cascade approach’ adopted by Parliament would first have online service providers assess risks and then take action to mitigate those,” the group said. “The tech industry commends this approach, just like the important clarification that detection orders will only be issued as a last-resort measure by a competent judicial authority, and have to be targeted and limited.”
The Parliament committee also wants pornography sites to implement appropriate age verification systems, mechanisms for flagging child sexual abuse material and human content moderation to process these reports.
“To stop minors being solicited online, MEPs propose that services targeting children should require by default user consent for unsolicited messages, have blocking and muting options, and boost parental controls,” the Parliament said in a statement.
To help providers better identify abuse, the Commission had proposed the creation of an EU Center on Child Sexual Abuse, similar to the National Center for Missing and Exploited Children, a U.S. nonprofit reference center that helps families and exploited victims.
Lawmakers approved the idea. The center would work with national authorities and Europol to implement the new rules and help providers detect abusive material online.
“The center would also support national authorities as they enforce the new child sexual abuse rulebook, conduct investigations and levy fines of up to 6% of worldwide turnover for non-compliance,” they said.