British authorities will be given the power to order tech companies to redesign their platforms and impose fines if they fail to control child sexual abuse material under a new online safety law.
The rules will target end-to-end encrypted platforms, where messages can be seen only by the sender and recipient. The companies behind such platforms are under increasing political pressure to give governments and law enforcement access to content, including messages, photos and videos.
The Home Office announced on Wednesday a change to internet safety laws allowing the communications regulator Ofcom to fine tech companies £18m or 10 per cent of their annual turnover, whichever is greater, if they fail to meet yet-to-be-defined child protection standards.
Under the proposals, the regulator could be allowed to order tech companies to install yet-to-be-developed software on encrypted platforms, or to develop their own technologies to detect inappropriate material.
The move comes as tech companies seek to strike a balance between protecting the privacy of their users’ data and protecting vulnerable users, while responding to law enforcement agencies and lawmakers who cannot see content on encrypted platforms.
Apple has already tried to introduce scanning software to crack down on images of child sexual abuse, but was forced to row back after a fierce backlash from privacy activists last year.
Meanwhile, Meta, which owns Facebook, Instagram and WhatsApp, has pledged to introduce end-to-end encryption on Facebook Messenger, something the Home Office and charities have already lobbied against in the name of children’s safety.
In a public submission to the bill committee last month, the company said it had concerns about how Ofcom’s ability to require messages to be scanned for inappropriate material would work. “It is unclear how this would be possible in an encrypted messaging service and would have significant privacy, security and safety implications for users,” wrote Richard Earley, Meta UK’s public policy manager.
Under the law, Ofcom will decide whether platforms are doing enough to prevent, detect and remove explicit material and whether it is necessary and proportionate to ask platforms to change their products.
“Privacy and security are not mutually exclusive – we need both and we can have both and that is what this amendment does,” said Home Secretary Priti Patel.
The government has awarded five projects across the UK with more than £550,000 to develop technologies to stop the spread of child abuse material, which platforms could be instructed to use in their products in the future.
These include external software that can be integrated into existing encrypted platforms, as well as age-verification technology that could be used before users access encrypted services.
Figures released by children’s charity the NSPCC on Wednesday suggest that online child abuse crime in the UK has jumped by more than 80 per cent in four years, to an average of around 120 offences a week.
Meta-owned platforms were used in 38 per cent of cases where the means of communication was known, and Snapchat in 33 per cent.