Meta’s WhatsApp messaging service, as well as the encrypted platform Signal, threatened to leave the UK over the proposals.

Ofcom’s proposed rules say that public platforms—those that aren’t encrypted—should use “hash matching” to identify CSAM. That technology, which is already used by Google and others, compares images to a preexisting database of illegal images using cryptographic hashes—essentially, unique digital fingerprints. Advocates of the technology, including child protection NGOs, argue that this preserves users’ privacy because it doesn’t involve actively looking at their images, only comparing hashes. Critics say that it’s not necessarily effective, as it’s relatively easy to deceive the system. “You only have to change one pixel and the hash changes completely,” Alan Woodward, professor of cybersecurity at Surrey University, told WIRED in September, before the act became law.
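The comparison itself is simple to picture. Below is a minimal Python sketch of the idea, assuming SHA-256 as the hash function and an invented `KNOWN_HASHES` set standing in for the hash databases maintained by child protection organizations; it also illustrates Woodward’s point that changing a single value produces an entirely different hash, which is why lightly altered copies can slip past exact matching.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# The value below is made up for illustration only.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image's exact byte content.

    SHA-256 is used purely as an illustration; real deployments use
    purpose-built matching systems rather than a bare cryptographic hash.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    """Check an upload against the database without inspecting the image itself."""
    return image_hash(image_bytes) in KNOWN_HASHES

# Woodward's point: altering a single pixel (here, a single byte) yields a
# completely different digest, so a trivially modified copy no longer matches.
original = bytes([0, 10, 20, 30, 40])
modified = bytes([1, 10, 20, 30, 40])   # one value changed
print(image_hash(original))
print(image_hash(modified))             # bears no resemblance to the first
```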

It is unlikely that the same technology could be used in private, end-to-end encrypted communications without undermining those protections.

In 2021, Apple said it was building a “privacy preserving” CSAM detection tool for iCloud, based on hash matching. In December last year, it abandoned the initiative, later saying that scanning users’ private iCloud data would create security risks and “inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

Andy Yen, founder and CEO of Proton, which offers secure email, browsing and other services, says that discussions about the use of hash matching are a positive step “compared to where the Online Safety [Act] started.”

“While we still need clarity on the exact requirements for where hash matching will be required, this is a victory for privacy,” Yen says. But, he adds, “hash matching is not the privacy-protecting silver bullet that some might claim it is and we are concerned about the potential impacts on file sharing and storage services…Hash matching would be a fudge that poses other risks.”

The hash-matching rule would apply only to public services, not private messengers, according to Whitehead. But “for those [encrypted] services, what we are saying is: ‘Your safety duties still apply,’” she says. These platforms will have to deploy or develop “accredited” technology to limit the spread of CSAM, and further consultations will take place next year.
