WHATSAPP’S chief has blasted Apple over its plans to check iPhone users’ photos for child sex abuse imagery.

In a series of tweets, Will Cathcart said that his messaging app would not be adopting the safety measures, calling Apple’s approach “very concerning”.


Apple last week unveiled plans to scan U.S. iPhones for images of child sexual abuse.

The move has drawn applause from child protection groups but raised concerns among security researchers and tech experts.

Critics claim the system could be misused, particularly by governments looking to spy on their citizens.

Following the unveiling of the plans on August 6, Cathcart tweeted: “This is the wrong approach and a setback for people’s privacy all over the world.

“People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

NEURALMATCH

The tool, called neuralMatch, is designed to detect known images of child sexual abuse by scanning users’ photos before they are uploaded to iCloud.

If the system finds a match, the image will be reviewed by a human.

Once child sex abuse content has been confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will, however, only flag images that are already in the center’s database of known child sex abuse images.
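As a rough illustration of how this kind of matching works, here is a minimal Python sketch. It is not Apple’s code: it uses an ordinary SHA-256 file hash as a stand-in for the perceptual “NeuralHash” fingerprints the real system computes on the device, and the database, exact-match check and review step are all hypothetical simplifications.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse imagery. In the
# system Apple describes, these come from the National Center for Missing
# and Exploited Children and are perceptual hashes, not plain file hashes.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(photo: Path) -> str:
    """Return a fingerprint for a photo -- here simply SHA-256 of its bytes."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()


def flag_for_human_review(photo: Path) -> None:
    """Placeholder for the review step Apple describes: a person checks the
    match before the account is disabled and authorities are notified."""
    print(f"Match found - holding {photo.name} for review")


def check_before_upload(photo: Path) -> bool:
    """Return True if the photo can be uploaded to the cloud, False if it
    matches a known fingerprint and is held back for review."""
    if fingerprint(photo) in KNOWN_FINGERPRINTS:
        flag_for_human_review(photo)
        return False
    return True
```

The key point the sketch captures is that only photos matching an entry already in the database are flagged; the code never tries to judge what a new photo depicts.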

‘VERY CONCERNING’

Cathcart continued: “Child sexual abuse material [CSAM] and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.

“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.

“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.

“We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.”

GOVERNMENT PRESSURE

Tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images.

Apple has used those fingerprints to scan user files stored in its iCloud service – which is not as securely encrypted as its on-device data – for child sex abuse imagery.

The company has been under government pressure for years to allow for increased surveillance of encrypted data.

Coming up with the new security measures required Apple to strike a delicate balance between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement.

“With so many people using Apple products, these new safety measures have lifesaving potential for children.”

Meanwhile, the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections a shocking about-face for users who have relied on the company’s leadership in privacy and security.

More than 6,000 people have signed an online petition to stop the plans, including security and privacy experts, researchers, legal experts and more.

SPYING CONCERNS

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography.

That could fool Apple’s algorithm and alert law enforcement, Green said.

He added that researchers have been able to trick such systems pretty easily.
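Green’s point is easier to see with a toy example. Perceptual matching systems compare compact, lossy hashes rather than the images themselves, so two images that look nothing alike can still produce hashes close enough to count as a match. The sketch below, assuming a simplified 8x8 “average hash” and an arbitrary Hamming-distance threshold (neither is Apple’s actual scheme), shows how that kind of near-match test works.

```python
# Toy "average hash": each cell of an 8x8 grayscale grid becomes one bit,
# set when the cell is brighter than the grid's average. Illustrative only -
# Apple's NeuralHash is a far more sophisticated perceptual hash.
def average_hash(grid: list[list[int]]) -> int:
    pixels = [p for row in grid for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")


def is_match(a: int, b: int, threshold: int = 5) -> bool:
    # A near-match within the threshold is treated as the same image - the
    # lossiness that makes this robust to resizing and recompression is also
    # what leaves room for deliberately engineered collisions.
    return hamming_distance(a, b) <= threshold


if __name__ == "__main__":
    # Two grids with different pixel values that nevertheless hash identically.
    img_a = [[10 * (r + c) for c in range(8)] for r in range(8)]
    img_b = [[10 * (r + c) + 3 for c in range(8)] for r in range(8)]
    print(is_match(average_hash(img_a), average_hash(img_b)))  # True
```

Because the hash throws away most of an image’s detail, an attacker can deliberately craft an innocuous-looking picture whose hash lands within the match threshold, which is the scenario Green warns about.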

Other abuses could include government surveillance of dissidents or protesters.

“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”



In other news, a Google Maps fan has spotted a “secret” military base tucked away in the middle of the Sahara desert.

Samsung has teased a glimpse of the design for its highly anticipated Galaxy Z Fold 3 smartphone.

And, the next iPhone will come in a new pink colour and start at just under £800, according to recent rumours.

