Company says it will ‘collect input and make improvements’ after backlash from privacy groups

Apple will delay its plan to scan user images for child sexual abuse material (CSAM) before they are uploaded to the cloud, the company says, after a backlash from privacy groups.

The company’s proposal, first revealed in August, involved a technique it had developed called “perceptual hashing” to compare photos against known images of child abuse when users opted to upload them to the cloud. If the company detected enough matches, it would manually review the images before flagging the user account to law enforcement.
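Apple's actual system, NeuralHash, is a proprietary neural-network-based hash, but the general idea of perceptual hashing with threshold matching can be sketched with a much simpler "difference hash". The sketch below is purely illustrative and is not Apple's algorithm; the functions, pixel data, and the `threshold` value are invented for the example.

```python
# Illustrative sketch of perceptual hashing (a simple "difference hash").
# This is NOT Apple's NeuralHash; it only shows the general principle that
# visually similar images yield similar hashes, unlike cryptographic hashes.

def dhash(pixels):
    """Hash a grayscale image (a list of rows) by comparing adjacent pixels.

    Each bit records whether a pixel is brighter than its right neighbour,
    so the hash survives small uniform brightness or compression changes.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count the bits on which two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches(hash_a, hash_b, threshold=2):
    """Treat two images as a match if their hashes differ in few bits."""
    return hamming(hash_a, hash_b) <= threshold

# A tiny 4x4 "image", a uniformly brightened copy, and an unrelated image.
original  = [[10, 20, 30, 40], [40, 30, 20, 10], [5, 50, 5, 50], [50, 5, 50, 5]]
brighter  = [[p + 3 for p in row] for row in original]
different = [[40, 30, 20, 10], [10, 20, 30, 40], [50, 5, 50, 5], [5, 50, 5, 50]]

assert matches(dhash(original), dhash(brighter))       # near-duplicate matches
assert not matches(dhash(original), dhash(different))  # distinct image does not
```

In the proposed system, the hashes being compared on-device would come from a database of known CSAM images, and only accounts exceeding a match threshold would reach human review.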
