Security researchers fear neuralMatch system could be misused to spy on citizens

Apple has unveiled plans to scan its iPhones in the US for images of child sexual abuse, drawing praise from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child abuse is confirmed, the user’s account will be disabled and the US National Center for Missing and Exploited Children will be notified.
