Security researchers fear neuralMatch system could be misused to spy on citizens

Apple has unveiled plans to scan its iPhones in the US for images of child sexual abuse, drawing praise from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child abuse is confirmed, the user’s account will be disabled and the US National Center for Missing and Exploited Children notified.
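In outline, the workflow described above amounts to checking each image against a database of known hashes before upload and routing any match to a human reviewer. The sketch below illustrates that general idea only; the hash function, database contents and helper names are assumptions chosen for illustration, not Apple's actual neuralMatch implementation, and a production system would use perceptual hashes that tolerate resizing and re-encoding rather than the exact byte hash used here to keep the example self-contained.

    # Illustrative sketch of a pre-upload hash-matching check (not Apple's code).
    import hashlib
    from pathlib import Path

    # Hypothetical database of hashes of known abuse imagery, assumed to be
    # supplied by a child-protection organisation as hex digests.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_hash(path: Path) -> str:
        """Hash the raw image bytes. A real system would use a perceptual
        hash of the image content; SHA-256 is used here only so the sketch
        runs with the standard library alone."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flag_for_human_review(path: Path) -> None:
        # In the system described in the article, a confirmed match leads to
        # the account being disabled and the US National Center for Missing
        # and Exploited Children being notified. Here we only record the flag.
        print(f"flagged for review: {path}")

    def scan_before_upload(path: Path) -> bool:
        """Return True if the image may be uploaded, or False if it matched
        a known hash and has been flagged for human review."""
        if image_hash(path) in KNOWN_HASHES:
            flag_for_human_review(path)
            return False
        return True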
