Company says it will ‘collect input and make improvements’ after backlash from privacy groups

Apple will delay its plans to scan user images for child sexual abuse material (CSAM) before they are uploaded to the cloud, the company says, after a backlash from privacy groups.

The company’s proposal, first revealed in August, involved a new technique it had developed called “perceptual hashing” to compare photos against known images of child abuse when users opted to upload them to the cloud. If the company detected enough matches, it would manually review the images before flagging the user account to law enforcement.
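Apple has not published its system (known as NeuralHash) in full, so the sketch below is only a rough, hypothetical Python illustration of the general idea described above: compare a perceptual hash of each upload against a database of known hashes, and escalate to human review only after a threshold number of matches. The names `KNOWN_HASHES`, `MATCH_THRESHOLD` and `MAX_HAMMING_DISTANCE` are invented for illustration, and the open-source ImageHash library stands in for Apple’s own hashing.

```python
# Illustrative sketch only: NOT Apple's NeuralHash, just the general shape of
# threshold-based perceptual-hash matching. All names and values are hypothetical.
from PIL import Image
import imagehash  # pip install ImageHash (common open-source perceptual hashing)

KNOWN_HASHES: set[str] = set()   # hypothetical database of hashes of known images
MATCH_THRESHOLD = 30             # hypothetical number of matches before manual review
MAX_HAMMING_DISTANCE = 5         # how close two hashes must be to count as a match


def is_match(image_path: str) -> bool:
    """Return True if the image's perceptual hash is near any known hash."""
    h = imagehash.phash(Image.open(image_path))
    return any(
        h - imagehash.hex_to_hash(known) <= MAX_HAMMING_DISTANCE
        for known in KNOWN_HASHES
    )


def should_escalate(image_paths: list[str]) -> bool:
    """Escalate to manual review only once enough uploads match known hashes."""
    matches = sum(is_match(p) for p in image_paths)
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is that a single near-match of a perceptual hash can be a false positive, so no action is taken until many independent matches accumulate on one account.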

