Experts say case highlights well-known dangers of automated detection of child sexual abuse images

Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it’s an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned about the limitations of automated systems for detecting child sexual abuse imagery, particularly as companies face regulatory and public pressure to help address the spread of such material.
