Expressing excitement about his soon-to-arrive “new material,” a man shared an in-utero picture of his unborn child with his online network of child sexual abusers. This is just one of far too many horror stories I have heard from investigators at the Department of Justice’s Child Exploitation and Obscenity Section.

Children as young as 8, 4, and 2, and, increasingly, pre-verbal infants are subjected to horrific, unspeakable, and gut-wrenching sexual abuse that is then broadcast to a global audience. Last year alone, the National Center for Missing and Exploited Children received nearly 17 million reports to its CyberTipline, containing more than 27 million images and 41 million videos. The average victim is 8 years old. Sadly, these reports capture only a fraction of the global child sexual abuse trade.

Hany Farid is a professor at UC Berkeley specializing in digital forensics and internet-scale content moderation.

For the past decade, a dynamic group of researchers, child-safety advocates, legislators, and technology-sector experts has been working ardently to develop and deploy technology to protect children online. Among our many efforts is the widely implemented and effective PhotoDNA program, launched in 2008 and used globally today to find and remove child sexual abuse material (CSAM). This program extracts a distinct signature from each uploaded image and compares it against the signatures of known harmful or illegal content. Flagged content can then be instantaneously removed and reported.
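To make that flag-and-compare workflow concrete: PhotoDNA itself is proprietary, so the Python sketch below is only an illustration of the general idea, using a simple “difference hash” as a stand-in for PhotoDNA’s far more robust signature. The known_signatures set, its placeholder value, and the match threshold are all hypothetical.

# A minimal sketch of signature-based image matching, the general idea behind
# PhotoDNA-style systems. The difference hash used here is far weaker than
# PhotoDNA's actual signature; it serves only to illustrate the workflow of
# computing a signature and comparing it against a database of known content.
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    """Shrink to grayscale, then compare adjacent pixels to form a 64-bit signature."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two signatures."""
    return bin(a ^ b).count("1")

# Hypothetical database of signatures of known harmful or illegal content.
known_signatures = {0x3CC8F0E0D0B07830}  # placeholder value, not a real signature

def is_flagged(path: str, threshold: int = 4) -> bool:
    """Flag an upload whose signature is within `threshold` bits of a known one."""
    signature = dhash(path)
    return any(hamming(signature, known) <= threshold for known in known_signatures)

In a deployed system the database holds millions of vetted signatures and the matching is tuned so that false positives are vanishingly rare; the sketch above only shows the comparison step described in the preceding paragraph.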

Frustratingly, for the past decade the technology sector has been largely obstructionist, full of naysayers when it comes to deploying new technologies to protect us. As a result of this deliberate neglect, the internet is overrun with child sexual abuse material, the illegal sex trade, nonconsensual pornography, hate and terrorism, illegal drugs, illegal weapons, and rampant misinformation designed to sow civil unrest and interfere with democratic elections. This is the landscape facing us as we consider the Senate Judiciary Committee’s Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act.

Section 230 of the 1996 Communications Decency Act established that, with only a few exceptions, interactive computer services (e.g., Facebook, Twitter, YouTube) are not liable for user-generated content. This act has given Silicon Valley an unprecedented gift in the form of a broad shield from accountability. Rather than acting as the responsible “Good Samaritans” Section 230’s drafters intended, technology companies have allowed their services to be weaponized against children, civil society, and democracy, all while profiting to the tune of billions of dollars a year.

The EARN IT Act is the culmination of years of cajoling, pleading with, and threatening the technology sector to get its house in order. It has not, and so now is the time for legislation to rein it in. In its original form, the act would have established a commission tasked with outlining best practices for responding to the global pandemic of online child sexual exploitation. Platforms that failed to implement these practices would lose some of their Section 230 liability protection. An amendment earlier this week, however, made implementation of the commission’s recommendations voluntary, undercutting Silicon Valley’s talking points about encryption, threats to the Fourth Amendment, and overreach by the Attorney General. In its amended form, the act leans into skepticism of Section 230 and takes the needed step of fully removing blanket immunity from federal civil, state criminal, and state civil CSAM laws. As a result, technology platforms will be treated like other entities when it comes to combating child sexual exploitation.

The EARN IT Act was unanimously approved on Thursday by the Senate Judiciary Committee and will now be taken up by the full Senate. Despite overblown claims to the contrary, the act does not dismantle Section 230’s legal shield. The act is narrow but important: it will begin to realign the technology sector with other industries, which are routinely subject to regulatory oversight and are held liable when their products or services enable or cause harm.
