Campaigners are ‘astonished’ by a loophole in the government’s Online Safety Bill that could put pornography websites ‘outside its scope’.
The Bill, which was published as a draft on May 12, only applies to sites or services that allow ‘user interactivity’ – in other words, sites allowing interactions between users or allowing users to upload content, like Facebook.
Commercial pornography sites, such as Pornhub and YouPorn, could therefore ‘put themselves outside of the scope of the Bill’ by removing all user-generated content.
Porn sites have a combination of user-generated content (uploaded by people who visit the sites themselves) and content created by official partners.
The UK government published the draft Online Safety Bill on May 12. The legislation will help keep children safe online and combat racism and other abuse on platforms like Facebook
The government is set to introduce its Online Safety Bill later this year, which will enforce stricter regulation around protecting young people online and harsh punishments for platforms found to be failing to meet a duty of care.
The Bill will require social media and other platforms to remove and limit harmful content, with large fines for failing to protect users, enforced by the regulator Ofcom.
But the problem with the Bill is that it focuses on the issue of kids ‘stumbling’ across pornography on social media – not children who deliberately seek it out on dedicated porn sites.
‘I was astonished that by drawing the Bill so narrowly they actually excluded the world’s most prolific producers and distributors of pornography,’ said John Carr, secretary of the Children’s Charities’ Coalition on Internet Safety.
‘Pornhub, xHamster, all of the big commercial pornography sites – the largest single source of pornography in the world – are outside the scope of the Bill or could easily put themselves outside of the scope of the Bill,’ he said to the BBC.
‘They could do that and it would not affect their core business model in any way, shape or form.’
In his blog, Carr also disagrees with the government’s assertion that children intentionally seeking out pornography do so ‘predominantly through social media’ – something he calls ‘simply untrue’.
Pornhub has already demonstrated that it can rely on content that hasn’t been generated by its users.
In December, the site removed all previously uploaded content that was not created by official content partners or users signed up to its Model Program initiative (who earn ad revenue from their videos), following allegations it was ‘infested’ with videos of rape and underage sex.
‘The issue is that an adult website with no user generated content will not be affected by the Online Safety Bill unless it is amended to extend its scope,’ Iain Corby, director of the Age Verification Providers Association, told MailOnline.
The loophole affects the issue of age verification, which will be required under the Bill for all ‘user-to-user services’ (defined as those where users can share user-generated content) if it is 1) likely to be accessed by children and 2) likely to have adverse ‘physical or psychological impact[s] on a child of ordinary sensibilities’.
‘Pornhub currently offers user-to-user services, so is in-scope [of the Bill],’ Corby told MailOnline.
‘It is likely to be accessed by children – we know it is in fact accessed by children from plenty of research.
‘And it is hard to argue its content does not have significant adverse mental impact on children – particularly younger children or where the content is hardcore.
‘But if it drops its user-generated content, [stipulations one and two] don’t matter – it is out of scope.’
The idea of implementing age checks on pornography websites, and fining those sites that don’t comply, has existed for several years now.
Back in 2016, the UK government launched a public consultation over plans to implement age checks on pornography sites.
It was then included in the Digital Economy Act 2017 – but the provision was delayed and eventually abandoned in October 2019.
The government said at the time that age checks would be delivered through its ‘proposed online harms regulatory regime’ – in other words, the Online Safety Bill.
Companies have since invested in age-checking technology that would let them comply with the legal requirement.
For example, AgeID was set up by MindGeek (the Canadian firm that owns Pornhub, RedTube, and YouPorn, among other adult sites) to let adult users verify their age.
While age verification checks were absent from last week’s draft Online Safety Bill, experts think the requirement has simply been set aside for now.
The Support Network for Adult Professionals told the BBC in a statement: ‘We believe Brexit and Covid made the government put it to one side.
‘We think that some businesses are pleased it’s not in yet,’ the professional body added, likely because they get extra time to prepare.
One criticism of age-checking technology for porn concerns handing sensitive identification information – namely age or date of birth – to third parties.
‘Everyone realised right from the start – 2016 – that users were not going to want to share their name, let alone a copy of their passport or driving licence, with a porn site,’ said Corby.