PARENTS have accused Meta of placing kids at risk after it made a major change to WhatsApp's rules.

The social media giant lowered the age limit for the messaging app on Friday from 16 to 13 – with campaigners calling the move “tone deaf”.

The WhatsApp app icon as seen on the screen of a smartphone. Credit: Alamy

Campaigners said Meta was sacrificing children's safety for profits (stock image, posed by model). Credit: Getty

Criticism has come from across civil society, with MPs, academics and teachers saying it is a bad decision.

The campaign group Smartphone Free Childhood has signed up 60,000 parents who oppose the move.

Daisy Greenwell, the campaign’s co-founder, told The Telegraph the policy boosted Meta’s profits at the expense of children’s safety.

She said: "WhatsApp are putting shareholder profits first and children's safety second. Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike."


Vicky Ford, a Tory member of the Commons’ education committee, told The Times: “Social media can be very damaging for young people. WhatsApp, because it’s end-to-end encrypted, is potentially even more dangerous, as illegal content cannot be easily removed.

“So for Meta to unilaterally decide to reduce the age recommendation for WhatsApp, without listening to affected parents, seems to me to be highly irresponsible.”

Kaitlyn Regehr, a researcher at University College London, said: “Private, or closed, groups can enable more extreme material being shared, which in turn can have implications for young people’s offline behaviours.

“Young people increasingly exist within digital echo chambers, which can normalise harmful rhetoric.”


A Meta spokesperson said: “We give all users options to control who can add them to groups and the first time you receive a message from an unknown number we give you the option to block and report the account.”

The change is understood to be designed to bring the age requirement in line with that in other countries around the world.

Meta this week unveiled a range of new safety features designed to protect users from “sextortion” and intimate image abuse.

It will begin testing a filter, called Nudity Protection, in Direct Messages (DMs) on Instagram.

The feature will be turned on by default for those aged under 18 and will automatically blur images sent to users which are detected as containing nudity.

When receiving nude images, users will also see a message urging them not to feel pressure to respond, and an option to block the sender and report the chat.

The decision to make the change was announced in February.


Researchers said extreme material can be sent through the app. Credit: Getty

This post first appeared on Thesun.co.uk
