Google rolled out a new safety feature on Wednesday that lets people under 18 request that images of themselves be removed from search results.

The tech giant launched a help page for such requests that not only lets minors ask for the removal of information, but also prevents that information from appearing in Search results and on specific websites. Parents and guardians are also allowed to submit requests on a child's behalf.

However, there are circumstances in which a person's request may not be granted, with 'the exception of cases of compelling public interest or newsworthiness,' Google shared in a statement.

Users must also be under 18 for Google to approve the request, meaning adults cannot apply to have images removed that date from when they were teenagers.

Google first announced plans for the feature in August, adding it would be activated in the coming weeks.

It comes as major online platforms have long been under scrutiny from lawmakers and regulators over their sites’ impact on the safety, privacy and wellbeing of younger users.

Facebook and Instagram are currently under fire after a whistleblower shared how the platforms harm children.

Apple also announced in August that it would join the movement to protect minors by scanning users’ photos for child sexual abuse material.

However, the move was not welcomed with open arms – critics called it an invasion of privacy, and the backlash forced Apple to delay the launch.

Mindy Brooks, Google’s general manager for kids and families, wrote in an August blog post: ‘Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally.’

In order to request removal of an image, the help form requires users to provide the URLs of the specific content.

After submitting a request, users will receive an automated email confirmation, and Google will then review the removal request, 'gathering more information' where needed.

Google will then send a notification of the action taken.

‘If the request doesn’t meet the requirements for removal, we’ll also include a brief explanation. If your request is denied and later you have additional materials to support your case, you can re-submit your request,’ according to Google.

Google’s new feature also helps users report child sexual abuse imagery to the National Center for Missing and Exploited Children, or to an organization from a provided list based on the user’s location.

The company will also review removal requests in the tragic situation of a child who has died before reaching the age of 18.

This post first appeared on Dailymail.co.uk
