
  • Google is allowing anyone under the age of 18 to remove their images from search results.
  • Google originally announced increased protections for children and teens in August, promoting internet safety.
  • Google hopes this change will give younger people more control over their digital footprint.

Google is allowing anyone under the age of 18, or their parent or guardian, to remove their images from search results.

This policy change took effect on Wednesday and is part of what the company says is a larger shift toward protecting younger users on its platforms. Google originally announced increased protections for children and teens in August, giving younger internet users safer ways to use its services. For example, YouTube, which is owned by Google, is making its "take a break" and bedtime reminders the default for all users ages 13 to 17 and limiting the visibility of videos posted by its younger users.

All requests will be reviewed by Google, and its team may reach out for additional verification information if needed. Once a removal request is approved, the image will no longer appear in the Images tab or as a thumbnail in any Google Search feature, and submitters will receive a notification, according to a Google blog post explaining the feature.

Google did not respond to Insider's request to comment.

"We believe this change will help give young people more control over their digital footprint and where their images can be found on Search," the company's post said.

However, images removed from Google search results are not fully removed from the internet, Google warns. To have an image removed online completely, Google recommends that users contact the webmaster of the site where the image is hosted.

Google offers other features to protect its younger users "from shocking or harmful content." These include SafeSearch, which filters explicit and inappropriate results, as well as content filters and educational resources.

On Tuesday, lawmakers met with representatives from Snapchat, TikTok, and YouTube to discuss child safety online. The Senate Commerce subcommittee on consumer protection, product safety, and data security asked how the platforms have been misused by teenagers to promote dangerous and reckless behavior. None of the tech companies committed to any legislative proposals.

Read the original article on Business Insider