Google Unveils Policy Changes Aimed at Protecting Minors on Company’s Platforms

Google on Tuesday unveiled a handful of policy changes aimed at protecting people under 18 from abuse on the search giant’s platforms.

According to blogs posted Tuesday, the company will allow minors or their parents to request to have their pictures removed from Google’s Image Search feature, a notable change because Google has historically taken a hands-off approach when it comes to managing its search engine. Google also said it will block targeted advertising based on the age, gender or interests of people under 18.

YouTube, which is owned by Google, said it will change the default video upload settings for minors, automatically choosing the most private option available. The platform will also turn off autoplay by default for minors and turn on digital well-being tools, like alerts that remind people to take a break when they’ve been binging videos for a long time.

“We want to help younger users make informed decisions about their online footprint and digital privacy, including encouraging them to make an intentional choice if they’d like to make their content public,” James Beser, YouTube’s director of product management, wrote in a blog post.

Google’s changes in the coming months include:

  • A new policy allowing anyone under 18, or their parent or guardian, to request removal of their images from Google Image search results.
  • Blocking ad targeting based on the age, gender or interests of people under 18.
  • Removing the ability for users under 18 to turn on Location History (which is off by default).
  • SafeSearch, which filters out explicit results, will be turned on for existing users under 18 and will be the default setting for teens creating new accounts.
  • Apps will be required to disclose how they use data as part of a new safety section for Google Play, which will also highlight which apps follow Google’s family policies.

Meanwhile, YouTube changes include:

  • Removing “overly commercial” videos from YouTube Kids, which YouTube says could be content that focuses solely on product packaging or “directly encourages” kids to spend money.
  • Adjusting the default upload setting to the most private option for users between 13 and 17, with private uploads only being seen by the user and whoever they choose.
  • Turning on “take a break” and bedtime reminders by default for users 13-17, and turning autoplay off by default for the group (although they can turn it back on).
  • Adding an autoplay option for YouTube Kids, but turning it off as the default setting in the app.

In addition to the product and policy changes, Google said it is developing new informational resources to help young people and their parents better understand what data is collected, why, and how it is used.

The moves come after years of criticism from government officials, parents, and advocacy groups.
