Google is seeking to strengthen protections for minors online with new measures covering personal data and inappropriate content, according to the Internet giant, whose YouTube platform is frequently criticized over child protection.
Mindy Brooks, general manager of Google’s Children and Families division, said: “As young people spend more time online, more parents, teachers, privacy experts and policy officials are seeking to ensure safe surfing for children and teens. They are right.”
“We are in constant contact with them, so we can regularly adapt our products and the parental-control tools aimed at young audiences,” she said.
For example, videos posted on YouTube by teenagers aged 13 to 17 will automatically be set to a restricted visibility mode. Unless this setting is changed, only users chosen by the uploader will be able to view the video.
Minors, or their parents, will also be able to request that their photos be removed from Google's image search results.
The removal of problematic content in general, from false information to offensive images, at the request of authorities or individuals remains a contentious issue for the platform.

As for geographical data, Location History will be disabled for all users under 18 worldwide, with no option to turn it back on. To shield minors from “inappropriate” content, the company will switch on the “SafeSearch” filter on its search engine for all of them. Advertisers will no longer be allowed to target minors with ads based on their age, gender or interests.