In a new legislative framework titled “Legislative Framework to Protect Children and Teens Online,” Google argues against proposed laws that would require online services to implement age verification. The company contends that these policies would force privacy trade-offs and could restrict access to crucial information.
Key Takeaway
Google opposes legally mandated age verification for minors, advocating instead for age-appropriate design principles that prioritize the best interests of children and teens. The company supports a ban on personalized advertising to minors and emphasizes the need to protect young users online.
Challenging Age Verification Policies
Google rejects the idea of requiring users to verify their age before accessing online platforms. Instead, the company asserts that online services should prioritize the best interests of children and teens by designing age-appropriate products and services, and it believes legislation should be grounded in age-appropriate design principles that balance safety and privacy concerns.
While Google supports “data-intrusive methods” such as verification with government-issued identification for high-risk services like alcohol, gambling, or pornography, it cautions against unnecessarily collecting sensitive personal information or blocking access to critical services.
The FTC, YouTube, and Personalized Advertising
Google’s framework comes four years after the company was fined $170 million by the Federal Trade Commission (FTC) for violating children’s privacy on YouTube. The FTC found that YouTube illegally collected personal information from children and used it for targeted advertising.
As part of the settlement, YouTube implemented a system for channel owners to label child-directed content so that targeted ads are not placed in those videos. Google’s framework now goes further, calling for legislation that bans personalized advertising to children and teenagers. Senator Ed Markey has recently reintroduced the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), which aims to prohibit targeted ads to minors.
Google argues that legislation should ban personalized advertising to anyone under 18, including personalization based on age, gender, or interests. The company emphasizes the importance of protecting children and teens from potentially exploitative practices online.
Addressing Concerns
Despite Google’s stated stance, a recent report by Adalytics alleges that YouTube continues to serve targeted ads to minors, prompting Senators Marsha Blackburn and Ed Markey to request an FTC investigation. Google criticized the report, describing it as “deeply flawed and uninformed.”