On policy and product fronts, children’s privacy is taking the national stage. In understanding the market and legal forces at work, a striking vision for general consumer privacy comes into focus.
For all of the complex challenges of data privacy (see: the increasingly maze-like state of EU-US data transfers), some topics are much easier to wrap my brain around, like children’s privacy protections. In recent weeks, a bipartisan proposal to revamp children’s privacy protections has come to the US Senate, shortly after 44 attorneys general issued a letter urging Facebook to halt development of its new Instagram for Kids, citing, among other vital concerns, threats to children’s privacy.
As I previously wrote, in the absence of comprehensive US privacy regulation, large tech companies are setting their own privacy standards in large part to consolidate market power. A similar motive drives their development of children-specific products: to tap into a new, loyal user base. However, existing and upcoming legislation on children’s privacy poses a significant hurdle to tech companies, limiting their ability to set their own rules in this space. That dynamic suggests a striking vision of what a cohesive privacy framework could provide to Americans at large.
This spring, I wrote about how large tech corporations, Apple and Google in particular, have been the primary drivers of privacy changes in 2021, with their respective privacy updates giving them a competitive advantage. And discussions linking privacy and competition have only grown in prominence in the past couple of weeks. Not for lack of new material, either: Apple’s iOS 14.5 privacy update, overwhelmingly adopted since its late-April release, is giving Apple an edge in the advertising space; and Google’s latest updates to its analytics platform drive up the value of first-party data, a commodity in which Google has few rivals.
For decades, state and federal laws have encoded privacy protections specific to children. And, on a basic level, that makes sense: especially in online settings, children could inadvertently put themselves in danger without some guardrails. Children’s privacy and the related realm of student privacy are well-documented, expansive fields, and I home in on one law below. For more information on youth privacy, I suggest you check out resources from the Future of Privacy Forum and its Student Privacy Compass project.
One of the most prominent children’s privacy laws in the US is the Children’s Online Privacy Protection Act (COPPA), which took effect in 2000. COPPA’s provisions apply to websites as well as other digital services, like mobile apps. Among its requirements, COPPA demands that covered services collect, use, and disclose children’s personal information only with verifiable consent from a parent. The law also calls for direct and reasonable means of notifying parents about data practices, as well as data minimization: digital services should retain a child’s data only for as long as it is necessary.
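To make those obligations concrete, here is a minimal sketch of how a service might gate collection on verifiable parental consent and enforce a retention window. The data model, names, and 90-day limit are all hypothetical illustrations, not anything COPPA prescribes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention window: COPPA requires keeping a child's data
# only as long as necessary, so a service must choose and enforce a limit.
RETENTION_WINDOW = timedelta(days=90)

@dataclass
class ParentalConsent:
    child_id: str
    verified: bool  # obtained through a verifiable method (e.g., signed form)

@dataclass
class Record:
    child_id: str
    collected_at: datetime

def can_collect(age: int, consent: Optional[ParentalConsent]) -> bool:
    """Gate collection of personal information on verifiable parental consent."""
    if age >= 13:
        return True  # COPPA's consent requirement covers users under 13
    return consent is not None and consent.verified

def purge_expired(records: list[Record]) -> list[Record]:
    """Data minimization: keep only records inside the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION_WINDOW
    return [r for r in records if r.collected_at >= cutoff]
```

The point of the sketch is that both checks have to live in code paths the service actually runs, not just in a policy document.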
In aiming to align with children’s privacy requirements, some companies have created kids-specific offshoots of their products. For instance, Google has YouTube Kids, a platform that aims not only to offer kid-friendly content but also to maintain compliant data practices around paid advertising. It’s not just the interface and content that differ from YouTube proper. YouTube Kids has a distinct Privacy Notice, and it prohibits targeted advertising, a practice that’s plainly available on YouTube.
However, YouTube and its parent company Google drew a $170 million fine from the Federal Trade Commission in 2019 for violating consent requirements in collecting children’s data on YouTube proper. For reference, the largest GDPR fine issued thus far was just under $57 million. Nevertheless, YouTube Kids was the most-used video-streaming app in the US, in Europe, and globally in early 2020.
Facebook released Messenger Kids in apparent compliance with COPPA, with separate policy and product builds to protect children’s data. But the product had a design flaw, sketched below, that let children enter chats with users their parents had never approved, effectively strangers. As of 2021, the app has about 1.4 million users. Most recently, Facebook has announced Instagram for Kids, which would aim to implement the consent mechanisms necessary to comply with COPPA and other relevant laws. And the response to that announcement intersects with new children’s privacy legislation.
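To illustrate the kind of guarantee that broke, here is a minimal sketch, with hypothetical names and data structures, of the invariant a parent-approved contact list implies: every pair of participants in a chat must be mutually approved. Vetting only the person who starts a group is exactly the sort of shortcut that lets a stranger in.

```python
def chat_allowed(participants: set[str], approved: dict[str, set[str]]) -> bool:
    """Allow a chat only if every pair of participants is mutually approved;
    checking just the group's creator is not enough."""
    return all(
        participants - {child} <= approved.get(child, set())
        for child in participants
    )

# Bob's parent approved both Alice and Carol, but Alice and Carol
# were never approved for each other.
approved = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob"},
}
print(chat_allowed({"alice", "bob"}, approved))           # True
print(chat_allowed({"alice", "bob", "carol"}, approved))  # False: strangers meet
```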
Products specific to children have seen widespread adoption by their target users, but they don’t always hold up to regulatory or technical scrutiny. They are generally not monetizable in the way their default versions are, yet business interests in promoting them remain strong: children’s versions establish loyal user bases that carry over into the monetizable versions. This market driver is particularly salient given, for instance, projections that Instagram’s US user base will plateau in the next few years.
Unlike the case with general consumer privacy, a robust framework for children’s privacy appears to be keeping Big Tech from setting its own rules of play as it reaches for new, younger audiences.
Last month, Senators Markey and Cassidy introduced the Children and Teens’ Online Privacy Protection Act, which would strengthen COPPA and expand its scope. For instance, it would extend COPPA’s consent requirements to protect the data of children 13 to 15 years of age. It also calls for a ban on targeted advertising to children, and for an “Eraser Button”: a straightforward tool for parents and children to request data deletion.
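As a rough sketch of what an Eraser Button implies on the backend, assuming hypothetical store names and an already-verified requester, a single request has to fan out to every system holding the child’s data; erasing only the primary database would not satisfy the mandate.

```python
from dataclasses import dataclass, field

@dataclass
class DataStore:
    """Stand-in for one backend system holding personal data."""
    name: str
    records: dict[str, list] = field(default_factory=dict)

    def erase(self, user_id: str) -> int:
        """Remove all of a user's records in this store; return the count."""
        return len(self.records.pop(user_id, []))

def eraser_button(user_id: str, stores: list[DataStore]) -> dict[str, int]:
    """Fan one verified deletion request out across every store."""
    return {store.name: store.erase(user_id) for store in stores}

stores = [
    DataStore("profiles", {"kid123": ["profile"]}),
    DataStore("analytics", {"kid123": ["event", "event"]}),
]
print(eraser_button("kid123", stores))  # {'profiles': 1, 'analytics': 2}
```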
The bipartisan Senate legislation appeared just a day before 44 attorneys general called on Facebook to stop the development of Instagram for Kids. Independent of whether you think Instagram for Kids is a good idea, US lawmakers are moving with surprising cohesion in this area of privacy. It raises the question of what US privacy at large could look like if we had standards for general end-users as clear as those we have for children.
This is not a call to copy-and-paste COPPA into a general consumer privacy bill. Some COPPA provisions, like parental consent for data collection, make sense for children but not for adults. Rather, we should aspire to the same level of cohesion in general consumer privacy that we have in children’s privacy. In doing so, we could significantly boost Americans’ trust in trustworthy systems by establishing clear privacy standards that cut through the current confusion. We have a long way to go: lawmakers are split at the federal and state levels over numerous aspects of privacy legislation, from preemption of state laws to a private right of action. Of the more than twenty state-level privacy bills introduced in 2021, only one has passed. And federal lawmakers have yet to pass consumer privacy legislation.
There is a general consensus that children need stronger privacy protections than adults do, partly on the basis that they simply cannot gauge the risk and complexity of the digital systems underpinning their video or gaming apps. I agree with this stance, yet I doubt that adults, myself included, can always gauge the systems we use either. For instance, the majority of executives cannot explain how their own companies’ AI models work. It is not fair to leave end-users to their own devices in protecting their privacy; it remains for us to codify informed control over our data, meaning privacy controls that end-users actually understand. What is immediately clear, though, is that a privacy free-for-all, where any company can set its own privacy standards, is inappropriate for all ages.