Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998 to address the rapid growth of online marketing aimed at children. Banner and targeted advertising had just been introduced, and programmatic advertising was only getting started.
Websites were collecting children’s personal data without their parents’ knowledge or consent, and studies had shown that children do not understand the risks of disclosing personal information online.
Congress directed the Federal Trade Commission (FTC) to implement COPPA and to issue implementing regulations. The FTC’s original COPPA Rule took effect in 2000, and the agency began enforcing it against website and video game operators.
Fast forward to the 2010s
As technology developed, the FTC brought enforcement actions against makers of mobile apps, connected toys, and Internet of Things devices. But it was not until the COPPA Rule was revised in 2013 that ad tech companies began paying attention. The revised Rule expanded the definition of personal information to include “[a] persistent identifier that can be used to recognize a user over time and across different websites or online services.”
These identifiers (including cookies, IP addresses, processor or device serial numbers, and unique device identifiers) are precisely what online advertisers use to deliver relevant advertisements to consumers. The expansion of the COPPA Rule greatly increased the potential liability of ad tech companies, which need to be aware of the risks inherent in collecting children’s information through other websites and applications.
The COPPA Rule applies to operators of commercial websites and online services directed to children under 13 that collect, use, or disclose children’s personal information. It also applies to operators of general-audience websites or online services that have actual knowledge that they are collecting, using, or disclosing personal information from children under 13.
Here is the key part for advertising networks, platforms, and other third parties: the Rule also applies to operators of websites or online services that have actual knowledge that they are collecting personal information directly from users of other child-directed websites or online services.
The InMobi and Google FTC settlements
Since the Rule took effect, the FTC has filed more than 30 enforcement actions alleging COPPA violations. Two of those actions involved advertising networks. The first, in 2016, was an enforcement action against InMobi, an advertising platform for app developers and advertisers.
The second (and arguably more compelling) COPPA case the FTC brought against an advertising network/platform was its 2019 enforcement action against Google and YouTube. There, the FTC alleged that YouTube violated the Rule by collecting persistent identifiers from viewers of child-directed channels to serve targeted advertisements, without first notifying parents and obtaining their consent.
During the investigation, the FTC found evidence that YouTube knew children under 13 were watching certain channels on its platform. For example, YouTube told toy manufacturers that it was the leader among today’s top TV channels in reaching children ages 6-11 and “the number one website regularly visited by kids.” Even while making these statements, YouTube held itself out as a general-audience platform with no content directed at children under 13.
By default, YouTube enabled targeted advertising on monetized channels. This meant YouTube was collecting cookies from viewers of child-directed channels without notifying parents or obtaining their consent to collect that data. The settlement required YouTube to pay a $170 million penalty. YouTube was also required to implement a system that lets channel owners designate their content as child-directed, so that YouTube can comply with the Rule going forward.
Five things advertising networks should do today
With that background, here are five important takeaways from the InMobi and Google/YouTube settlements:
- Improve transparency. Advertising networks and platforms should consider implementing a system that allows online services (websites, apps, or channels) to identify their content to the network/platform as child-directed.
- Stop collecting data from children. Once an advertising network or platform has a system through which developers can flag their apps as child-directed, the network must take steps not to collect personal information through those websites, apps, or channels.
- Involve parents when required. Even if an advertising network does not collect precise geolocation data from children directly, it must still provide notice and obtain parental consent if it collects wireless network identifiers to infer precise location.
- Protect sensitive data. If an advertising network does collect children’s data, it must maintain the confidentiality, security, and integrity of that information, retain it only as long as necessary to fulfill the purpose for which it was collected, and delete it in a way that prevents unauthorized access or use.
- Keep children’s protections strict. If a platform or advertising network knows that a channel or app is directed to children, it cannot collect personal information such as persistent identifiers to serve targeted advertisements without prior notice and parental consent.
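To make the first two points concrete, here is a minimal sketch of how an ad platform might gate data collection on a child-directed flag. Everything here is hypothetical (the names `AdRequest`, `CHILD_DIRECTED_APPS`, and `sanitize` are invented for illustration and do not reflect any real ad platform’s API): developers flag their services as child-directed, and the platform strips persistent identifiers and targeting data from requests originating there.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AdRequest:
    """Hypothetical ad request arriving from a publisher's app or channel."""
    app_id: str
    cookie_id: Optional[str] = None   # persistent identifier
    device_id: Optional[str] = None   # persistent identifier
    interests: list = field(default_factory=list)  # behavioral targeting data

# Hypothetical registry populated when developers flag their apps or
# channels as child-directed (takeaway 1: improve transparency).
CHILD_DIRECTED_APPS = {"kids-cartoons-app", "toy-review-channel"}

def sanitize(request: AdRequest) -> AdRequest:
    """Drop persistent identifiers and targeting data from requests that
    come from child-directed services (takeaway 2: stop collecting),
    leaving only what is needed to serve contextual ads."""
    if request.app_id in CHILD_DIRECTED_APPS:
        return AdRequest(app_id=request.app_id)
    return request

req = sanitize(AdRequest(app_id="kids-cartoons-app",
                         cookie_id="abc123",
                         interests=["toys"]))
print(req.cookie_id, req.interests)  # None []
```

A real implementation would also have to handle the "actual knowledge" cases the Rule covers, where a service is child-directed even though the developer never flagged it; a static allowlist like this is only the starting point.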
Bottom line: advertising networks should avoid collecting information through apps they know are aimed at children. The safest course for an advertising network or platform is not to serve targeted ads on child-directed websites, apps, or channels in the first place.