Serge Egelman

While the US currently has no comprehensive privacy law, the Children’s Online Privacy Protection Act (COPPA) has been in effect for over twenty years. As a result, the study of compliance issues among child-directed online services yields important lessons for future enforcement efforts. From studying the privacy behaviors of online services in this ecosystem and the software developers who build them, it is clear that many privacy issues are unintentional and caused by the uninformed use of third-party software components (“software development kits” or “SDKs”). Thus, better guidance to developers from platforms and third-party data recipients would likely prevent many privacy issues. Unfortunately, current policies disincentivize these entities—the ones best positioned to have impact—from taking action. In this article, I describe the research my laboratory has conducted to understand privacy compliance issues and how it has led to my recommendations for improving privacy enforcement.
 
We built tools that allow us to monitor the execution and network transmissions of mobile apps. We used these tools to audit the COPPA compliance of nearly 6,000 child-directed apps. We found evidence that a majority of the apps were violating COPPA in various ways. We also found that the “safe harbor” self-regulation program had no observable effect on apps’ rates of compliance. Notably, the various potential violations that we observed were all due to the use of third-party components. In subsequent research, we observed that app developers are often legitimately unaware that these issues exist in their apps, or that the issues stem from unexpected and/or undocumented behaviors of the SDKs they use. Thus, simply informing developers of these issues (and of the legal jeopardy they carry) could be a powerful first step toward remediation. Many app developers already look to platforms for this guidance, and as a result, platforms are best positioned to provide it.
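The core of such an audit can be sketched as a check over captured network transmissions: flag any request that sends a persistent identifier to a known third-party data recipient. This is a minimal illustration, not our actual tooling; the domain list, the use of an advertising-ID-style UUID as the identifier, and the function name are all assumptions for the example.

```python
import re
from urllib.parse import urlparse

# Illustrative (hypothetical) list of third-party data-recipient domains.
TRACKER_DOMAINS = {"ads.example-sdk.com", "analytics.example.net"}

# Android Advertising IDs are UUID-formatted; a UUID sent to a tracker
# from a child-directed app is a red flag worth manual review.
UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
    re.IGNORECASE,
)

def flag_transmissions(requests):
    """Given (url, body) pairs captured from an app, return the
    (url, identifier) pairs that send a persistent identifier to a
    known tracker domain."""
    flagged = []
    for url, body in requests:
        host = urlparse(url).hostname or ""
        match = UUID_RE.search(body)
        if host in TRACKER_DOMAINS and match:
            flagged.append((url, match.group(0)))
    return flagged
```

For example, a request to `ads.example-sdk.com` whose body contains `aaid=38400000-8cf0-11bd-b23e-10b96e40000d` would be flagged, while a request to the app's own backend carrying only game state would not.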
 
Despite evidence of literally thousands of COPPA violations, the FTC and state attorneys general, the only entities with enforcement authority under the law, pursue few enforcement actions each year. Although staffed with competent personnel, these organizations are heavily constrained and under-resourced; as a result, app developers simply do not view regulatory enforcement as a credible threat. Our research found that developers are much more concerned with getting kicked out of app stores. Thus, the platforms could audit submitted apps to determine whether they correctly use SDKs’ relevant privacy configuration directives, rejecting apps (with detailed remediation instructions) that do not. The third-party services that receive data from apps and websites could also take a greater role in ensuring that the data sent to them complies with relevant policies and regulations, cutting off service to those in violation (again, with detailed remediation instructions).
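At its simplest, the platform-side audit proposed here amounts to checking each bundled SDK's privacy configuration against a table of required child-directed directives and emitting remediation instructions for any that are missing. The SDK names and directive names below are hypothetical, and real audits would inspect compiled apps rather than a tidy configuration dictionary; this sketch only shows the shape of the check.

```python
# Hypothetical table mapping each SDK to the configuration directive it
# requires before it may be used in a child-directed app.
REQUIRED_DIRECTIVES = {
    "AdNetworkSDK": "tag_for_child_directed_treatment",
    "AnalyticsSDK": "coppa_mode",
}

def audit_app(bundled_sdks, configured_directives):
    """Return remediation instructions for SDKs whose child-directed
    directive is missing or disabled; an empty list means the app
    passes this check."""
    problems = []
    for sdk in bundled_sdks:
        directive = REQUIRED_DIRECTIVES.get(sdk)
        if directive and not configured_directives.get(directive, False):
            problems.append(
                f"{sdk}: set '{directive}' to True before initializing the SDK"
            )
    return problems
```

A submission bundling `AdNetworkSDK` without setting `tag_for_child_directed_treatment` would be rejected with the corresponding instruction; once the directive is enabled, the audit returns no problems.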
 
Based on these observations, I propose a new privacy enforcement framework. In this framework, compliance burdens are shifted away from the numerous individual online services and onto the smaller number of large players who are best positioned to actually comply: platforms and data recipients. The FTC’s limited resources can then be focused on the few entities at the top of the data food chain (i.e., the platforms and data recipients), while enforcement against the many individual online services is left to a novel mechanism that uses a private right of action to foster much more robust industry self-regulation.