Woodrow Hartzog, Scott Jordan, David Choffnes, Athina Markopoulou, Zubair Shafiq

Regulatory agencies like the FTC have limited resources to hold tech companies accountable for privacy violations. There are simply too many companies, making too many promises and collecting too much data in ways that are not transparent. But help is available. Over the past decade, privacy researchers have made personal data flows more transparent and, in the process, held tech companies accountable for actions that threaten our privacy.
 
In this essay, we explore how researchers can support the enforcement of privacy laws by some combination of (i) surfacing a company's privacy representations and statements; (ii) measuring the actual behavior of a company's systems, including their algorithms, user interfaces, and processing of data; and (iii) going public with their results, working closely with enforcement agencies, and helping file class action suits.
 
Our essay proceeds in three parts. First, we describe how researchers can help surface a company's privacy representations, whether made in public, directly to data subjects, or in privacy policies. Regulators require companies to be honest, so more statements on the record mean more opportunities for accountability. Researchers have automated much of the collection and analysis of those statements using natural language processing (NLP). They have also leveraged data subject access rights (DSARs) to make organizational practices more transparent. Work that was traditionally done by privacy experts can now be partially automated and performed at scale, which makes this approach powerful.
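To give a flavor of what this automation looks like, the sketch below trains a toy text classifier to flag policy sentences that disclose third-party data sharing. It is a minimal illustration, not a description of any particular research pipeline: the sentences, labels, and the choice of a TF-IDF plus logistic regression model are assumptions made for the example, whereas real systems are trained on large annotated corpora of policy text.

```python
# Hypothetical sketch: classify privacy-policy sentences by whether they
# disclose a third-party data-sharing practice. The sentences and labels
# below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_sentences = [
    "We share your location data with advertising partners.",
    "We may disclose personal information to third-party analytics providers.",
    "You can contact our support team at any time.",
    "This policy was last updated in January.",
]
train_labels = [1, 1, 0, 0]  # 1 = discloses data sharing, 0 = does not

# A simple baseline: TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_sentences, train_labels)

new_sentence = ["Device identifiers may be shared with our marketing partners."]
print(model.predict(new_sentence))  # e.g., [1] -> flagged as a sharing statement
```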
 
Second, we explore ways for researchers to discover how organizations process data, which can reveal whether companies keep their promises or act dangerously. Our work has revealed companies that surreptitiously record your screen while you use an app, track your precise geolocation hundreds of times per day, and collect data about sensitive matters like pregnancy and sexual orientation. We have found connected devices with always-on microphones that record their environment when they shouldn't, use voice interactions to target advertising, and continue to track users even when they try to opt out. Actual practices are often inconsistent with companies' official statements, and personal information is often used for non-functional purposes. These findings can assist enforcement agencies by providing tools and evidence of violations of data minimization or duty-of-loyalty requirements (e.g., in the CCPA and ADPPA), for example by detecting whether personal information is actually required to provide the functionality of a service or app. The same measurement tools can also be used to detect and block undesirable data flows.
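As a simplified sketch of the core detection step in this kind of measurement, the example below scans captured app traffic for identifiers that were deliberately seeded on a test device. Everything here is a placeholder: real studies intercept and decrypt traffic (for instance with a man-in-the-middle proxy) and must handle encodings, hashing, and obfuscation, none of which this toy version attempts.

```python
# Hypothetical sketch: scan captured (already-decrypted) app traffic for known
# personal identifiers. The requests, endpoints, and identifiers are invented
# placeholders for illustration.
from urllib.parse import urlparse

# Identifiers seeded on the test device so their exfiltration can be recognized.
KNOWN_PII = {
    "device_id": "a1b2c3d4-e5f6-7890",
    "email": "testuser@example.com",
    "location": "42.3601,-71.0589",
}

captured_requests = [
    {"url": "https://tracker.example.net/collect",
     "body": "id=a1b2c3d4-e5f6-7890&lat=42.3601"},
    {"url": "https://api.example.com/login",
     "body": "user=testuser@example.com"},
]

def find_leaks(requests, pii):
    """Report which seeded identifiers appear in each request's URL or body."""
    leaks = []
    for req in requests:
        payload = req["url"] + req["body"]
        matched = [name for name, value in pii.items() if value in payload]
        if matched:
            leaks.append((urlparse(req["url"]).hostname, matched))
    return leaks

for host, fields in find_leaks(captured_requests, KNOWN_PII):
    print(f"{host} received: {', '.join(fields)}")
```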
 
Finally, we detail how researchers can use their tools and findings to help with at-scale enforcement. Members of our team have exercised individual rights, brought lawsuits in their capacity as citizens, assisted in class action lawsuits, and gone public with their findings. We have worked with regulators to report violations, with journalists to broadcast them, and with app stores to prohibit privacy-violating apps, and we have provided tools for developers to understand their own and third-party software. We conclude with recommendations on how policymakers and enforcement agencies can leverage help from the privacy research community to better hold tech companies accountable.