Hardly a day goes by without the Federal Trade Commission announcing plans to clamp down on the tech industry. Its latest foray, a proposal for far-reaching rules to counter the bogeyman of “commercial surveillance,” comes like a great dark cloud: essentially hazy but portentous and sweeping.
The FTC’s notice describes no specific regulations, offering only a sweeping discussion of potential privacy harms and 95 questions put forward for comment. Even so, the commission appears to be suggesting stringent limits on common business practices like targeted advertising, automated decision-making, and the use of algorithms that may discriminate based on gender and age, among other things.
Unfortunately, the agency’s absolutist vision of privacy protection ignores the adage that “there ain’t no such thing as a free lunch.” Access to data can be a powerful source of competition and innovation, one that overly broad privacy regulations threaten to disrupt.
The FTC notice discusses at length what it calls “a small fraction” of the harms that might be associated with the commercial collection, use and dissemination of data. But regulating commercial data requires complex tradeoffs between the valuable flow of information in the market and whatever values might be served by restricting companies’ data access and usage. Indeed, there is mounting evidence that many privacy regulations are costly, lead to worse products and services, reduce competition, and lower innovation.
Our best evidence that stringent privacy regulations curb firms’ investments in high-tech industries comes from the European Union, which passed its sweeping General Data Protection Regulation in 2016. Research shows that the GDPR decreased European venture capital deals by up to 26 percent after its rollout — with consumer-facing services, healthcare and finance being particularly hard hit.
That might be an acceptable tradeoff if there were evidence that such regulations provide tangible benefits to consumers, but the record here is thin. The GDPR did not increase trust in digital data collectors. Likewise, consent boxes prompting users to accept digital cookies, which have become synonymous with the GDPR, don’t appear to help users understand what they are consenting to.
Meanwhile, the FTC overlooks the tremendous benefits that consumers enjoy from data-enhanced services. For example, improved ad targeting lets platforms display less obtrusive ads while making consumer search far more efficient.
Indeed, search-engine users prefer a mix of targeted ads and organic search results. Recommendation algorithms also increasingly help consumers to identify music, movies and other goods that suit their tastes.
Effective use of consumer data helps generate more ad-supported content, which consumers enjoy for free. All these things would be harder and more costly to obtain in a world of absolutist privacy protection.
Another problem is that excessive privacy protection may stifle competition. There is mounting evidence that Europe’s onerous privacy regulations have served to shield incumbents from smaller competitors. Much has been said about how Amazon uses data to compete against the sellers that use its platform. But true or not, what can’t be disputed is that Amazon’s data expertise lets it compete more aggressively against powerful incumbents like Walmart. Likewise, Meta’s use of data collected outside its main Facebook platform improves its ability to compete with Google on ads. These benefits are then passed on to consumers.
More fundamentally, firms using personal data to generate valuable insights is good for society. For instance, allowing pharmaceutical companies to use data from DNA test kits may lead to groundbreaking new treatments. There are tremendous benefits to consumer health and public health from the use of electronic health information.
In the end, the available evidence simply does not support creating the kind of privacy apparatus the FTC wants to deploy. If the commission wants to enact new privacy rules, it needs to do more than recite a litany of hypothetical harms that its regulations might address. It must lay out concrete and workable rules and show how its version of privacy protection would avoid the traps to which other initiatives have fallen prey.