
CFJ Letter to Senate Judiciary: When regulating data privacy, don't make America more like Europe

Dear Chairman Graham and Ranking Member Feinstein,

We write to you regarding your March 12 hearing “GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation.” We, the president and public policy director of the Committee for Justice (CFJ), are concerned that regulatory frameworks that attempt to address privacy concerns by relying on broad one-size-fits-all rules will inevitably hurt consumers. We believe it is vital to prioritize our nation's economic prosperity and preserve America's role as the leader in technological innovation by learning from the disastrous results of privacy regulations abroad.

Founded in 2002, CFJ is a nonprofit legal and policy organization that promotes and educates the public and policymakers about the rule of law and the benefits of constitutionally limited government. Consistent with this mission, CFJ advocates in Congress, the courts, and the news media about a variety of law and technology issues, encompassing administrative law and regulatory reform, online free speech, antitrust law, and data privacy.

Additionally, CFJ has a long history of leadership on the issue of federal judicial nominations and the confirmation process in the Senate. That includes highlighting how issues at the intersection of law and technology will be impacted by judicial appointments. For example, CFJ submitted a letter to the Senate Judiciary Committee explaining why the confirmation of Supreme Court Justice Brett Kavanaugh would be good for technological innovation and the economic growth it spurs.[1]

In recent years, CFJ has actively advocated for digital privacy protection in Congress and the federal courts.[2] Our focus is on innovation, free speech, and economic growth. We believe that restrictive new requirements or penalties for data collection and use are not only unwarranted but would also threaten the online ecosystem that has transformed our daily lives in the last few decades.

Last year, CFJ responded to the Federal Trade Commission’s request for comments regarding its Hearings on Competition and Consumer Protection in the 21st Century, as well as the National Telecommunications and Information Administration’s (NTIA) Request for Comments on Developing the Administration’s Approach to Consumer Privacy.[3] Our letter today is adapted from those comments.


Many of the recent privacy proposals would not protect consumers; instead, they would make America more like Europe. The United States’ economic growth and status as a global leader in innovation will depend on a thorough evaluation of economic risks when crafting our nation’s approach to consumer privacy. As calls for data privacy regulation in the United States echo those heard in Europe, it is important to remember the fate of the European Union’s digital economy at the hands of a strict regulatory regime. We should learn from its mistakes.

Broad, one-size-fits-all privacy rules would have negative consequences for every sector that makes use of data, and the ripple effects would be felt across the entire economy.[4] These impacts are already evident in Europe as a result of the EU’s implementation of the GDPR.[5] An earlier report commissioned by the U.S. Chamber of Commerce estimated that such rules could reduce EU GDP by 0.8 to 1.3 percent, resulting in a direct negative welfare effect of about $1,353 per year for a four-person household.[6]

Data protection policies such as opt-in rules could deter venture capital investment and strangle U.S.-based tech start-ups. These effects are not merely hypothetical. For example, following the implementation of the opt-in model mandated by the EU’s Privacy and Electronic Communications Directive (2002/58/EC), online ads became 65 percent less effective.[7] This is also one of the reasons for the scarcity of tech startups in Europe.[8] The inability to generate online revenue and develop new products is a roadblock to venture capital investment. Opt-in mandates are also redundant: a consumer who knows that privacy settings can be changed, yet chooses to leave them in place, has effectively given affirmative consent.

Data privacy concerns should not be confused with the constitutional right to privacy found in the Third, Fourth, and Fifth Amendments—which protect us from government intrusions—or even the common law and statutory protections available when a private actor coercively violates our privacy. The public debate often conflates the true privacy rights that protect us from involuntary intrusions by the government and private actors with proposed privacy policies affecting the data we voluntarily convey to tech platforms. This conflation has been made worse by the European Union, which has labeled its package of privacy policies as a fundamental right, even though many of those policies are at odds with the free speech and economic rights prized by Americans (for example, see the EU’s “Right to Be Forgotten”). This is a very important distinction to maintain.

Congress should pay particular attention to proposed state regulations that threaten to create a patchwork of regulations that could strangle new businesses and technologies with contradictory laws and enforcement. When faced with compliance and financial burdens, new technology companies—and the tax revenue and job creation they produce—tend to move to favorable regulatory environments. Since technology, by nature, cannot be confined within state borders, companies facing a burdensome patchwork of state regulations are more likely to choose to operate outside of the United States.

When crafting a data protection framework, it is especially important that our government understand the unique features of emerging technologies in order to avoid ill-suited or unnecessary regulations that would impede their adoption. For instance, the protection of privacy in AI systems can be facilitated by the “black box” nature of machine learning combined with careful handling of the training data sets used. If those data sets are properly disposed of once the learning phase is complete, the neural network captures the knowledge it needs to perform without preserving any of the individual data that could compromise privacy.

Public debate is disproportionately focused on large companies, but the vast majority of Internet companies are small firms and startups, including the very companies that might otherwise grow to compete with, and even supplant, the tech giants of today. Sweeping ex ante regulatory approaches like the GDPR and the recently passed California Consumer Privacy Act (CCPA) are likely to create an artificial imbalance in the competitive ecosystem in which many firms operate.[9] Unlike their resource-lean startup counterparts, large companies are far better situated to devote the labor and time needed to absorb the compliance costs imposed by broad data protection mandates such as the GDPR. This imbalance is likely to result in anticompetitive lock-in effects favoring incumbent firms.

Public opinion polls showing support for stronger data protections are misleading because they rarely confront consumers with the monetary and other costs of their choices. A 2016 study found that, despite most participants’ unease with an email provider using automated content analysis to provide more targeted advertisements, 65 percent of them were unwilling to pay providers any amount for a privacy-protecting alternative.[10] Such studies remind us that most consumers do not value data privacy enough to pay anything for it.


To fundamentally address the current privacy concerns about the Internet, we would need to start over from scratch. That is because the privacy problems have their roots in decisions made and directions taken decades ago concerning the Internet’s technical structure and the business model that supports most of the enterprises on the World Wide Web.

When the Internet was conceived and designed 50 years ago, the goal was to make the flow of data easy and virtually indiscriminate in both directions—that is, sending and receiving. The Internet privacy problem arises from the successful achievement of that goal. Contrast that with television and radio, which have a one-way flow, or traditional telephony, in which only a limited amount of information flows back to the service provider.

In the 1990s, when the World Wide Web emerged and made the Internet a household word, people wondered how the exploding number of websites would convert their popularity into profitability and sustainability. The answer turned out to be, for the most part, selling advertising. It was inevitable that websites would sell their competitive advantage—that is, access to user data—to advertisers, which provided the second necessary component of today’s privacy problem. With an open Internet architecture and a business model driven by user data, it was just a matter of time and growth until today’s controversies erupted.

That said, it is not feasible to start over from scratch. The open, two-way architecture of the Internet is baked in, and it is hard to see how any substantial change would be possible. Business models evolve slowly rather than abruptly, so an end to websites’ reliance on advertising driven by user data is not something we will see in the next decade, if ever. Because the two big enablers of today’s privacy concerns are here to stay, we are stuck with the technological ecosystem we currently have; if the United States is to continue its role as a leader in technological innovation and enjoy the economic prosperity that innovation creates, policymakers must work within that ecosystem.

Trying to reinvent the wheel through data privacy regulations would make the United States less great and more like Europe. It is best to proceed with caution and learn from the mistakes and failures of others abroad.


Curt Levey

President

The Committee for Justice

Ashley Baker

Director of Public Policy

The Committee for Justice

Sources & Notes

[1] Curt Levey and Ashley Baker, Letter to the Senate Judiciary Committee on the Nomination of Brett Kavanaugh to the Supreme Court, 4 Sept. 2018.

[2] See, e.g., amicus briefs filed in Carpenter v. United States, 11 Aug. 2017, and United States v. Kolsuz, 20 Mar. 2017; The Committee for Justice, Letter to Congress in Support of the Clarifying Lawful Overseas Use of Data (CLOUD) Act, 13 Feb. 2018.

[3] Ashley Baker, Comments Submitted to the National Telecommunications and Information Administration in the Matter of: Request for Comments on Developing the Administration’s Approach to Consumer Privacy, Docket No. 180821780-8780-01, 9 Nov. 2018; Ashley Baker, Comments Submitted to the Federal Trade Commission Regarding Hearings on Competition and Consumer Protection in the 21st Century, 21 Dec. 2018.

[4] Data minimization and purpose-limitation mandates make it far more difficult to transmit information between firms, industries, and national borders. (See, e.g., Sarah Wheaton, “5 BIG Reasons Europe Sucks at Curing Cancer,” Politico, 12 Oct. 2018.) The GDPR, for example, would have made it impossible for the Danish Cancer Society to conduct the study that helped dispel the myth of a correlation between mobile phone use and cancer. (See Patrizia Frei et al., “Use of Mobile Phones and Risk of Brain Tumours: Update of Danish Cohort Study,” BMJ, 20 Oct. 2011.)

[5] Daniel Lyons, “GDPR: Privacy as Europe’s tariff by other means?,” American Enterprise Institute, 3 July 2018.

[6] Matthias Bauer et al., “The Economic Importance of Getting Data Protection Right: Protecting Privacy, Transmitting Data, Moving Commerce,” European Centre for International Political Economy, report commissioned by the U.S. Chamber of Commerce, Mar. 2013, p. 3.

[7] Alan McQuinn, “The Economics of ‘Opt-Out’ Versus ‘Opt-In’ Privacy Rules,” Information Technology and Innovation Foundation, 6 Oct. 2017.

[8] Mark Scott, “For Tech Start-Ups in Europe, an Oceanic Divide in Funding,” The New York Times, 19 Jan. 2018.

[9] Susan E. McGregor and Hugo Zylberberg, “Understanding the General Data Protection Regulation: A Primer for Global Publishers,” Tow Center for Digital Journalism, Columbia University, Mar. 2018, pp. 37–38.

[10] Lior Jacob Strahilevitz and Matthew B. Kugler, “Is Privacy Policy Language Irrelevant to Consumers?,” The Journal of Legal Studies 45, no. S2, 9 Sept. 2016.

