  • Ashley N. Baker

CFJ Files Comments with NTIA on Developing the Administration’s Approach to Consumer Privacy

On Friday, November 9, the Committee for Justice (CFJ) responded to the National Telecommunications and Information Administration’s (NTIA) Request for Comments on Developing the Administration’s Approach to Consumer Privacy. CFJ's comments emphasize the need to prioritize economic prosperity and preserve the United States' role as a leader in technological innovation by learning from the disastrous results of privacy regulations abroad. Our comments can be downloaded here.

Key recommendations include:

  • Many of the recent privacy proposals wouldn’t protect consumers, but would make America more like Europe. The United States’ economic growth and status as a global leader in innovation will depend on a thorough evaluation of risks when crafting our nation’s approach to consumer privacy. As calls for data privacy in the United States echo those heard in Europe, it is important to remember the fate of the European Union’s digital economy at the hands of a strict regulatory regime. We should learn from their mistakes.

  • Data protection policies could deter venture capital investment and strangle U.S.-based tech start-ups. The EU’s Directive 2002/58/EC, which mandated an opt-in policy to obtain affirmative consent, is an unfortunate example of this. As a result of such measures, small companies have less money to invest in research and development for new products and services and may even shut down. Opt-in mandates are also illogical: a user’s knowledge that privacy settings can be changed itself acts as a form of affirmative consent.

  • Data privacy concerns should not be confused with the constitutional right to privacy found in the Third, Fourth, and Fifth Amendments—which protect us from government intrusions—or even the common law and statutory protections available when a private actor coercively violates our privacy. The public debate often conflates the true privacy rights that protect us from involuntary intrusions by the government and private actors with proposed privacy policies affecting the data we voluntarily convey to tech platforms. This conflation has been made worse by the European Union, which has labeled its package of privacy policies as a fundamental right, even though many of those policies are at odds with the free speech and economic rights prized by Americans (for example, see the EU’s “Right to Be Forgotten”). This is a very important distinction to maintain.

  • The Federal Trade Commission (FTC) already has the appropriate statutory authority to protect consumer privacy. The FTC should continue its role as the safeguard against unscrupulous data practices. Rushed attempts to implement a federal privacy policy are unnecessary since the FTC has proven to be an effective cop on the beat. As for changes to process, it could be helpful for the FTC to develop guidelines for determining when to bring an enforcement action, especially as the data ecosystem expands with the Internet of Things (IoT). However, this should only be done after careful evaluation of public input.

  • The Administration should pay particular attention to proposed state regulations that threaten to create a patchwork of regulations that could strangle new businesses and technologies with contradictory laws and enforcement. When faced with compliance and financial burdens, new technology companies—and the tax revenue and job creation they produce—tend to move to favorable regulatory environments. Since technology, by nature, cannot be confined within state borders, these companies are more likely to choose to operate outside of the United States.

  • When crafting a data protection framework, it is especially important that our government has an understanding of the unique features of emerging technologies in order to avoid ill-suited or unnecessary regulations that would impede their adoption. For instance, the protection of privacy in AI systems can be facilitated by the “black box” nature of machine learning combined with careful handling of the training data sets used. If those data sets are properly disposed of once the learning phase is complete, the neural network captures the knowledge it needs to perform without preserving any of the individual data that could compromise privacy.
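The disposal pattern described in the last bullet can be sketched in a few lines: a trained model retains only aggregate parameters that summarize the data, so the individual training records can be deleted once the learning phase ends. This is an illustrative toy example, not code from CFJ's filing; the data and the one-parameter linear model are hypothetical stand-ins for a real training pipeline.

```python
def train(records):
    # Least-squares fit of y ≈ w * x over (feature, label) pairs.
    # Only the single aggregate weight w survives training.
    return sum(x * y for x, y in records) / sum(x * x for x, _ in records)

def predict(w, x):
    # Predictions need only the learned parameter, not the raw records.
    return w * x

# Hypothetical "personal" training records as (feature, label) pairs.
records = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = train(records)

# Dispose of the raw data once the learning phase is complete...
del records

# ...the trained parameter alone still serves predictions.
print(predict(w, 5.0))  # → 10.0
```

The point of the sketch is that `w` is a lossy aggregate: no individual record can be read back out of it, which is the privacy property the bullet above relies on.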

 
