
Surf’s Up! FTC Proclaims Lifeguard Role in Monitoring Big Data



 Corporate counsel can glean much about the government’s views on the regulation of big data from a keynote address delivered by Edith Ramirez, Chairwoman of the Federal Trade Commission, on Aug. 19, 2013, to the Technology Policy Institute’s Aspen Forum in Aspen, Colorado. In it, she laid out her vision for the FTC’s role and obligations, as well as the responsibilities of big data collectors.


Here is some of what Chairwoman Ramirez had to say:


Addressing the privacy challenges of big data is first and foremost the responsibility of those collecting and using consumer information. The time has come for businesses to move their data collection and use practices out of the shadows and into the sunlight.


The FTC has used its unfairness authority against companies that fail to provide reasonable data security.  Last year, we sued the Wyndham hotel chain for poor data security practices that led to three data breaches in an 18-month period. Over a half million credit card files ended up in the hands of an identity-theft ring operating through domains registered in Russia. 


All told, the FTC has brought over 40 data security cases under our unfairness and deception authority . . . for failing to provide reasonable security safeguards.


Like a vigilant lifeguard, the FTC’s job is not to spoil anyone’s fun but to make sure that no one gets hurt. With big data, the FTC’s job is to get out of the way of innovation while making sure that consumer privacy is respected.


This phenomenal growth in storage and analytic power means that big data is no longer the province of a few giant companies, like large data brokers, banks, insurers, and health care providers. Big data is now, or soon will become, a tool available to all sectors of the economy.


[M]any uses of big data bring tangible benefits to consumers and businesses alike. And many uses of big data raise no threats to consumer privacy.  On the other hand, many firms use big data in ways that implicate individual privacy.  That data may reflect an individual’s health concerns, browsing history, purchasing habits, social, religious and political preferences, financial data and more. They may do so in the service of innovation and efficiencies that confer substantial benefits on consumers.  …the FTC’s role isn’t to stand in the way of innovation; it is to ensure that these advances are accompanied by sufficiently rigorous privacy safeguards.


Hazards We Must Avoid


Indiscriminate Collection of Data—One risk is that the lure of “big data” leads to the indiscriminate collection of personal information. The indiscriminate collection of data violates the First Commandment of data hygiene: Thou shall not collect and hold onto personal information unnecessary to an identified purpose.  Keeping data on the off chance that it might prove useful is not consistent with privacy best practices.


The Need to Ensure Meaningful Consumer Choice—A related concern is that some big data advocates insist that, because more data is always better, and because providing consumer choice may be especially challenging when it comes to big data, the time has come to reconsider limits on data collection. They contend that, to the extent that privacy protection is needed, the focus should be on after-the-fact use restrictions, not on limiting collection. Personal information should not be collected against consumers’ wishes.  Business leaders understand this. They know that the quickest way to squander consumer trust is to go behind consumers’ backs when collecting and using personal data.


Adding to the concerns is the reality that some collection of personal data is not, in fact, authorized by consumers. Consumers do, of course, often decide to share personal data. They post personal information on social networks. And in many instances they consent to the collection of personal data by businesses and service providers. But that consent is generally limited to the transaction at hand… .  Rarely, if ever, are consumers given a say about the aggregation of their personal data or about secondary uses that are not even contemplated when their data is first collected.


Information that is not collected in the first place can’t be misused. And enforcement of use restrictions provides little solace to consumers whose personal information has been improperly revealed. There’s no putting the genie back in the bottle.  Use restrictions are also hard to enforce. It is often difficult to identify the culprit when data is misused.


Data Breach—The risk of a data breach is also not trivial. After all, the larger the concentration of sensitive personal data, the more attractive a database is to criminals, both inside and outside a firm. And the risk of consumer injury increases as the volume and sensitivity of the data grows.  Firms that acquire and maintain large sets of consumer data must be responsible stewards of that information.   But stronger incentives to push firms to safeguard big data must be in place. The FTC has urged Congress to give the agency civil penalty authority against companies that fail to maintain reasonable security.


Behind-the-Scenes Profiling—Firms of all sorts are using consumer data in ways that may not just be contrary to consumers’ expectations, but could also be harmful to their interests. This problem is perhaps seen most acutely with data brokers—companies that collect and aggregate consumer information from a wide array of sources to create detailed profiles of individuals. The risk of improper disclosure of sensitive information is heightened because consumers know nothing about these companies and their practices are invisible to them.


Data Determinism—The involuntary revelation of sensitive personal information is an important concern but it is a risk that predates big data and is inherent in the collection and use of personal information. There is another risk that is a by-product of big data analytics, namely, that big data will be used to make determinations about individuals, not based on concrete facts, but on inferences or correlations that may be unwarranted.


The law of big numbers means that, if the algorithm is correct, in general the companies will be making the right call. An error rate of one-in-ten, or one-in-a-hundred, may be tolerable to the company. To the consumer who has been mis-categorized, however, that categorization may feel like arbitrariness-by-algorithm.


Steps Businesses Can Take to Harness the Power of Big Data While Safeguarding Consumers


Privacy by Design—Privacy by design means building privacy in as products and services are being developed.


Simplified Choice—We also need to go back to first principles and take consumer choice seriously. Too often the “notice” part of this process is overlooked, even though it is a prerequisite to meaningful choice. Consumers must be told who is collecting their data and what the data will be used for.  And choice mechanisms must be simple and easy-to-use.


Transparency—But the need for greater transparency is not limited to collection of data about consumers’ online behavior. A recurring theme I have emphasized—and one that runs through the agency’s privacy work—is the need to move commercial data practices into the sunlight. For too long, the way personal information is collected and used has been at best an enigma “enshrouded in considerable smog.” We need to clear the air.


Above are excerpts from the chairwoman’s prepared remarks.  The full text is available at: www.ftc.gov/speeches/ramirez/130819bigdataaspen.pdf

Disclaimer: The views and opinions expressed in this article are those of the individual sources referenced and do not reflect the views, opinions or policies of the organizations the sources represent.