AUGUST 29, 2013
Big data companies beware: the FTC plans to boost
enforcement in your industry
By Linn Foster Freedman and Kathryn M. Sylvia
On August 19, 2013, at the Technology Policy Institute’s Annual Conference in Aspen, Colorado, the Federal Trade Commission’s (FTC) Chairwoman Edith Ramirez warned companies that have access to “big data” to comply with data privacy and security regulations when collecting and
disseminating consumer data. In her “View from the Lifeguard’s Chair,” Ramirez said, “The [FTC] is like the lifeguard on a beach. Like a vigilant lifeguard, the FTC’s job is not to spoil anyone’s fun but to make sure that no one gets hurt.”
The FTC regulates big data companies for “unfair and deceptive acts or practices,” and the industry must also comply with sector-specific privacy laws, under which the FTC can regulate companies that collect information regarding consumers’ credit or from children under the age of 13. The FTC understands that big data is no longer “the province of a few giant companies” alone. Ramirez defines “big data” as “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze,” and says that big data will soon become “a tool available to all sectors of the economy”—sectors that will see continued and growing FTC enforcement.
All companies collecting, storing, or manipulating large data sets should pay close attention to FTC guidance, including the “hazards” and recommended principles below.
Hazards of big data
Ramirez identified the “hazards” of big data as follows:
1. “Indiscriminate collection of data”
Ramirez cautioned that the indiscriminate collection of expansive data “violates the First Commandment of data hygiene: Thou shall not collect and hold onto personal information unnecessary to an identified purpose.” She pointed out that “old data is of little value,” and as such big data companies should assess the value of maintaining old data that may yield unreliable information.
2. “The need to ensure meaningful consumer choice”
Some big data companies seek to expand data collection limits while relying instead on “after-the-fact use restrictions.” Ramirez fears that this approach “stands privacy protection on its head.” The collection of personal information should be a simple choice for consumers. If the collection of personal information is “inconsistent with the context of the transaction or the company’s relationship with the consumer,” the information should not be collected. Ramirez also raised concerns about consumer choice because consumers are “rarely, if ever” given an opportunity to consent to the aggregation of their personal information or to secondary uses of that information. As such, use restrictions cannot be the only privacy protection. Ramirez notes that “information that is not collected in the first place can’t be misused. And enforcement of use restrictions provides little solace to consumers whose personal information has been improperly revealed. There’s no putting the genie back in the bottle.”
3. “Data breach”
The concentration of big data poses a significant risk of data breach. Ramirez says that “the risk of consumer injury increases as the volume and sensitivity of the data grows.” Her analogy: “If water in a glass spills, the consequences generally are manageable. One wipes up the water and moves on. But if one builds a dam to store tremendous volumes of water and the dam breaks, the consequences can be quite serious.” Big data companies must act as “responsible stewards” to protect and preserve the privacy of consumer data. While the FTC already enforces Section 5 of the FTC Act, it also seeks civil penalty authority to use against big data companies that fail to protect data with adequate security and safeguards.
4. “Behind-the-scenes profiling”
Data brokers that collect and aggregate consumers’ information create extensive profiles of individual consumers, including sensitive information. Consumers are given no information about these data brokers, whose practices remain largely unknown to them. The FTC issued 6(b) Orders to nine data brokers last year to learn “about the nature and sources” of the consumer information collected, “how they use, maintain[,] and disseminate the information,” and the “extent to which they allow consumers to access and correct their information or opt out of having their personal information sold.” The FTC will issue a report before year end detailing its findings. Big data companies should be sure to review this report.
5. “Data determinism”
Big data companies can use the plethora of data collected to make predictions and
assumptions about individual consumers, “not based on concrete facts, but on inferences or correlations that may be unwarranted.” While this may eliminate misdirected advertising and make online shopping more convenient for consumers, these algorithmic predictions can suggest that an individual is a “poor credit or insurance risk, [or] [an] unsuitable candidate for employment or admission to schools or other institutions.” While companies that rely on this data may see little harm in minor incorrect predictions or assumptions, the consumer who has been miscategorized “may feel like [the victim of] arbitrariness-by-algorithm.” Ramirez does not suggest that the potential for mistakes means we should not collect data; instead she demands “transparency, meaningful oversight[,] and procedures to remediate decisions that adversely affect consumers.”
“Privacy-by-design” principles applied to big data
Ramirez offered advice to big data companies: follow the three core principles for data protection practices from the FTC’s 2012 Privacy Report:
1. “Privacy by design”
Build privacy into products and services during the development phase. Big data companies should conduct risk assessments to determine vulnerabilities and map the data they collect, use, and maintain to ensure that appropriate security protections and safeguards are built in from the very first prototype. If companies avoid security mishaps in the first place, there will be no need to mitigate them later.
2. “Simplified choice”
Tell the consumer what data is being collected, who is collecting it, and how the data is being used. From the consumer’s very first interaction with the collection of data, the consumer should have the opportunity to choose what, who, and how.
3. “Greater transparency”
“Clear the air” as Ramirez suggests. Tell the consumers exactly what information will be collected and exactly how it will be used. Ramirez says, “Transparency is the key to accountability, the key to responsible data collection and use, and the key to building consumer trust.”
FTC’s enforcement rights and enforcement actions
Under Section 5 of the FTC Act, the FTC has the ability to regulate these big data companies for “unfair and deceptive acts or practices.” As such, big data companies should be sure their privacy and security practices are transparent and that their policies and procedures provide adequate protections. Additionally, there are applicable sector-specific privacy laws, including the Fair Credit Reporting Act and the Children’s Online Privacy Protection Act (COPPA), under which the FTC can regulate big data companies that collect information regarding consumers’ credit or from children under the age of 13.
In the recent past, the FTC has brought enforcement actions alleging that Google, Inc. engaged in unfair competition practices; Facebook, Inc. engaged in deceptive account privacy practices; and MySpace, LLC made misleading claims about users’ privacy protections. The FTC recently brought an enforcement action alleging that Wyndham Hotels and Resorts, LLC had faulty security practices that led to three data breaches in an 18-month period. There have also been actions alleging that LexisNexis (as a division of Reed Elsevier Inc.) and Twitter employed failed security practices and procedures for protecting consumer information, and an action against ChoicePoint, Inc. over a consumer data breach. Additionally, the FTC issued subpoenas to nine data brokers last year, including Acxiom, Corelogic, Datalogix, eBureau, ID Analytics, Intelius, Peekyou, Rapleaf, and Recorded Future, to determine how these brokers collect and use consumer data. The FTC is expected to release a report on its findings before the end of the year.
It is clear that big data companies will see an increase in the number of these types of enforcement actions.
The influx of mass data collection and dissemination raises privacy concerns. While many uses of big data bring benefits that do not individually identify consumers, such as, says FTC Chairwoman Ramirez, “forecasting weather and stock and commodity prices; upgrading network security systems, and improving manufacturing supply chains,” there are many other uses that directly identify consumers and threaten their personal privacy. Ramirez expressed that
[t]here is little doubt that the skyrocketing ability of business to store and analyze vast quantities of data will bring about seismic changes in ways that may be unimaginable today. Unlocking the potential of big data can, for instance, improve the quality of health care while cutting costs. It can enable forecasters to make increasingly precise predictions about weather, crop yields, and spikes in the consumption of electricity. And big data can improve industrial efficiency, helping to deliver better products and services to consumers at lower costs. (from http://www.ftc.gov/speeches/ramirez/130819bigdataaspen.pdf)
However, Ramirez’s concern lies in the big data that identifies an individual’s health history, website browsing history, purchasing habits, financial information, and social, religious, and political affiliations.
Ramirez stated that while the capabilities of these big data companies are “transformative,” the challenges are not “novel or beyond the ability of our legal institutions to respond.” Even with the rapid advance of technology and expansive data collection, the FTC can still regulate these businesses and tame the field.
Ramirez further pointed out that “addressing the privacy challenges of big data is first and foremost the responsibility of those collecting and using consumer information,” and that “the time has come for businesses to move their data collection and use practices out of the shadows and into the sunlight.”
Ramirez expressed her desire to remove the division between the FTC and the Federal Communications Commission (“FCC”) when it comes to the “common carriers” exception. Currently, the FCC has jurisdiction over mobile carriers, and the FTC has jurisdiction over all other players except the common carriers. Ramirez states that the FTC “has long urged Congress to rectify this ancient allocation,” an allocation that must be revisited given the great rise in mobile technology and data privacy concerns. Not only will the FTC increase its scrutiny and enforcement against big data companies that already fall under its jurisdiction, but it may soon expand its umbrella and bring enforcement actions against mobile carriers who collect and disseminate big data.
Ramirez concluded her discussion: “Lifeguards have to be mindful not just of the people swimming, surfing, and playing in the sand. They also have to be alert to approaching storms, tidal patterns, and shifts in the ocean’s current. With consumer privacy, the FTC is doing just that—we are alert to the risks but confident those risks can be managed.” While the FTC encourages innovation and
technological advancement, “like the lifeguard at the beach,” the FTC “will remain vigilant to ensure that while innovation pushes forward, consumer privacy is not engulfed by that wave.” Big data companies should heed this warning and take a look at their privacy and security practices to ensure that they are “playing fair” to avoid the FTC’s whistle.
So, what does this mean for companies collecting, maintaining, using, and sharing data? Data privacy, security, and transparency should be high on your radar screen. An Enterprise Data Privacy and Security program is a sound place to start.
If you need assistance with developing or implementing your Data Privacy and Security program, please contact:
Linn Foster Freedman, Privacy & Data Protection Group Leader, at
email@example.com or (401) 454-1108