Lead generators, beware.
The issues of legitimate data use, reasonable expectations, disclosures, affirmative consent, data privacy compliance and lead generation best practices remain front and center at the Federal Trade Commission.
Among the most sensitive categories of data collected by connected devices are a person’s precise location and information about their health.
Smartphones, connected cars, wearable fitness trackers, “smart home” products and even the browser you are reading this on are capable of directly observing or deriving sensitive information about users. The FTC often considers the collection and combination of that data by connected devices and technology companies, followed by its monetization, to be an unprecedented intrusion.
The conversation about technology tends to focus on its benefits. But there is a behind-the-scenes irony that needs to be examined in the open: the extent to which highly personal information that people choose not to disclose even to family, friends, or colleagues is actually shared with complete strangers. These strangers participate in the often shadowy ad tech and data broker ecosystem, where companies have a profit motive to share data at unprecedented scale and granularity.
Unbeknownst to many consumers, connected devices, even when not actively in use, may be pinging cell towers, interacting with WiFi networks, capturing GPS signals and otherwise creating a comprehensive record of their users’ whereabouts.
The FTC is aware that this location data can reveal a lot about people, including where they work, sleep, socialize, worship and seek medical treatment. While many consumers may happily offer their location data in exchange for real-time crowd-sourced advice on the fastest route home, the FTC has expressed concern that consumers may think differently about having their online identity associated with the frequency of their visits to a therapist or cancer doctor.
Beyond location information generated automatically by consumers’ connected devices, millions of people also actively generate their own sensitive data, including by using apps to test their blood sugar, record their sleep patterns, monitor their blood pressure, or track their fitness, or by sharing facial and other biometric information to use app or device features. The potent combination of location data and user-generated health data creates a new frontier of potential harms to consumers, and of bases for regulatory enforcement and investigation.
Federal (and state) regulators have repeatedly expressed concern that the marketplace for this information is opaque and that once a company has collected it, consumers often have no idea who has it or what is being done with it. After it is collected from a consumer, does the data enter a vast and intricate sales floor frequented by numerous buyers, sellers and sharers?
The FTC has always considered the data aggregation and brokerage marketplaces – where information is collected from multiple sources and sold to marketers and other third parties – to be somewhat murky. According to the FTC, “these companies often build profiles about consumers and draw inferences about them based on the places they have visited. The amount of information they collect is staggering. For example, in a 2014 study, the FTC reported that data brokers use data to make sensitive inferences …”
According to the foregoing report, one data broker bragged to shareholders in a 2013 annual report that it had 3,000 points of data for nearly every consumer in the United States. In many instances, data aggregators and brokers have no interaction with consumers or the apps they are using. The FTC has expressed concern that consumers are left in the dark about how companies are profiting from their personal information.
Realize that regulators consider location and health information to be particularly sensitive, and that the potential misuse of such data is something that should be carefully considered with a qualified FTC compliance attorney.
The concerns federal and state regulators have expressed about the risk of misuse, and the resulting enforcement actions, have been far more than just theoretical.
In 2017, for example, the Massachusetts Attorney General reached a settlement with marketing company Copley Advertising, LLC, and its principal for allegedly using location technology to identify when people crossed a secret digital “fence” near a clinic offering abortion services. Based on that data, the company allegedly sent targeted ads to their phones with links to websites with information about alternatives to abortion.
The Massachusetts AG asserted that the practice violated state consumer protection law.
Recently, the FTC reached a settlement with Flo Health, alleging the company shared with third parties – including Google and Facebook – sensitive health information about women collected from its period and fertility-tracking app, despite promising to keep this information private.
The misuse of mobile location and health information exposes marketers to significant corporate and personal liability. Additionally, according to an FTC attorney, “criminals can use location or health data to facilitate phishing scams or commit identity theft. Stalkers and other criminals can use location or health data to inflict physical and emotional injury. The exposure of health information and medical conditions, especially data related to sexual activity or reproductive health, may subject people to discrimination, stigma, mental anguish, or other serious harms. Those are just a few of the potential injuries – harms that are exacerbated by the exploitation of information gleaned through commercial surveillance.”
Unequivocally, the FTC has stated that it is committed to using the full scope of its legal authorities to protect consumers’ privacy. “We will vigorously enforce the law if we uncover illegal conduct that exploits Americans’ location, health, or other sensitive data. The FTC’s past enforcement actions provide a roadmap for firms seeking to comply with the law.”
What should companies consider when thinking about the collection of confidential consumer information, including location and health data?
To begin with, understand that sensitive data is protected by numerous federal and state laws governing the collection, use, and sharing of sensitive consumer data, including many enforced by the FTC. Consult with an FTC lawyer.
The FTC has brought hundreds of cases to protect the security and privacy of consumers’ personal information, some of which have included substantial civil penalties. In addition to Section 5 of the FTC Act, which broadly prohibits unfair and deceptive trade practices, the Commission also enforces the Safeguards Rule, the Health Breach Notification Rule, and the Children’s Online Privacy Protection Rule.
“Claims that data is ‘anonymous’ or ‘has been anonymized’ are often deceptive. Companies may try to placate consumers’ privacy concerns by claiming they anonymize or aggregate data. Firms making claims about anonymization should be on guard that these claims can be a deceptive trade practice and violate the FTC Act when untrue.”
According to the FTC, significant research has shown that “anonymized” data can often be re-identified, especially in the context of location data. One set of researchers demonstrated that, in some instances, it was possible to uniquely identify 95% of a dataset of 1.5 million individuals using four location points with timestamps. Companies that make false claims about anonymization can expect to hear from the FTC.
The FTC cracks down on companies that misuse consumers’ data, and has done so for many years, particularly in the consumer credit and other high-risk marketing verticals.
As recent cases have shown, the FTC does not tolerate companies that over-collect, indefinitely retain, or misuse consumer data. Ad exchange OpenX recently paid $2 million for collecting children’s location data without parental consent. The Commission also took action against Kurbo/Weight Watchers for, among other things, indefinitely retaining sensitive consumer data. The settlement requires the company to pay a $1.5 million fine for allegedly violating COPPA, delete all illegally collected data, and also delete any work product algorithms created using that data.
Most recently, the FTC entered a final order requiring CafePress to pay redress and minimize its data collection because, according to the Commission’s complaint, it improperly collected and retained consumer data, and failed to respect consumers’ deletion requests, among other things.
The foregoing should be of interest to lead generators and digital marketers. Contact an experienced FTC lawyer if you are interested in discussing substantive legal and regulatory compliance considerations.
Richard B. Newman is an FTC defense lawyer at Hinch Newman LLP.
Informational purposes only. Not legal advice. May be considered attorney advertising.