In the internet of things, the Federal Trade Commission sees the possibility of flourishing new markets. But it also sees a prologue to Black Mirror: in a new report that probes the privacy implications of connected devices, the commission surveys a landscape of possible dystopian futures. Get ready for invasive marketing, unending consumer surveillance, invisible nudging, and new potential for government spying and novel forms of hacking.

The report seeks to identify the dangers the internet of things poses to consumers. How might information gleaned from a car GPS, fitness tracker, or smart refrigerator damage your creditworthiness, hurt your employability, or raise your insurance premiums? As a prelude to the development of best practices, and perhaps new legislation, the FTC aims to establish industry standards for data gathering and use.


While the FTC does not call for a law specific to networked devices, it does invite Congress to pass broad data-security legislation that would shield consumers, at least in part, from headline-grabbing data breaches like those that recently afflicted Sony, Target, and Home Depot. And it preaches a gospel of data minimization: companies ought to collect only as much data as they need, dispose of it when it's no longer required, and strip identifying information out of it when possible.
As a political document, the FTC's report has the sterile touch of evenhandedness. In seeking to illuminate the social ramifications of car trackers, healthcare wearables, and thinking thermostats, the commission had to simultaneously manage the regulatory anxiety of hardware manufacturers, analytics firms, and insurance providers. And in the Beltway balancing act of public interest on the one hand and capital "I" Innovation on the other, the FTC appears almost too timid to follow through on its own research.

That the pervasive collection of information from within our homes might create automated forms of profiling, discrimination, and exploitation seems, to the commission, merely secondary to reassuring strategic business interests. But the FTC has started a conversation that many privacy experts are eager to continue. Compared to Congress, which has an internet of things hearing scheduled for today — led by Senator John Thune (R-SD), who favors a laissez-faire approach to data privacy — the FTC will likely remain the stronger advocate for reform.


I spoke with David Rose, CEO of Ditto Labs and a researcher at MIT Media Lab, about the challenges connected objects pose to consumers and society. Rather than focus on insidious government supervision — of the thought-policing Snowden variety — Rose emphasized the growing powers of what we might call corporate surveillance. Connected devices enable corporations to amass ever more granular intelligence about customers to better predict our shopping habits. According to Rose, even as companies like Acxiom and Epsilon have been compiling data address by address on every family in America, with thousands of fields for every household, the unexpected uses of consumer data are evolving. And they're doing so in ways that aren't clear to the public.

Rose explained the business potential of tailored advertising and marketing inference using his own company's work. Ditto scans publicly available photographs on social media. And based on who's driving a Jeep, clutching Prada, slugging a Red Bull, or sporting Patriots gear, it can draw conclusions about what consumers prefer and what purchases they might be "susceptible" to in the near future. "We call it affinity data," Rose said.


Pam Dixon, the executive director of the World Privacy Forum, is concerned less with marketing novelties than with other unsettling uses of data harvesting. “I think we have to be very conscious about focusing on advertising as the horrifying thing here,” she told me. “It’s not the fact that someone sees an ad for a shoe. The problem is the secondary uses by data brokers and in predictive analytics.”

As a broad example, Dixon described a situation where, in exchange for a premium discount, an insurance provider might require customers to wear an activity tracker in their house, perhaps counting their steps, measuring their stress, or recording the things that they eat. For individuals with chronic diseases, who require wheelchairs, have PTSD, or were born with genetic disorders, this paradigm swallows people up and spits out systemic bias. The model assumes that “everyone is young and fit, or can become that way,” she said.

It's also unclear what boundaries exist for employers, insurance companies, and law enforcement to act upon personal data gathered by sensor-based devices. Questions like who owns the data, where it is stored, whether it's encrypted, how it can be used, and whether it can be repackaged and sold for purposes consumers never anticipated remain unanswered.

"The FTC, I think, would like to see legislation about this," Dixon said. "There's just no roadmap that says here's how we do that in the internet of things. There's just none."


Despite these criticisms, certain consumers stand to gain by allowing remote monitoring of their vehicles, homes, and bodies. Scott Peppet, a professor at the University of Colorado School of Law, has studied the economic incentives that propel the growing trend of voluntary disclosures. It’s a trend the internet of things is likely to accelerate.

Until now, firms have mined publicly available data to create rough profiles of consumers based on their purchases, credit history, driving record, and other factors. With its ubiquitous data-reaping technology, the internet of things lets firms simply verify information — with our permission. Companies get real-time access to how fast we drive, how often we brake, and how quickly we turn. Good drivers will get cheaper car insurance. Bad drivers will likely pay more.

“Simple self-interest will drive self-disclosure by those with favorable private information,” Peppet writes. The healthiest and wealthiest among us, those with the strongest credit lines and best reputations, will wish to signal their vitality and income to the insurers, banks, and retailers that will trade preferential treatment for verified consumer data.


But this "opt-in model" only looks like consumer choice, Peppet argues. In fact, he says, it's a mirage. Immense economic pressure could eventually coerce everyone into sharing their information with firms, since not disclosing your data will itself imply that you are undesirable, or that you carry a high burden of risk. The economic punishment of non-disclosure, Peppet suggests, will be worse than actually sharing your Chipotle fixation. "Eventually, even those with the worst private information may realize that they have little choice but to disclose to avoid the stigma of keeping information secret," he writes. In Peppet's future of full disclosure, privacy will unravel because opting out will be too costly. In order to compete for work, to become eligible for attractive loans, and to qualify for affordable insurance, we'll have to pay with our personal data. To participate, to make our lives fully visible to corporate actors, will be the only real option.