DECRYPTED
Editor’s Note: For most of us, the wide world of technology is a wormhole of dubious trends with a side of jargon soup. If it’s not a bombardment of startups and tech trends (minimum viable product! Big Data! billion-dollar IPO!), then it’s unrelenting feature-mongering (Smart Everything! Siri!). What’s a level-headed guy with a few bucks in his pocket supposed to do? We’ve got an answer, and it’s not ⌘+Option+Esc. Welcome to Decrypted, a new weekly commentary on tech’s place in the real world. We’ll spend some weeks demystifying and others criticizing, but it’ll all be in plain English. So take off your headphones, settle in for something longer than 140 characters and prepare to wise up.
Last month, New York Times writer Ron Lieber wrote about his experience letting State Farm track his driving as part of its usage-based insurance program. These systems are still in their infancy, but they let drivers lower their insurance premiums by driving safely, as determined by data points like acceleration, velocity and g-forces during turns. “For me, it turned driving into a game that could yield real money through safer behavior,” wrote Lieber.
Driving data is one part of the new, so-called “quantified self”, in which car sensors, home thermostats and omnipresent on-person devices gather objective information to create a viewable, digital portrait of someone’s life. This kind of data collection is quickly catching on because it’s an effective, seemingly objective way to adjust premiums, target advertisements and generally run a consumer-facing business better. For early adopters of any tracking device, like Lieber, the immediate incentives are clear: understanding and motivation. In the case of auto insurance, this digital portrait has so far led “participants in the program [to] get an average of 10 to 15 percent off their premium”. But for consumers, there are also troubling implications looming: how a person’s digital portrait can be used, and how securely that data is kept.
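To make the mechanics a little more concrete, here’s a rough sketch of how a usage-based program might turn logged trips into a discount. The thresholds, weights and field names are hypothetical and purely illustrative; they are not State Farm’s actual formula.

```python
# Hypothetical sketch of usage-based insurance scoring.
# None of these thresholds or weights come from State Farm;
# they only illustrate how sensor data could become a discount.

def driving_discount(trips, base_premium):
    """Estimate a premium discount from logged trip data.

    Each trip is a dict with counts of hard-braking events,
    hard-acceleration events and high-g turns, plus miles driven.
    """
    events = sum(t["hard_brakes"] + t["hard_accels"] + t["high_g_turns"]
                 for t in trips)
    miles = sum(t["miles"] for t in trips)
    events_per_100mi = 100 * events / max(miles, 1)

    # Fewer risky events per 100 miles means a bigger discount,
    # capped at 15% to mirror the 10-15% range Lieber reports.
    discount_rate = max(0.0, min(0.15, 0.15 - 0.01 * events_per_100mi))
    return base_premium * discount_rate


trips = [
    {"hard_brakes": 1, "hard_accels": 0, "high_g_turns": 2, "miles": 120},
    {"hard_brakes": 0, "hard_accels": 1, "high_g_turns": 0, "miles": 80},
]
print(f"Discount on a $1,000 premium: ${driving_discount(trips, 1000):.2f}")
```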
The worries aren’t exactly new. At the advent of popular location services such as Foursquare, the thrill of seeing where you’d been and keeping tabs on your friends was a draw soon tinged by the risks of oversharing (neatly summed up by the aptly named Please Rob Me, a site that shows people how the information they post online can be used against them). Now, in an exact parallel, fitness trackers are gaining popularity because they help users visualize and track their fitness and compare themselves to their friends, not to mention the motivation inherent in having every one of their steps counted (doubters need look no further than David Sedaris’s experience). But while the danger of being “located” appeals readily to a person’s hard-wired sense of caution, the dangers inherent in the recent rise of tracking health data with wearable devices and smartphone apps are more real, and much more insidious.
To see these dangers, look again at the car insurance example. Currently, according to Lieber, there’s no penalty for dangerous driving for those who opt into the program, only discounts for safe driving. But as more drivers agree to share their data, a built-in relative cost for “private” driving emerges, since rates stay flat for holdouts while participants’ premiums fall. There’s also the potential that holdouts’ premiums will rise if they share characteristics (age, race, income level, location) with drivers who have installed sensors in their cars and proven themselves risky.
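Some back-of-the-envelope arithmetic, using the 10 to 15 percent discount range cited above, shows how a “discount-only” program still leaves holdouts paying more in relative terms. The numbers are illustrative, not drawn from any insurer.

```python
# Illustrative arithmetic only: how a "discount-only" program still
# creates a relative cost for drivers who decline to share data.

base_premium = 1000          # same sticker price for everyone
tracked_discount = 0.12      # within the 10-15% range cited above

tracked_pays = base_premium * (1 - tracked_discount)   # $880
holdout_pays = base_premium                            # $1,000

relative_penalty = holdout_pays - tracked_pays
print(f"A holdout pays ${relative_penalty:.0f} more than a tracked driver "
      f"for the same coverage ({relative_penalty / tracked_pays:.1%} extra).")
```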
The same logic could ostensibly carry over into other, more personal areas. Wearables from companies like Fitbit and Jawbone have become an increasingly popular way for health-conscious consumers to capture data about themselves. But what those companies do with that data is up to them. In a series on privacy concerns held last spring, the FTC found that the “12 [health] apps tested transmitted information to 76 different third parties”, sending consumer health metrics along with identifying characteristics. Those third parties include data brokers, who keep tabs on millions of Americans.