How Big Data Exposes Your Values

Editor's note: The following was written by Kord Davis from a post originally at KordIndex. Kord is the author of a forthcoming book on Ethics of Big Data.  - bg

At AT&T Labs in Florham Park, New Jersey, big data is being used to analyze the traffic and movement patterns of people through the data generated by their mobile phones. By analyzing the flow of mobile devices from cell tower to cell tower, the research team realized they could understand deep patterns in how people move through urban environments. And they wanted to use those insights to improve traffic flow and inform better urban planning, not to improve their marketing.

But, of course, AT&T, along with Verizon, Google, TomTom, Navteq, and several companies that help retail malls track the traffic patterns of shoppers, wants very much to use that information to generate new streams of revenue. The question of privacy is top of mind (especially as the distinction between anonymized and personally identifying information becomes more difficult to maintain), but the question of ownership is equally compelling.

Since people buy their mobile devices, does the data generated by the use of those devices belong to the individual device owners, or to the company that owns and maintains the technological infrastructure that makes that usage possible?

The Electronic Frontier Foundation offers an interesting metaphor in response to this question:

" ‘Big data’ is the mantra right now. Everyone wants to go there, and everyone has these stories about how it might benefit us," said Lee Tien, senior staff attorney with the Electronic Frontier Foundation, a San Francisco-based nonprofit organization specializing in free speech, privacy and consumer rights.
"One of the things you learn in kindergarten is that if you want to play with somebody else’s toys, you ask them," Tien said. "What is distressing, and I think sad, about the big data appetite is so often it is essentially saying, ‘Hey, we don’t have to ask.’ "  For more see

Google explicitly states that they “don’t sell [their] user’s personal information.” However, they make no statement about who owns that information in the first place, which leaves the door wide open for them to use it in their business model (notably the sale of online advertising) without denying or rejecting your claim to it. And while Google very visibly provides information about how to “liberate” your data, it has become common knowledge that the vast services Google provides to millions of people every day are paid for, at least in part, by a tacit agreement: Google can use some of the data created by your use of their products to generate revenue. The question remains open, however, of what exactly distinguishes “personal information” from the set of all information Google knows about you, which, when combined in the right way, could expose enormous amounts of personal information.

In many ways, an organization’s business processes, technical infrastructure configuration, and data handling procedures can be interpreted as a manifestation of its values. (Thanks to Dion Hinchliffe for this intriguing insight.)

Seen this way, values are inherently expressed by these data handling practices. While it might not be completely possible to reverse-engineer a company’s values by deconstructing its data handling practices, it is certainly possible to learn what was considered important enough to include simply by reading the policy statement.

And it is fair to assume that, in the absence of any particular consideration being expressly included in the policy statement, that consideration was deemed not important enough to include. Without additional information, it’s impossible to know exactly what was considered but ultimately not included, or even what those conversations were like. But we can know what ultimately did and did not make it into those statements, and infer a reasonable understanding of what the originating organization deems important.

And in this age of increasing scrutiny and understanding on the part of consumers and users of information technology, what your organization values is inherently demonstrated by your data handling practices.

This post is excerpted from Ethics of Big Data.

About Kord Davis

Kord Davis is a former Principal Consultant with Cap Gemini and has spent nearly 20 years providing business strategy, analysis, and technical consulting to over 100 organizations of all sizes including: Autotask, Microsoft, Intel, Sisters of Mercy Healthcare, Nike, Bonneville Power Administration (BPA), Northwest Energy Alliance (NEEA), Bill & Melinda Gates Foundation, Western Digital, Fluke, Merix, Roadway Express, and Gardenburger. Integrating a professional background in telecommunications and an academic background in philosophy, he brings passionate curiosity, the rigor of analysis, and a love of how technology can help us do the things we really want to do better, faster, and easier. A formally trained workgroup facilitator, he holds a BA in Philosophy from Reed College and professional certifications in communication, systems modeling, and enterprise transformation.