Privacy is the third rail of the cloud. On one hand, the ease of sharing information and the power of analytics have produced extraordinary value for consumers, as well as great business models for companies that serve those consumers. On the other hand, people have good reason to worry about the unintended consequences of over-sharing.
When I attended the O’Reilly Strata New York Conference in September, I had the pleasure of hearing Intelius’s Jim Adler talk about being his company’s “accidental chief privacy officer”. Intelius’s main product is people search — an area that naturally raises privacy concerns, especially since Intelius aggregates and publishes information about people from databases of public records, eroding a history of “privacy through difficulty”. Impressed with Jim’s talk at Strata, I persuaded him to deliver a similar talk at LinkedIn, the video of which you can find above. You can also find his slides on SlideShare.
Jim brings nuance to the discussion of privacy — nuance that discussions of online privacy often lack. For example, he responded to the recent controversy about social networks’ “real names” policy with a measured post entitled “Nyms, Pseudonyms, or Anonyms? All of the Above”.
Jim appropriately opened his talk by disclosing a personal example. He shares his name with a more prominent personal injury lawyer who dominates search results for that name, raising the risk of taint by association. Intelius’s core technical problem is to cluster the inputs from the sources it aggregates, mapping each person to exactly one record in its database.
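To make that clustering problem concrete, here is a minimal sketch of entity resolution using union-find. The matching rule (same normalized name plus a shared address) and the record fields are hypothetical illustrations, not Intelius’s actual method:

```python
# Sketch of record clustering (entity resolution) via union-find.
# The match rule below -- same normalized name and at least one shared
# address -- is a hypothetical stand-in for a real matching model.

def normalize(s):
    return " ".join(s.lower().split())

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

def union(parent, i, j):
    parent[find(parent, i)] = find(parent, j)

def cluster_records(records):
    """Group records that likely refer to the same person."""
    parent = list(range(len(records)))
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = records[i], records[j]
            same_name = normalize(a["name"]) == normalize(b["name"])
            shared_addr = set(a["addresses"]) & set(b["addresses"])
            if same_name and shared_addr:
                union(parent, i, j)
    clusters = {}
    for i in range(len(records)):
        clusters.setdefault(find(parent, i), []).append(records[i])
    return list(clusters.values())

records = [
    {"name": "Jim Adler", "addresses": ["123 Main St"]},
    {"name": "jim adler", "addresses": ["123 Main St", "9 Oak Ave"]},
    {"name": "Jim Adler", "addresses": ["55 Elm Rd"]},  # likely a different person
]
print(len(cluster_records(records)))  # prints 2: the first two records merge
```

Note the asymmetry of the errors: merging two different people into one record (as in Jim’s lawyer namesake) is usually far more damaging than splitting one person across two records, which is why a conservative match rule matters.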
Jim went on to note that we are at a stage in the privacy debate where we are likely to see more regulation. He makes a few key observations:
- Social norms, which form the basis of our laws and regulations (the notion of a “reasonable expectation of privacy”), have changed suddenly, leading to a “privacy vertigo”: the whole world now feels like a small town.
- Sharing is a gateway from private to public, which often leads to violation of expectations. This problem is not new, but the efficiency of online sharing dramatically amplifies the unintended consequences of sharing. It is crucial that the parties involved in sharing data also have shared expectations around how that data will be used or disclosed.
- We need to distinguish between data use and data access, and not try to regulate data use with data access regulations. He cites the Fair Credit Reporting Act as one of the most inspired laws of the last 40 years for regulating data use. If you don’t have time to listen to the whole talk, I recommend jumping to 25:12, where he discusses this law in detail.
There’s a lot more in the talk, so I’m not going to try to summarize it all here. I strongly encourage you to check out the video (which includes lengthy Q&A) and the slides. Better yet, let’s use the comments to discuss!