In light of the recent WikiLeaks saga and the various leaks that have plagued my former employer, I was musing the other day about whether leaks are inevitable as an organization grows.
I started off by considering a model where each individual in an organization leaks a particular piece of sensitive information with a constant probability p, and where acts of leakage are independent and identically distributed events. Under this model, the probability that at least one of n people leaks is 1 − (1 − p)^n. Now let’s consider what value of p leads to a 99% probability of leakage in an organization of n = 20,000 people. It’s less than 1/4000. In other words, even if each person in an organization can keep a secret with 99.98% reliability, almost all secrets will be leaked.
Using this same value of p with n = 900 (roughly the size of my current employer) yields less than a 20% chance of leakage — certainly not a zero probability, but much closer to zero than to one. And at n = 90 — the upper end of what I’d consider a startup — the probability of leakage drops to 2%. Based on this crude analysis, the ability to keep secrets drops very rapidly as organizations enjoy the growth that comes with success.
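The arithmetic behind these figures can be sketched in a few lines of Python (the function name is mine, not from the post; the model assumes independent, identically distributed leak events):

```python
def leak_probability(p: float, n: int) -> float:
    """Chance that at least one of n people leaks, assuming each
    person leaks independently with probability p."""
    return 1 - (1 - p) ** n

# Solve for the per-person rate p that yields a 99% chance of
# leakage at n = 20,000:  1 - (1 - p)**20000 = 0.99
p = 1 - 0.01 ** (1 / 20000)   # ~0.00023, i.e., less than 1/4000

print(leak_probability(p, 20000))  # ~0.99
print(leak_probability(p, 900))    # ~0.19 -- under 20%
print(leak_probability(p, 90))     # ~0.02 -- about 2%
```

Note how steeply the curve rises with n: the same per-person reliability that makes leakage a near-certainty at 20,000 people leaves a 90-person startup with only a 2% risk.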
Moreover, p is likely to be positively correlated with n — that is, individuals in larger organizations are more likely to leak sensitive information. Many people in larger organizations have less actual and perceived stake in the organization’s success than those in smaller ones. Also, it is difficult to sustain grueling hiring standards — particularly cultural ones — as an organization grows.
So what is an organization to do? If the above model is even close to accurate, then I can see four options:
1) Don’t grow.
Yes, I’m serious. Not every idea inspires a billion-dollar business, and not every company should grow beyond a hundred people. Growth has costs that offset its benefits, and the inability to keep secrets may be a significant cost for organizations whose competitive advantage depends on proprietary intellectual property. The largest hedge funds each have about 1,000 employees, and most are much smaller. Secrecy is not the only consideration, but it’s certainly a consideration.
2) Share less with your employees.
If you can’t reduce p, you can at least reduce n by sharing secrets less widely. Traditional organizations only share sensitive information within a tight inner circle. Even Google, known for sharing almost everything with its employees, keeps tighter control over the details of search result ranking. This approach, however, comes at a cost: it signals to employees that they cannot be trusted. Moreover, if employees discover secret information through rumor, they may feel less responsible for maintaining secrecy than if they had been entrusted with that information.
3) Investigate leaks and punish leakers.
Some organizations succeed better than others at rooting out leakers and punishing them. In economic terms, it makes sense to discourage undesirable behavior through strong disincentives. Note, however, that leakers rarely gain anything tangible in exchange for their leaks, and indeed are often acting irrationally in strictly economic terms. So I’d caution against any approach that assumes human rationality. A better approach may be to detect or prevent leaks through technology (e.g., packet analyzers), but see the previous comment about making employees feel they cannot be trusted.
4) Keep fewer secrets.
A prominent CEO recently said, “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” Yes, I’m taking the quotation out of context, but I’d like to offer a variant: if your organization’s success depends on something that you don’t want anyone to know, maybe you should reconsider your business model. Less glibly, you should avoid unnecessary dependence on secrecy, and you should avoid labeling all corporate information as secret, since doing so desensitizes employees to the risks of disclosure.
Conclusion? As Ben Franklin said, “Three may keep a secret, if two of them are dead.” Organizations can and do manage to keep secrets. But it’s hard to fight human nature, and better not to rely on winning that fight.
7 replies on “The Secret May Be To Keep Fewer Secrets”
The question is: what is the minimum number of memos/emails that need to be protected?
I would argue that for many organizations that minimum is still quite high, so your option number 4 is not entirely practical.
In this category are those candid internal communications which by their very nature are not meant for public consumption. I don’t see how any organization can have any honest discussions if everyone knows their comments are very likely to become available to anyone with an internet connection.
Some other drastic options you haven’t mentioned are to
(5) Avoid electronic communications altogether.
(6) Periodically destroy records.
(7) Adopt draconian policies à la the CIA or NSA (all information kept on work machines, strict firewalls to the outside world, etc.)
None of these are completely satisfactory.
I think it’s already the case that anyone writing an email anticipates a possibility of its being widely forwarded — especially if it’s an email with multiple recipients. Depending on thousands of people knowing something but keeping it secret is at best a high-risk gamble. Sometimes it works (e.g., Googlers kept the existence of Google’s self-driving cars secret until the company made an official disclosure). But the more common case is that at least someone in an organization that large leaks sensitive information.
As for your other options, I’d consider them variations of what I suggested in 3): “detect or prevent leaks through technology” — or, as you’re getting at, a lack of technology. In practice, making the spread of information within a company less efficient (i.e., using complexity as friction, as Lance aptly puts it) starts to sound like 2).
I like Dick’s “fake document” idea — I thought about it, but not so formally. The problem is that, as Dick notes, the fakes would have to seem to be real, or they would not be of any use. I’m not convinced it’s that easy to create compelling misinformation — if it were, I suspect we’d see far more of it pervading the financial markets.
“I’m not convinced it’s that easy to create compelling misinformation — if it were, I suspect we’d see far more of it pervading the financial markets.”
Due to high frequency trading, perhaps every single transaction trades on some amount of misinformation.
Point taken — I hadn’t really considered time as a variable. I concede it’s possible — perhaps even easy — to disseminate misinformation that does real damage before it is debunked, e.g., the UAL bankruptcy story a couple of years ago. But I think the points in my post, including the difficulty of creating compelling misinformation, apply to less time-sensitive information.
In Security Engineering, the observation that secrets don’t last, especially in a competitive environment, is quite ancient.
It is called “Kerckhoffs’ Principle,” from Auguste Kerckhoffs’ essay on military cryptography published in 1883.
Not claiming originality here, but it seems this is a lesson that bears repeating. Thanks for the citation — Wikipedia link here for anyone interested.