Can We Build a Distributed Trust Network?

http://blip.tv/play/AYHOqGwC

Mathew Ingram posted an interview with Craig Newmark (the Craig of craigslist fame) in which the latter argued that what the web needs is a “distributed trust network” to manage our online reputations. As it happens, this is an idea that has occupied me for several years. So I figured it was about time that I shared my thoughts on the subject.

When we think of how trust works online, two of the most prominent examples are Google’s PageRank measure and eBay’s feedback scores. But neither of these measures addresses what I think Craig has in mind. PageRank is a great way of using citation analysis to determine the most authoritative pages, but trust in a page should consider its out-links (i.e., can we trust the page not to point us to untrustworthy ones?) and not just its in-links. eBay’s feedback scores have a different problem: they count positive and negative ratings without considering the social network of buyers and sellers–an approach that is vulnerable to fraud through shill ratings. LinkedIn recommendations have a similar weakness if viewed in strictly quantitative terms, but the potential for abuse is mitigated by the endorsements being signed–and by their being more than just binary or numerical ratings. Incidentally, here’s a site you can use if you’re too lazy to actually write the recommendations yourself.

But I digress. Propagation of trust does seem like the perfect application to build on top of social networks. Consider any problem that involves getting advice to inform a decision. If we regularly solicit advice from our first-degree connections, then we should be able to learn over time whose advice we can trust. We can then vouch for these connections, which offers the connections who trust us a basis for trusting their second-degree connections through us. And so forth through our social network. Of course, trust is not irrevocable: loss of trust should propagate similarly.
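
To make this concrete, here is a minimal sketch of how such propagation might work–the graph, the decay factor, and the function name are all illustrative assumptions on my part, not a finished design. Each of us assigns direct trust to first-degree connections; inferred trust in anyone further out is the best product of edge trusts along a path, attenuated per hop so that trust weakens with distance (and revoking or lowering an edge immediately weakens everything that was inferred through it):

```python
# Minimal sketch of trust propagation over a social graph (illustrative only).
from heapq import heappush, heappop

def propagated_trust(direct_trust, source, target, decay=0.5):
    """Infer `source`'s trust in `target`.

    direct_trust: dict mapping person -> {connection: trust in [0, 1]}.
    Trust along a path is the product of edge trusts, multiplied by
    `decay` for each hop beyond the first; the best path wins.
    """
    frontier = [(-1.0, source)]   # max-heap via negated trust values
    best = {source: 1.0}
    while frontier:
        neg_trust, person = heappop(frontier)
        trust = -neg_trust
        if person == target:
            return trust
        for friend, edge in direct_trust.get(person, {}).items():
            hop_decay = 1.0 if person == source else decay
            candidate = trust * edge * hop_decay
            if candidate > best.get(friend, 0.0):
                best[friend] = candidate
                heappush(frontier, (-candidate, friend))
    return 0.0  # no path: a stranger earns no inferred trust

# Alice trusts Bob directly; Bob vouches for Carol.
graph = {"alice": {"bob": 0.9}, "bob": {"carol": 0.8}}
print(propagated_trust(graph, "alice", "carol"))  # 0.9 * 0.8 * 0.5 = 0.36
```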

I’ve talked about this problem with two of the leading experts on social networks, Jon Kleinberg and Prabhakar Raghavan, and as far as I know no one has built a system along these principles. In economic terms, I envision a system where a person’s reputation truly is his or her coin. One person might think of bribing another to exploit the latter’s established reputation, but a rational person with a strong reputation would demand an exorbitant bribe to put that reputation at risk.

Of course, a lot of information would have to propagate throughout the social network–and be stored–for this system to work. Regardless of how the information is abstracted, such a reputation index would raise thorny privacy issues. Indeed, I don’t know if we can build a reputation system that is entirely privacy-preserving, since reputation is an inherently public mechanism. In addition, any such system would have to consider the implications of defamation laws. These are some major hurdles!

Nonetheless, I agree wholeheartedly with Craig that a distributed trust network could be “the killingest of killer apps”. I just hope we can find a way to build and use it!

Note: Chris Rines suggested I look at Advogato’s Trust Metric, and a quick investigation led me to the Wikipedia entry for trust metric. Looks like I have some homework to do!

By Daniel Tunkelang

High-Class Consultant.

17 replies on “Can We Build a Distributed Trust Network?”

One problem is that negative statements are almost always issued with greater frequency and ferocity. A single “hate site” can overwhelm a legion of positive comments, especially if the owners of the site are fervent haters of their target and skilled in SEO, which the average person isn’t. These sites typically moderate their comments, so it is impossible to expose their biases and misleading statements.

Joe, Stack Overflow’s system may work better than eBay’s, but I think that it has the same vulnerabilities in theory; there is simply less incentive for abuse than on eBay.

Neil, you’re right that abuse can take on either polarity. That’s why it’s essential that trust not simply be a matter of counting the votes of strangers. A friend pointed me to WikiTrust, which made a great point in its FAQ: “When a user A reverts a user B, the reputation of B suffers in proportion to the reputation of A. Vandals usually have no reputation (if they are anonymous), or very low reputation, so the reputation of B would suffer only minimally.” I think the same principle should apply to the problem you describe. Even better would be a mechanism to propagate trust through the social network.
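
A toy version of that WikiTrust rule–with a made-up penalty rate and function name, purely to illustrate the quoted principle rather than WikiTrust’s actual implementation–might look like this:

```python
def apply_revert(reputation, reverter, reverted, penalty_rate=0.1):
    """When `reverter` reverts `reverted`, the reverted user's reputation
    drops in proportion to the reverter's reputation. A revert by a
    zero-reputation (e.g. anonymous) account costs the reverted user nothing."""
    penalty = penalty_rate * reputation.get(reverter, 0.0)
    reputation[reverted] = max(0.0, reputation.get(reverted, 0.0) - penalty)

rep = {"veteran": 9.0, "newcomer": 2.0, "vandal": 0.0}
apply_revert(rep, "veteran", "newcomer")  # newcomer: 2.0 -> 1.1
apply_revert(rep, "vandal", "veteran")    # veteran unchanged at 9.0
print(rep)
```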

I have been working on this problem as well, after experimenting with reputation networks and true value over the last 5 years. What’s missing in most reputation rating systems is a sense of RARITY–we never narrow things down to those we trust the most, so every endorsement ends up counting the same. True trust is a rare gift that we share with only a relatively small number of people compared to the vastness of our networks. Ping me off-blog to talk more about this design and how it works and links with the rest of our social media landscape–I got as far as designing it and finding developers ready to build it out.

One way to do this is on an account basis, à la eBay, craigslist, etc. The other is for people to actually be the same person online that they are offline, responsible for their actions in the real world. How about a network where you could only enter as yourself, authenticated via fingerprinting? That way, when you ding someone or prop them up, at least we can be sure the ratings are coming from real people. It could be anonymous, but validated, like our voting system.

We need the laws that build systems of trust in the real world to be reworked for the digital era, laws that make people responsible for their online behavior. Rules of interaction are the cornerstone of effective communities.

For the sake of trust networks, I find it helpful to think of trust as three different types:

1. Interpersonal (Are we on the same team? Are our incentives aligned? Do we have each other’s backs?)
2. Competence (Is this person capable? What are the failure rates of their products?)
3. Consistency and transparency (Can we trust these numbers and statistics? Are they a true reflection of reality? Is this pattern predictable?)

Depending on the relationships and situations involved and the work that’s done, people will get different scores on each.

Evonne, I agree re: the strongest circle of trust, but I’m actually interested in extending that circle effectively. For example, I’d like to have something in between an endorsement from a close colleague and one from a total stranger who is a completely unknown quantity to me.

Leonard, I’m a big fan of non-anonymity, and I alluded to its virtues for LinkedIn recommendations. But public disclosure can have a chilling effect on sincere negative feedback. Ideally we’d have authentication without public disclosure of judgements–only their propagation through the social network in a way that sufficiently protects privacy. Assuming this is possible to achieve!

Leonard, you might want to check out this post by Adina Levin entitled “Trust is Contextual” (via Ali Sohani).

I think authentication without public disclosure of judgements is achievable. I think the anonymous comments just have to be metered somehow, much like we each have one vote. You need to be a real person to join and get authenticated when you go on, and then you only have a certain amount of anonymous flame tokens you can use for a certain period. Or maybe, rather than flames, you have only a certain amount of objective (multiple choice, yes/no, 1-10 rating) scores you can give, so there’s no slander, only opinion. Of course, you could also provide public subjective responses in combination.
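
For what it’s worth, that metering idea can be sketched as a simple per-period token budget–the budget size, window, and class name below are made up purely for illustration:

```python
from collections import defaultdict
from datetime import datetime, timedelta

class RatingBudget:
    """Each authenticated user gets a fixed number of anonymous ratings per
    period; once the tokens are spent, further ratings are rejected until
    old ones age out of the window."""

    def __init__(self, tokens_per_period=5, period=timedelta(days=30)):
        self.tokens_per_period = tokens_per_period
        self.period = period
        self.spent = defaultdict(list)  # user_id -> timestamps of spent tokens

    def try_rate(self, user_id, now=None):
        now = now or datetime.utcnow()
        window_start = now - self.period
        recent = [t for t in self.spent[user_id] if t > window_start]
        if len(recent) >= self.tokens_per_period:
            self.spent[user_id] = recent
            return False  # budget exhausted for this period
        recent.append(now)
        self.spent[user_id] = recent
        return True

budget = RatingBudget()
print(all(budget.try_rate("user-123") for _ in range(5)))  # True: five ratings allowed
print(budget.try_rate("user-123"))                         # False: the sixth is rejected
```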

Indeed, we seem to do OK on the authentication front with electing politicians through secret ballots. The tricky part here is that some implicit disclosure is actually needed to make trust propagate through a social network.

Should we put our trust in any website? The truth is that all websites have filters, anonymous content writers, and ‘paid’ contributors. Even the ‘haters’ usually exaggerate the bad customer service or fraud, depending on how much time they have to formulate a complaint. Craigslist is a top ten website in the country, and its community is one of the most fraudulent on the web. Good luck trying to build a trusted network when human nature is involved.

Carl, I understand where you’re coming from. But let me ask you this: do you trust weather sites? I’m not saying they’re 100% reliable, but I suspect you make real decisions based on forecasts. I suspect the same holds for other online information sources. Subjective content is obviously a lot harder to trust, even if it’s presented in good faith, but I don’t think it’s black and white. If we don’t trust anything, we cut ourselves off from useful information. As with all things, there’s a precision / recall trade-off.

[…] reputation space. My curiosity should be no surprise to folks who have read my recent posts about distributed trust networks and solicited reviews. Anyway, I decided to go straight to the source and persuaded Unvarnished CEO […]

Hi Daniel,

The concept of distributed trust is very interesting. Most people have more than one social media account, and they sometimes create a different profile on each.

It would be useful to have a holistic view of how many ‘trust’ points a person accumulates across these different sites.

This kind of system could benefit both the profile owner and the reviewers who want to evaluate that person’s credibility before engaging in a real relationship with them.

I look forward to your next post.

People may have lots of profiles, but I don’t know how many of those are associated with trust scores. I think that the social networking sites started from a premise that you could trust your friends. Now that “friends” has been diluted as a concept, we need to rethink our approach to inferring trust from the social graph.

Comments are closed.