The Noisy Channel

Life, the Universe, and SEO Revisited

February 26th, 2011 · 24 Comments · General

A couple of years ago, I wrote a post entitled “Life, the Universe, and SEO” in which I considered Google’s relationship with the search engine optimization (SEO) industry. Specifically, I compared it to the relationship that Deep Thought, the computer in Douglas Adams’s Hitchhiker’s Guide to the Galaxy, has with the Amalgamated Union of Philosophers, Sages, Luminaries and Other Thinking Persons.

Interestingly, both SEO and union protests have been front-page news of late. I’ll focus on the former.

Three recent incidents brought mainstream attention to the SEO industry:

  • Two weeks ago, Google head of web spam Matt Cutts told the New York Times that Google was engaging in a “corrective action” that penalized retailer J. C. Penney’s search results because the company had engaged in SEO practices that violated Google’s guidelines. For months before the action (which included the holiday season), J. C. Penney was performing exceptionally well in a broad collection of Google searches, including such queries as [dresses], [bedding], [area rugs], [skinny jeans], [home decor], [comforter sets], [furniture], [tablecloths], and [grommet top curtains]. As I write this blog post, I do not see results from jcpenney.com on the first result page for any of these search queries.
  • This past Thursday, online retailer Overstock.com reported to the Wall Street Journal that Google was penalizing them because of Overstock’s now discontinued practice of rewarding students and faculty with discounts in exchange for linking to Overstock pages from their university web pages. Before the penalty, these links were helping Overstock show up at the top of result sets for queries like [bunk beds] and [gift baskets]. As I write this blog post, I do not see results from overstock.com on the first result page for either of these search queries.
  • That same day, Google announced, via an official blog post by Amit Singhal (Google’s head of core ranking) and Matt Cutts, a change that, according to their analysis, noticeably impacts 11.8% of Google search queries. In their words: “This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

Of course, Google is always working to improve search quality and stay at least one step ahead of those who attempt to reverse-engineer and game its ranking of results. But it’s quite unusual to see so much public discussion of ranking changes in such a short time period.

Granted, there is a growing chorus in the blogosphere bemoaning the decline of Google’s search quality. Much of it has focused on the “content farms” that seem to be the target of Google’s latest update. Perhaps Google’s new public assertiveness is a reaction to what it sees as unfair press. Indeed, Google’s recent public spat with Bing would be consistent with a more assertive PR stance.

But what I find most encouraging is Google’s recent release of a Chrome browser extension that allows users to create personal site blocklists that are reported to Google. Some may see this as a reincarnation of SearchWiki, an ill-conceived and short-lived feature that allowed searchers to annotate and re-order results. But filtering out entire sites for all searches offers users a much greater return on investment than demoting individual results for specific searches.

Of course, I’d love to see user control taken much further. And I wonder if efforts like personal blocklists are the beginning of Amit offering me a more positive answer to the question I asked him back in 2008 about relevance approaches that relied on transparent design rather than obscurity.

I’m a realist: I recognize that many site owners are competing for users’ attention, that most users are lazy, and that Google wants to optimize search quality subject to these constraints. I also don’t think that anyone today threatens Google with the promise of better search quality (and yes, I’ve tried Blekko).

Perhaps the day is in sight when human-computer information retrieval (HCIR) offers a better alternative to the organization of web search results than the black-box ranking that fuels the SEO industry. But I’ve been waiting for that long enough not to hold my breath. Instead, I’m encouraged to see a growing recognition that today’s approaches are an endless game of Whac-A-Mole, and I’m delighted that at least one of the improvements on the table takes a realistic approach to putting more power in the hands of users.

24 responses so far ↓

  • 1 Dmitry Shaporenkov // Feb 26, 2011 at 3:58 pm

    Don’t you think, Daniel, that the ranking problem will always be there, no matter if it’s HCIR or the classic boring “ten blue links” approach to search? After all, in most situations you still have multiple results answering the query, even after filtering has been applied…

  • 2 Daniel Tunkelang // Feb 26, 2011 at 4:05 pm

    Not necessarily. For example, if you and I prefer different retailers that overlap heavily in their inventory, then why not just show each of us our preferred retailer? Better yet, make it easy for the user to establish such preferences and communicate them to the search engine.

    Not all situations are so clear cut, but many of the SEO battles are over commodity content. User control strikes me as a lot better than the current approach of competing for a mostly global rank.

  • 3 Dmitry Shaporenkov // Feb 26, 2011 at 4:12 pm

    Even in your shopping example it still makes sense to show all retailers to both of us, just rank them differently. It thus actually sounds more like a personalized ranking. And I agree that if ranking were really personalized, SEOs and others would have fewer incentives to game it, for they wouldn’t know how to do that for every user.

  • 4 Daniel Tunkelang // Feb 26, 2011 at 4:29 pm

    Not at all! For example, I buy almost all of my jeans from Old Navy. I also prefer men’s jeans in my size. If I search for jeans, I’m likely to want http://www.oldnavy.com/products/mens-jeans.jsp but I might also be interested in a few other sites. But I hardly want to see all retailers who sell jeans — I’ve already established my preferences.

    Perhaps your argument holds when I enter a very specific product, e.g., the name of a book. There it makes sense to have personalized ranking based on price, shipping time, my retailer preferences, etc.

    But I think we agree on the essential point, which is that a highly user-centric approach, regardless of how much it uses ranking, strongly disincents gaming and thus overturns the SEO dynamic.

  • 5 Greg Lindahl // Feb 26, 2011 at 7:44 pm

    Daniel, you could always create a slashtag at blekko containing your favorite retail websites.

  • 6 Daniel Tunkelang // Feb 26, 2011 at 8:11 pm

    Point taken: this is a nice use case for slashtags, or for the site: tag on Google or Bing.

    But, like other lazy users, I’m unlikely to use slashtags on a regular basis. Still, I think it’s reasonable for me to expect the search engine to guess the general class of query intent (e.g., retail transaction) and then use my custom settings for that query class.

    I’m surprised no one’s tried to do this in browser extensions. Or perhaps someone has?

  • 7 Don Turnbull // Feb 26, 2011 at 8:31 pm

    I’d like to expand your discussion by dividing up the kinds of searches users might be doing, just to give us more to be specific about: shopping vs. informational searching and their possible effects on habit and perhaps on personalization as well.

    1) Shopping – to oversimplify: where a google ad or top-ranking result might have a much higher chance of being selected, with an associated payout for google.

    a) I like both examples you give here, but would it be more practical to assume that a rational (non-habitually going directly to Google for a search) user behavior would be “habitually personalized” to just go directly to OldNavy.com to buy jeans or to Amazon.com to buy a book? If that is the case, it’s sensible to not care too much about ranking, unless you’re looking for another vendor, a sale or something else special or atypical in your shopping behavior.

    b) Maybe it *is* worthwhile to worry about skewed result listings: if a search is new for the user, the web site they select and visit might become their (new) preferred vendor, thus enacting a “lifetime” of preferences. That would be advertising gold with a massive lifetime ROI for that paid ad or SEO optimization.

    2) Informational – where a user may be searching for a fact, or at the beginning of an information seeking session.

    a) Would personalization here be less important in helping to combat SEO if it is a topic that the user has no history searching for and has little commercial value for advertisers? (e.g. “Abraham Lincoln’s birthday”)

    b) A common concern I’ve had since 1998 or so is that personalization may lead to a lack of serendipity or worse yet, a loss of result (topic) diversity in SERP items, especially on the first page. I know many have discussed and researched this in the past and Google and others have made significant efforts to avoid this, but surely fighting SEO is a wild-card in this goal.

    Certainly for #2, HCIR could yield significant increased user satisfaction. How about #1 extended to your further comments?

    (Just a few thoughts, quickly typed.)

  • 8 Daniel Tunkelang // Feb 26, 2011 at 5:57 pm

    Don, thanks for chiming in!

    I’ve often wondered why browsers don’t try to short-circuit search for queries with strong intent signals — especially when that intent is commercial. For example, I’d seriously consider a simple extension that automatically routed my searches to Amazon when my queries seem to show shopping intent and routed my searches to Google otherwise. I’d think Amazon would be incented to create and promote such an extension for all browsers. Of course, there are other intents too — and more places to shop than Amazon — but you get the idea. And I’m sure folks developing apps get the idea too — perhaps this is where the iOS search experience will go someday.

    I can’t imagine that search engines want this to happen, since it would bypass them (and their ads) for a large (and valuable) class of queries. But it seems like a good idea for users, who could still do long-tail searches on general search engines when they had to.

    would it be more practical to assume that a rational (non-habitually going directly to Google for a search) user behavior would be “habitually personalized” to just go directly to OldNavy.com to buy jeans or to Amazon.com to buy a book?

    I think that users are rational but lazy. I do almost all of my searches on Google through the Chrome Omnibox and rely on ranking as a form of source selection. It’s one less step than first entering the source, and it works a lot of the time.

    I don’t think I’m alone. According to Alexa, Amazon gets over 18% of its traffic from search engines, and it looks like only about 10% of that traffic is people searching for [amazon] or variants.

    You raise a good point about the dangers of users locking in prematurely to a walled garden. Still, as a user, I’d like that choice. Indeed, I like Blekko’s slashtag feature, though I’d want it to be more ergonomic — and implemented as a Google feature. Yes, I can understand why Blekko isn’t interested in that approach.

    Finally, I’m not too worried about SEO in areas that don’t have commercial value — since SEO players aren’t worried about those areas either. Quality for non-commercial informational queries isn’t perfect, but I don’t think SEO is the problem. Indeed, that is where HCIR might help. But it’s hard to get anyone to invest in improving a search experience that is so hard to monetize.
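    [Editor’s note: the query-routing extension imagined a few paragraphs up can be sketched with a naive keyword heuristic. This is a hypothetical illustration, not any real extension’s code; the function name, keyword list, and signal words are all assumptions made for the sake of the example.]

```python
# Hypothetical sketch: classify a query's apparent intent with a naive
# keyword heuristic, then route it to a shopping site or a general
# search engine. The signal-word list is purely illustrative; a real
# extension would need a far richer intent model.
from urllib.parse import quote_plus

SHOPPING_TERMS = {"buy", "price", "cheap", "deal", "jeans", "bedding"}

def route_query(query: str) -> str:
    """Return a search URL: Amazon for apparent shopping intent, Google otherwise."""
    words = set(query.lower().split())
    if words & SHOPPING_TERMS:
        return "https://www.amazon.com/s?k=" + quote_plus(query)
    return "https://www.google.com/search?q=" + quote_plus(query)

print(route_query("buy skinny jeans"))          # shopping signal: routes to Amazon
print(route_query("abraham lincoln birthday"))  # no signal: routes to Google
```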

  • 9 Greg Lindahl // Feb 26, 2011 at 6:56 pm

    blekko has the beginning of a feature that tries to guess your intent: try searching health-related topics like ‘cure for headaches’ and notice that we auto-fire the /health slashtag.

  • 10 Daniel Tunkelang // Feb 26, 2011 at 7:14 pm

    Greg, that is a cool feature. Other search engines guess intent, but it’s really nice to make the search engine’s process transparent to the user.

    The next step (if you buy my argument above) is to allow users to link intent (e.g., looking for health-related information) to a particular search strategy (e.g., only search scientology.org and theonion.com). Perhaps you can already do some of this by letting the user customize the default auto-fired /blekko/health slashtag with their own version of /health.

  • 11 Carl Eklof // Feb 27, 2011 at 2:11 pm

    Hi Daniel. Aren’t blocklists just another tool like link-farms that SEO engineers will use to game the system to out-rank their sites over competing sites?

    It seems at least as easy to game blocklists as site links. It’s the same logic: you’d create fake users instead of fake sites.

    It’s a subtle difference that SEO engineers would be driving down the rank of their competition, instead of trying to improve their own score. At the micro-level, these are the same thing, since the only thing that matters is the relative score. There may be some interesting effects at the macro level, since the SEO engineers would be targeting specific sites to attack.

    I’m not sure how the dollars-for-SEO vs. strength-in-user-numbers balance would play out. I doubt most users would notice the blocklist functionality, but I’m sure all SEO engineers are developing blocklist tools now.

    Seems like the whack-a-mole game continues.

  • 12 Daniel Tunkelang // Feb 27, 2011 at 2:41 pm

    The blocklists are personal, even if Google is looking at them. From Google’s recent blog post:

    It’s worth noting that this update does not rely on the feedback we’ve received from the Personal Blocklist Chrome extension, which we launched last week. However, we did compare the Blocklist data we gathered with the sites identified by our algorithm, and we were very pleased that the preferences our users expressed by using the extension are well represented. If you take the top several dozen or so most-blocked domains from the Chrome extension, then this algorithmic change addresses 84% of them, which is strong independent confirmation of the user benefits.

  • 13 Don Turnbull // Feb 27, 2011 at 10:23 pm

    Greg,

    I saw that blekko would do the recognition of intent and think that’s a great feature too.

    Daniel,

    I too am rational and lazy but my omnibar search experience is augmented by using bookmarks with keywords and %s input strings to automatically search, say Amazon from the omnibar.
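    [Editor’s note: for readers unfamiliar with the trick Don describes, Chrome lets you define a custom search engine with a keyword and a URL containing %s, which the browser replaces with whatever you type after the keyword in the omnibox. A hypothetical entry might look like this — the keyword and URL are illustrative:]

```
Name:    Amazon
Keyword: az
URL:     https://www.amazon.com/s?k=%s
```

    Typing “az skinny jeans” in the omnibox then searches Amazon directly, with no intermediate search engine.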

  • 14 Tom Aikins // Feb 28, 2011 at 1:32 am

    What’s most interesting is that Google now appears to be feeling the heat from competitors and bad press alike and is going to start doing something about it. It’s great news for small SEOs like me.

  • 15 seo marketing // Feb 28, 2011 at 8:00 am

    I think that you’re absolutely right that Google is always working to improve search quality and stay at least one step ahead of those who attempt to reverse-engineer and game its ranking of results. But it’s quite unusual to see so much public discussion of ranking changes in such a short time period. Thanks buddy!

  • 16 Marjory // Feb 28, 2011 at 10:53 am

    I think that Google (along with many other providers of information results) is too stubborn about seeking a completely technical solution. Why does the ranking process have to completely lack a conscious human element? (I say conscious because the whole linking factor implies some kind of sub-conscious human collective element rather than a systematic one.)

  • 17 Keith Brown // Feb 28, 2011 at 12:26 pm

    I still say all things considered Google has done well at keeping results as relevant as possible. Not saying they are perfect, but given the fact that the entire world is trying to game them, I usually cut them a proportional amount of slack. If a smaller SE (blekko) does a better job at handling queries, I’ll chalk that up (first) to the fact that there is very little monetary value in ranking well in their SERPs. Hit them with 100x daily search volume for highly monetized terms and let’s see how they hold up. Not saying they won’t, but it’s apples and oranges at this point. If the entire world began using only blekko for searches on viagra, watch how fast the quality of that search on Blekko drops. Anything is possible with enough money involved, yes even gaming the beloved Blekko.

  • 18 Daniel Tunkelang // Feb 28, 2011 at 8:32 pm

    Don: perhaps I shouldn’t assume that everyone else is as lazy as I am!

    Tom: Google certainly seems to be making a point of reacting publicly and sharply. I’m not sure how that helps the SEO industry in general or you in particular, but wish you the best of luck in these interesting times.

    seo marketing: nice comment spam. Akismet caught it, but I decided to let it through since you’re trying so hard.

    Marjory: the ranking process does have a conscious human element — the code is written by real people, not to mention that both evaluation and ranking signals reflect input from people. I’m not sure what you’re suggesting as an alternative — that Google hire people to come up with the rankings for all the queries its users make?

    Keith: I agree. As I’ve said elsewhere, I still think Google is the best game in town.

  • 19 Greg Lindahl // Feb 28, 2011 at 10:33 pm

    Keith, take a look at searches like

    cure for headaches

    on blekko. We auto-fire the /health slashtag, and you’ll note that we do not let any non-employees edit the highly valuable /health slashtag… doesn’t matter how many people use blekko, the number of high quality health websites on the internet is small and will remain small.

    A search for plain ‘viagra’ fires /health. Where’s that spam going to come from?

  • 20 Weekly Search & Social News: 03/01/2011 | Search Engine Journal // Mar 1, 2011 at 8:04 am

    [...] Life, the Universe, and SEO Revisited – Noisy Channel [...]

  • 21 searchenginemarketingvox » Blog Archive » Weekly Search & Social News: 03/01/2011 // Mar 1, 2011 at 10:07 am

    [...] Life, the Universe, and SEO Revisited – Noisy Channel [...]

  • 22 Tracy Rexroad // Mar 1, 2011 at 4:59 pm

    I am new at trying to understand what all of you are talking about. Can someone please explain how to get better rankings on SEO and Google? Just asking for help. Thank you.

  • 23 Daniel Tunkelang // Mar 1, 2011 at 5:15 pm

    Tracy: I recommend you start by reading the following blogs:

    http://www.seomoz.org/blog

    http://mattcutts.com/blog/

  • 24 Tracy Rexroad // Mar 3, 2011 at 6:21 pm

    Daniel: Thank you for your advice.
