A new graph has been making its way around social media, part of an excellent white paper about long-tail CTR. The piece was shared by some very smart people in the industry, people I have a lot of respect for, but this kind of report makes me cringe because its implications are so often misunderstood. Here’s the graph:
As is true with any data, there’s plenty of room for misinterpretation. I think many people get the wrong takeaway from this graph. Let me explain:
The wrong takeaway: “We need to get our site to the top of the search results, so we can get that click-through rate.”
That’s not what this graph means: it’s descriptive, not prophetic. You do want to get to the top of the search results, but a graph like this (or the similar ones from Optify or Enquiro) may reflect types of search behavior that won’t actually benefit your site. I’m thinking of navigational searches and hyper-specific searches.
Navigational searches are the times when Google takes the place of bookmarks: we know the site we want to go to, but we have forgotten the specifics of the URL. I’ve personally looked over the shoulder of people who do this for everything, searching for “Google Docs” or “ESPN.” To drive this home, realize that every month there are 3 billion (with a “B”) Google searches for just the term “Facebook.” Another 124 million hapless searchers each month type “facebook.com” as a search term. There are even 30 million searches per month for “google.com,” typed into Google search itself. In all of these cases, the first search result is always the right one.
Hyper-specific searches are just what they sound like, and they’re the cousin of navigational searches. If I search for “HP Photosmart 4280 driver,” it’s clear I’m after one specific thing, and the first result is almost certainly it. In a case like this, the first match likely gets 90% of the clicks, and the other results are useless to me.
The graph is still right, but the hazard is assuming that all search behavior is the same. These click-through rates make a lot more sense for generic search terms (“sports news”) than for something specific the searcher has in mind (“ron artest hair infographic”).
The right takeaway: “We need to make the content on our site more relevant, so Google will reward it with higher placement.”
With Google’s algorithm changes over the last year, most notably its Farmer/Panda releases, a lot of the gaming of SERPs is gone. And that’s a really good thing.
In the good ol’ days of SEO (and this was white-hat SEO, too), human readability was removed from pages in order to make them more search engine friendly. A site title of “Doctors in Phoenix” was tweaked to “Doctors Phoenix” to more closely match a high-volume search term. Individual pages were set up to capture specific search terms (“pediatricians phoenix,” “dermatologists phoenix”), each comprising a few paragraphs of keyword-rich text and providing absolutely nothing useful to anyone who landed there. The good news is that Google increasingly recognizes these tactics and penalizes the sites that use them.
Making content genuinely relevant sounds like an insurmountable task, but Google is legitimately rewarding real, human-relevant sites more than ever before. Our focus as marketers and business owners has to shift away from the nit-picky details and loopholes of the Google algorithm and back to people, to how real human beings consume content.
Ultimately, it would be just as fallacious to take this concept too far and assume that search engine rankings don’t matter—they matter immensely, and improving rank for a key search term can be an enormous boon to a company’s bottom line. The point, rather, is that our approach has changed. We should focus first on how humans will consume our site, and let Google reward us for the relevancy of our content—rather than focusing on Google itself. Once we do this successfully, we’ll get those terrific click-through rates.
And I’ve got a graph that can tell you all about it.