Soft Ranking Signals Will Dominate the Near Future of Google Search

When you look at the most recent changes to Google’s ranking algorithm for its organic search, you will notice that it’s not just about the algo anymore. In 2011 it became obvious that Google takes more and more usage signals into account when ranking websites. The recently leaked Google quality rater guidelines prove that as well. The 125-page PDF document frequently mentions “usefulness”, in other words usability, as a major factor in determining whether a site is spam or not.

In case you didn’t know yet: for years Google has employed so-called quality raters, that is, real people who view and rate search results.

This year, with the so-called Panda update, the quality raters were asked to identify spam or low-quality sites beforehand, and based on their feedback the algorithm was “improved” or, in most cases, the rankings of particular websites were lowered.

In its recent PR campaign to convince users that Google cares about privacy, Google declares on its “Good to Know” microsite that

“by analyzing the search logs of millions of users in aggregate, we can continually improve our search algorithm”.

What does this mean from the SEO perspective? Google looks at the actual searches people make and whether they seem satisfied with the results. It checks whether users actually click the results they see, and then whether searchers stay on the page they clicked or instead return to the Google results page to try other results or searches. When a thousand people ignore a result at #1, or return immediately to Google after clicking it, it might end up lower in the search results next time.
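To make this concrete, here is a minimal sketch of how such a “pogo-sticking” signal could be approximated from click logs. It is purely illustrative: the log format, the field names and the ten-second threshold are my assumptions, not anything Google has published.

```python
# Illustrative sketch only; Google's real pipelines are not public.
# Each record is a hypothetical (query, clicked_url, seconds_until_return) tuple,
# where seconds_until_return is None if the searcher never came back to the results page.
click_log = [
    ("blue widgets", "http://example.com/widgets", 4),     # quick return: likely unsatisfied
    ("blue widgets", "http://example.com/widgets", None),  # searcher stayed on the page
    ("blue widgets", "http://example.org/guide", None),
]

def pogo_stick_rate(log, url, threshold_seconds=10):
    """Share of clicks on `url` that bounced back to the results page within the threshold."""
    clicks = [seconds for _, clicked, seconds in log if clicked == url]
    if not clicks:
        return 0.0
    quick_returns = [s for s in clicks if s is not None and s < threshold_seconds]
    return len(quick_returns) / len(clicks)

print(pogo_stick_rate(click_log, "http://example.com/widgets"))  # 0.5
```

A low rate suggests searchers found what they wanted; a high rate for a top result hints that it does not deserve its position.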

These are just some of the ways Google gets data it can use to derive quality signals from pages: the Google Chrome browser sends usage data to Google all the time, even when you don’t click anything.

Google admitted earlier this year that it uses Google Toolbar data to improve search quality. On top of that,

  • Google Analytics
  • Google Webmaster Tools
  • DoubleClick tracking cookies
  • the search history of logged in users
  • Google cookies
  • Google +1 buttons

and all kinds of other, smaller data sources give the search giant a myriad of signals to harvest.

On the other hand, links, the old king of ranking signals, have in recent years become less and less accurate and important. People trading and buying links have made web links less reliable as a measure of website authority. Actual users link less and less and use social sites like Facebook, Twitter or LinkedIn instead. Google can’t access these links (likes or tweets) directly.

So it’s obvious that in the near future soft ranking signals based on actual usage will dominate the Google search algorithm.

This data is difficult to manipulate, and it does not depend on webmasters but on the search users themselves. You can buy links, but you can’t buy lower bounce rates or higher click-through rates. Of course, you could trap users on your site and force them to stay longer or perform actions that prevent them from bouncing, but I’m pretty sure you won’t get away with such a tactic for long.

 

How can you optimize for soft ranking signals?

Well, you need to extend the scope of your website optimization to techniques that were already part of it in the past, before SEO became more and more focused on hard, algorithm-based ranking factors like the number, authority, position or anchor text of links. Long before Google announced that it considers website load speed a ranking factor, website optimization was about making files as small as possible to alleviate the drawbacks of the “World Wide Wait”, as the early modem-driven Internet was referred to.
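As a quick illustration of that old “keep it small” discipline, the following sketch checks how heavy a page is over the wire and whether the server compresses it. The URL is a placeholder, and the script is only a rough diagnostic, not a description of how Google measures speed.

```python
import gzip
import urllib.request

# Rough diagnostic only: how many kilobytes does a page cost over the wire,
# and does the server compress it? The URL below is a placeholder.
url = "http://example.com/"
request = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})

with urllib.request.urlopen(request) as response:
    body = response.read()
    encoding = response.headers.get("Content-Encoding", "none")

print(f"{url}: {len(body) / 1024:.1f} KB over the wire, Content-Encoding: {encoding}")
if encoding == "none":
    # Uncompressed HTML is an easy win: gzip usually shrinks text dramatically.
    print(f"gzipped it would be roughly {len(gzip.compress(body)) / 1024:.1f} KB")
```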

You can learn even more from the early Web.

There were no heavy animations or complex scripts. Likewise, users, and thus Google, can’t stand obstacles in their way when they want to reach a goal on a page. They want the information they seek quickly. If they can’t get it, or even see it, due to excessive clutter, you lose them. The big difference is that Google can take these factors into account by now. Some of them don’t even need actual people, a.k.a. quality raters, to look at your site. Some metrics, like the ads-to-content ratio, one of the more important factors after Panda, can be measured without human interference, especially as the ads are served by Google as well in many cases.
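For illustration only, here is a crude way to approximate an ads-to-content ratio for a saved HTML page. The ad hostnames, the element heuristics and the file name page.html are my assumptions; Google’s actual Panda metrics are not public.

```python
from html.parser import HTMLParser

# Crude approximation: count ad-like iframes/scripts against the amount of page text.
AD_HINTS = ("doubleclick", "googlesyndication", "adsbygoogle")

class AdRatioParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.ad_elements = 0
        self.text_chars = 0

    def handle_starttag(self, tag, attrs):
        # Flag elements whose attributes point at well-known ad-serving domains.
        blob = " ".join(value or "" for _, value in attrs).lower()
        if tag in ("iframe", "script", "ins") and any(hint in blob for hint in AD_HINTS):
            self.ad_elements += 1

    def handle_data(self, data):
        # Very rough: counts all text nodes, including script contents.
        self.text_chars += len(data.strip())

parser = AdRatioParser()
parser.feed(open("page.html", encoding="utf-8").read())
print(f"{parser.ad_elements} ad-like elements per {parser.text_chars} characters of text")
```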

On the other hand, many of the new major soft signals are really soft and can’t be computed by robots alone, so Google will extend the amount of user testing. You might assume that they can’t test all the billions of sites out there, but after the Panda update we have witnessed how many leading sites in specific niches have obviously been reviewed and reranked accordingly. Some sites that had been mistakenly downranked have been manually reviewed again and reinstated.

So while your site will most probably be spidered by robots again, it might also get manually visited by someone who will look at it and make it rank better or worse after flagging it as either useful or spam. You’d better approach SEO with this perspective in mind. Just imagine a friendly but clueless Google intern visiting you. Make sure s/he understands what your site is about and how it helps to accomplish the task it’s optimized for.

Make your site

  • fast
  • light-weight
  • clean
  • to the point
  • readable
  • enticing
  • in-depth

to keep on ranking high. Google just mimics real user behavior.

 

CC image by cobalt123.
