Google’s new privacy policy confirms CTR use in algorithm

Scheduled to come into effect on March 1, Google's much-publicised new Privacy Policy and terms, which the company says will deliver better search results and ads and improve the overall user experience, have come under scrutiny from the European Union's (EU) data protection authorities. They have asked for the rollout to be delayed so they can fully verify that the terms do not breach EU data protection law.

Google explains the main change, which is for users with Google Accounts, as follows:

"Our new Privacy Policy makes clear that, if you're signed in, we may combine information you've provided from one service with information from other services. In short, we'll treat you as a single user across all our products, which will mean a simpler, more intuitive Google experience."

From an SEO perspective there's nothing new here, although it may well be the most transparent Google has ever been about using CTR (click-through rate) data in its natural search algorithm. This is something we've suspected since Google was granted the rank adjusted content items patent, which suggested click patterns could be used in various ways to inform rankings.

In the privacy policy, Google says:

"To bring you the most relevant results for each search query, we look at the usage patterns of millions of people using Google every day.  It is by analyzing these search patterns via our logs data that our engineers are able to improve the search algorithms that determine the order in which our search results appear. If our engineers can see that people are consistently clicking on the top result for any given query, they know they are doing something right. If people are hitting "next page" or typing in another query, they know they're not delivering the results that people are looking for, and can then take action to try and improve the search algorithms."

This serves as a reminder, if one were needed, of the importance of the following (in this order, I'd suggest):

- Using microformats to create "rich snippets" that enhance your search engine results page (SERP) listing with additional information such as product listings, comments, post counts, ratings and reviews

- Optimising your title and meta description so that they are compelling and readable, not just keyword-rich

- Having a clean, concise URL structure so that users can see from the display URL that the page they are going to is the right one
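The title and meta description point above is easy to sanity-check programmatically. As a minimal sketch (the character limits are commonly cited rules of thumb, not official Google figures, and real SERP truncation is pixel-based), a small stdlib-only parser can flag listings likely to be cut off:

```python
from html.parser import HTMLParser

# Rough, commonly cited display limits for SERP snippets
# (assumptions for illustration only; actual truncation is pixel-based).
TITLE_MAX = 65
DESC_MAX = 155

class SnippetAudit(HTMLParser):
    """Collect the <title> text and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Return a list of snippet problems found in the page markup."""
    parser = SnippetAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing <title>")
    elif len(parser.title) > TITLE_MAX:
        issues.append("title may be truncated in the SERP")
    if not parser.description:
        issues.append("missing meta description")
    elif len(parser.description) > DESC_MAX:
        issues.append("meta description may be truncated")
    return issues

page = ('<html><head><title>Blue Widgets | Acme</title>'
        '<meta name="description" content="Hand-made blue widgets '
        'with free UK delivery."></head></html>')
print(audit(page))  # []
```

A check like this only catches length and presence problems, of course; whether the snippet is actually compelling to click still needs a human eye.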

Related to this, some of you may have read my blog post on Google ranking fluctuations a little while back. There's an interesting link between the cyclical rank fluctuations we saw in our experiment (what we termed "Algorithm Testing or User Testing Fluctuations") and Google using CTRs in its algorithm. In that post I wrote:

"The alternative scenario here is that Google is testing how well users respond to a particular page ranking in the search results by artificially improving its ranking for intermittent periods, looking at user signals like CTRs, bounce rates, engagement time and so on."

Google's transparency over using CTR adds a lot of credibility to this theory (not that knowing this is much use to anyone, since by the time you identify cyclical rank fluctuations, Google has probably already tested your CTR performance).
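To make the hypothesis concrete, here's a toy simulation of how such a trial might work. Everything here is assumed: the per-position baseline CTRs, the tolerance threshold and the decision rule are illustrative inventions, not anything Google has disclosed. The idea is simply that a page normally ranking lower is shown at a higher position for a while, and the boost sticks only if its observed CTR approaches what's expected at that position:

```python
import random

random.seed(42)  # deterministic run for the example

# Hypothetical baseline CTRs by ranking position (illustrative numbers
# only; real CTR curves vary by query and are not published by Google).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def trial_ctr(true_ctr, impressions):
    """Simulate clicks observed while a page is temporarily boosted."""
    clicks = sum(random.random() < true_ctr for _ in range(impressions))
    return clicks / impressions

def keep_boost(observed_ctr, boosted_position, tolerance=0.9):
    """Keep the higher ranking only if the page earns close to the
    CTR expected at the trial position (hypothetical decision rule)."""
    return observed_ctr >= EXPECTED_CTR[boosted_position] * tolerance

# A page normally at position 5 is trialled at position 2 for a period.
observed = trial_ctr(true_ctr=0.18, impressions=5000)
print(keep_boost(observed, boosted_position=2))
```

Under this sketch, a page that genuinely out-clicks the incumbent keeps its improved ranking, while one that falls short drops back, which would produce exactly the intermittent up-and-down pattern we observed.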


© Copyright 2018 Greenlight. All Rights Reserved.