The New Google Patent & The War on Link Spam
Google filed a new patent on 26th April entitled 'Document Scoring based on Document Inception date'. Authored primarily by Google's Head of Webspam, Matt Cutts, it is the latest in a line of patents aimed at protecting the methods Google employ, or plan to employ, to better rank websites in their search results.
The patent is important because it comes at a time when Google is finding it increasingly difficult to establish a robust way of dealing with the paid-for links phenomenon, which has cost Google some control over the integrity of their results in recent months. Essentially, Google declared war this month on websites that purchase links to directly improve their rankings in their search results. Buying links that point to a site raises that site's rankings because a major part of Google's algorithms treats links as 'votes' in favour of the site receiving them: the more links that point to your site, the better you'll rank.
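That 'votes' model can be sketched in a few lines. What follows is a textbook-style simplification of link-based scoring, not Google's actual algorithm; the function name, damping factor, and example graph are all illustrative assumptions:

```python
# A minimal, illustrative sketch of the "links as votes" idea.
# This is a textbook-style simplification, not Google's real algorithm.
def link_votes_score(links, damping=0.85, iterations=20):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page shares its score among the pages it "votes" for.
                share = damping * score[page] / len(targets)
                for t in targets:
                    new[t] += share
        score = new
    return score

# A site with more inbound links accrues a higher score.
graph = {"a": ["popular"], "b": ["popular"], "c": ["niche"],
         "popular": [], "niche": []}
scores = link_votes_score(graph)
assert scores["popular"] > scores["niche"]
```

Buying links simply adds more entries pointing at your site in a graph like this, which is exactly why the technique works until Google intervenes.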
Taken to the extreme, buying thousands of links will allow you to leap over your competitors and rank preferentially for whatever you like. But before rushing off to buy as many links as your closest link broker has to offer, you need to know that Google is winning the latest battle in that war. We've seen a number of major sites drop quite dramatically in Google's rankings over the last few weeks, with Google either manually penalising them for their link buying or heavily dampening the value of their bought links. There has also been an instance where a Google penalty meant a particular site was not ranking even for its own brand name, the ultimate nightmare for a marketer or brand guardian.
This war on links has been a long time coming. Danny Sullivan, a respected industry commentator, remarked last year that the Google ranking algorithms' reliance on links was a 'Pandora's Box': once opened, it could not be closed again if and when link engineering became more aggressive. At the time this was aimed more at the infamous 'link bombs', where people linked to official websites to make them rank for things they weren't really about. George Bush's biography on the White House site ranking for the words 'miserable failure' is the most famous example. Google put a stop to that by checking whether the target pages actually mentioned the words the links suggested they should rank for; if they didn't, the bomb was defused. Round 1 to Google.
However, Danny Sullivan's comments remained pertinent even after Matt Cutts and his team defused those 'bombs'. People began purchasing links to engineer rankings of their own, exploiting the same part of Google's algorithm, but this time the target pages did mention the search terms being promoted, so the on-page check that identified link bombs could not flag them. Google's response over the last few months has been to manually review search results in the Finance and Travel verticals, identify sites that are ranking through that technique, and manually drop their rankings. Google have also set up a method for people to report sites that are selling links, adding a proactive input into their process of manually reviewing suspect search results. Sites that were relying on paid-for links for their rankings saw significant drops. Round 2 to Google… kinda.
The problem here is that this may well represent a Pyrrhic victory for Google. Rather than being an algorithmic response to spamming, it is a human one, relying on Google employees to decide what represents excessive link buying and what does not, who should rank and who should not, and relying on webmasters to shop others for their link-buying activities. All this adds a subjective layer to Google's ranking process, and subjective, for me, isn't very impressive. It suggests that Google has no mechanism to automatically identify and deal with this problem. Whilst they have indirectly suggested otherwise in official blogs, link spam still dominates their search results, and their latest patent, which deals with how to value links, fails to convince that there is an algorithmic response waiting in the wings.
Furthermore, Google has been promoting use of the rel="nofollow" attribute (which stops a link from passing on 'link juice'), asking webmasters to apply it to any links they may have sold to other sites. Again, a human response to what should be the domain of mathematics… surely.
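For reference, the attribute is a one-word addition to an ordinary anchor tag (the URL here is a placeholder):

```html
<!-- A normal link: treated as an editorial 'vote' for the target site -->
<a href="http://example.com/">Example site</a>

<!-- A sold or untrusted link: rel="nofollow" asks Google not to count it -->
<a href="http://example.com/" rel="nofollow">Example site</a>
```

The catch, of course, is that the seller has to volunteer to add it, which is precisely why it is a human rather than a mathematical fix.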
Okay, the patent itself. Given the above issues, you would think that there is a need for Google to step up a gear, algorithmically, to deal with these ongoing challenges. The patent covers various factors to help determine the value of a particular link. These are:
a. The date of appearance/disappearance of a link
b. The age of new links
c. The level of authority a site doing the linking has
These factors would allow Google to view a site's inbound links as a trend line, looking at the rate of link accrual over time to determine the freshness or staleness of a document. Whilst this is useful, it is not an algorithmic solution to the paid-links problem. The approach rewards sites that display 'healthy' linking activity, presumably to allow good sites to rise in rankings more rapidly than average ones: the principle of reward that lets the 'cream' rise to the top. The suggestion, then, is that Google's next evolution will be based on a model of reward and punishment, whereby the algorithm rewards the activities it perceives as natural and human intervention punishes the activities Google employees perceive as manipulative.
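The trend-line idea can be illustrated with a toy calculation. This is purely a sketch of the concept, as the patent does not publish an actual formula; the 30-day window, the helper function, and the example dates are all my own assumptions:

```python
from datetime import date

# Illustrative only: the patent describes using link appearance dates as a
# trend line, but the real scoring formula is not public.
def link_accrual_rate(link_dates, window_days=30):
    """Count new inbound links per time window, earliest window first.
    link_dates: the dates on which inbound links first appeared."""
    if not link_dates:
        return []
    start = min(link_dates)
    counts = {}
    for d in link_dates:
        bucket = (d - start).days // window_days
        counts[bucket] = counts.get(bucket, 0) + 1
    return [counts.get(i, 0) for i in range(max(counts) + 1)]

# A steady accrual trend vs. a sudden burst (as bought links might produce).
steady = [date(2007, m, 1) for m in range(1, 7)]            # one link a month
burst = [date(2007, 1, 1)] + [date(2007, 6, d) for d in range(1, 6)]
print(link_accrual_rate(steady))   # counts spread across the windows
print(link_accrual_rate(burst))    # quiet for months, then a spike
```

A spike like the second profile is the kind of pattern the patent's date-based factors would make visible, whether or not Google then acts on it algorithmically.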
The implications of this are two-fold. Firstly, SEO practitioners need to evolve the way they view link building, as it has become, and will continue to become, increasingly difficult to run link building campaigns successfully and safely. There will be no short cuts; success will instead rely on the link builder's ability to plan and deploy a 'blended' link building approach, integrating multiple methods on an ongoing basis, including harnessing PR, social media, and any assets their clients may have. It will create a barrier to entry too, as the time and resources required to do this strategically, intelligently, and constantly will not be feasible for most site owners. Search results in the next 12 months will be dominated by sites that do this better than their competitors.
Secondly, websites wishing to rank on Page 1 of Google will need to take a long hard look at their online propositions. With Google policing the web, a site ranking on Page 1 should be able to justify its position through its link profile. Sites with USPs will invariably attract links, sites with online assets will be able to link bait, and sites giving something back to the wider community (such as research or tools) will attract links too. Simply having lots of links will no longer be enough: there will need to be a reason why sites have linked to you beyond their receipt of payment.