Greenlight's top 5 predictions for SEO this year

By Greenlight | 23 Jan 2014

1. INVESTING IN THE WRONG TYPE OF SEO

Adam Bunn, Director of SEO

My prediction this year is about companies investing effort in the wrong thing when it comes to SEO. Specifically, I predict far too much emphasis on schemas and not enough on authorship. After I've talked about that, I'm going to open the floor to some of the other smart folk here at Greenlight to talk about what they think will happen in 2014.

 

SCHEMAS

Schemas, at best, are what I'd call a "best practice" - the sort of checkbox exercise you tick off when building or optimising a site. They are easy to grasp and relatively easy to implement, and consequently there is a schema bandwagon rolling through town. Google has put a lot of effort into promoting them, co-founding schema.org and rather effectively establishing them as a part of SEO, to the point that I see a lot of "A-list" SEOs espousing schemas as the next big thing (in fact, it was one of their 2014 SEO predictions - that it would be essential to optimise your site with schemas - that prompted mine).

 

The problem is that, with the exception of a tiny few that have some proven CTR value, schemas have so far done absolutely nothing positive for ranks or traffic. It is questionable whether they can even be considered anything to do with SEO, and if I had to put money on it, I'd say the biggest impact of schemas is that Google gets to harvest a lot of information for free and reuse it in the Knowledge Graph (which is why it promotes schema use), and that sucks traffic away from brands!

 

But, because schemas are something solid to grab hold of in an otherwise quite ephemeral marketing discipline, I predict companies will continue to invest in their implementation. Fruitlessly.
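For what it's worth, the implementation side really is the easy part. As a purely illustrative sketch (the product details are invented, and Python is used here only to print the JSON-LD), a schema.org Product mark-up destined for a page's script tag might look like this:

```python
import json

# Illustrative only: a schema.org "Product" with a nested "Offer", expressed
# as JSON-LD. The printed output would be embedded in the page inside a
# <script type="application/ld+json"> element.
product_markup = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A made-up product used to illustrate schema mark-up.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "GBP",
    },
}

print(json.dumps(product_markup, indent=2))
```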

 

AUTHORSHIP

Conversely, Authorship will be under-invested in, largely because, outside of the technical set-up of an author in the Authorship programme, it's actually very difficult, even intimidating, to take advantage of. You have to develop a reputation for creating good content within a particular field, and have other people who have a history of being interested in, and/or also having expertise in, that field interact favourably with the content you produce.

 

Like schemas, the Authorship programme is promoted quite heavily by Google, but unlike schemas, that is because Google genuinely wants to use it as an algorithmic signal (as well as, of course, to drive more Google+ members and content).

 

If you have any doubt about that, you need only look at the sheer number of patents Google filed and was granted in 2013 to do with the identification and scoring of authors, topics and experts. 2014 will be the year of Authorship, but it's going to fly right by most companies.

 

2. HUMMINGBIRD FOR SPOTTING UNNATURAL LINKS

Graham Ridley, Lead SEO Consultant

The Google Penguin algorithm update has caused quite a stir since its release in April 2012, as it's the first regular update that looks at a website's backlink profile. The algorithm has struck fear into blackhat SEOs; however, some websites still appear to be immune to it. What do they have that others don't? A very diverse anchor text profile, that's what.

Judging by some websites' backlink profiles, spun contextual links within blog posts still appear to work, and although Google has been clamping down on the networks providing these links, is there anything else Google can do to identify these poor backlinks?

My prediction for 2014 is that the Google Hummingbird technology will be used to evaluate inline text links within spun blog posts. Hummingbird looks at the way a sentence is structured and at the words used in a Google search query. So why is Google not using this technology to evaluate contextual links, to see whether the words immediately before and after a link make sense or are as expected?

This can then be used in the fight against spam on networks that aren't as easy to find as ones on Blackhat World or Wicked Fire forums. This is what I hope to see implemented in 2014.
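To make the idea concrete, here is a toy sketch of the sort of check I have in mind - purely illustrative Python, not anything Google has published, and the tiny reference corpus is invented. It simply asks whether the words surrounding a link look like natural language, by checking how many of the sentence's word pairs ever occur in a corpus of ordinary sentences:

```python
import re
from collections import Counter

# Purely illustrative, not Google's algorithm: judge whether a sentence reads
# like natural language by checking how many of its word bigrams also occur
# in a small reference corpus of ordinary sentences (invented for the example).
REFERENCE_TEXT = (
    "you can read our full review of the new phone here "
    "click the link below to compare prices on winter coats "
    "we wrote a guide that explains how to choose running shoes"
)

def bigrams(words):
    return list(zip(words, words[1:]))

reference_counts = Counter(bigrams(REFERENCE_TEXT.split()))

def naturalness(sentence):
    """Fraction of the sentence's word bigrams that also appear in the reference corpus."""
    words = re.findall(r"[a-z']+", sentence.lower())
    pairs = bigrams(words)
    if not pairs:
        return 0.0
    return sum(1 for pair in pairs if reference_counts[pair] > 0) / len(pairs)

# A sentence lifted from natural copy scores high; a spun jumble scores low.
print(naturalness("You can read our full review of the new phone here"))
print(naturalness("payday loans casino review phone cheap here new best"))
```

In reality Google would be working with a vastly larger language model, but the principle - do the words either side of this link belong together? - is the same.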

 

3. MOBILE SEARCH, HUMMINGBIRD & PERFORMANCE METRICS

Matthew Hayford, Head of SEO

We all know that Hummingbird was introduced to help bring back more relevant results for longer-phrased search queries, including those conducted on the ever-growing number of smartphone and tablet devices.

Whilst there probably won't be any significant change to the way Hummingbird functions, the close affiliation the algorithm has with mobile search will be refined further, to the point where I can see mobile search queries once again operating separately from desktop searches.

Alongside Hummingbird, response times for websites will become even more crucial for mobile search results. Google has championed responsive design for a number of years, and Matt Cutts recently said that responsive design is the better way to go for SEO. This means that if you have an all-in-one website using responsive design, you had better make sure it's optimised for maximum performance, as we all know mobile SEO is significantly influenced by site speed.
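Response time is at least easy to keep an eye on. As a minimal sketch (assuming Python with the third-party requests library, and a placeholder URL standing in for your own landing pages), you could sample a page's server response time like this:

```python
import requests  # third-party library: pip install requests

# Illustrative only: time several fetches of one page and report the average.
# Note this measures the HTML response only, not the full render time a real
# mobile user experiences, so treat it as a rough early-warning number.
URL = "http://www.example.com/"  # placeholder: swap in your own landing page
SAMPLES = 5

timings = []
for _ in range(SAMPLES):
    response = requests.get(URL, timeout=10)
    timings.append(response.elapsed.total_seconds())

print("average server response time: %.3f seconds" % (sum(timings) / len(timings)))
```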

Performance, coupled with Hummingbird, means that mobile SEO may once again have to be treated separately from desktop SEO, and the similarity between desktop SERPs and mobile/tablet SERPs that we have come to rely on may start to break down.

 

4. THE YEAR OF CONTENT MEASUREMENT

Victoria Galloway, Lead Copywriter

Take a literary classic like Charles Dickens' Great Expectations and try to quantify not just how popular it is, but how it has resonated, permeated and filtered through multiple social histories. Tracking how popular it is might be considerably easier than tracking anything else; statistics surrounding sales and ebook downloads are relatively simple to get hold of, but how would you go about explaining the influence of Pip and Miss Havisham in the novel's wider context - numerically?

Explaining and rationalising online content faces exactly the same challenge. We can't just generate content anymore in the hope that it will simply do well. We need to ensure that what we're creating connects and prompts discourse because in 2014 the emphasis will be on generating quality conversations and tracking engagement.

Measurement lies at the heart of content marketing but ironically it's the hardest part of the process. Discussing the benefits of content marketing is easy, but justifying our efforts and quantifying them isn't. Whether subjectivity, taste and perception can ever truly be measured is yet to be proved.

Your target audience might click through to your content piece, but if they don't share it, like it or retweet it, then how do you know how they processed it? Even if they did share it, did they genuinely find the content useful? It's a bit like measuring the success of an advertising campaign where you can only really measure profit.

2014 will be about finding ways of effectively measuring content's success. Knowing this is crucial when designing Penguin-friendly content plans and justifying more creative content. We'll need to think smart with all the metrics we can gather - from likes, shares, EAV (Equivalent Advertising Value), tweets and sales - and the true content champions will weave these into useful, penetrating narratives that can predict what content should be produced, when, and for whom.
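As a hedged illustration of what weaving those metrics together might look like in practice, here is a small sketch of a weighted engagement score per content piece - the metric names, weights and figures are entirely invented for the example, and a real model would be calibrated against the outcomes you actually care about:

```python
# Illustrative only: combine raw engagement counts into a single weighted score
# so content pieces can be compared and ranked. Weights and data are invented.
WEIGHTS = {"shares": 3.0, "tweets": 2.0, "likes": 1.0, "sales": 10.0}

content_pieces = [
    {"title": "Guide to winter running", "shares": 40, "tweets": 25, "likes": 120, "sales": 3},
    {"title": "Q3 press release", "shares": 2, "tweets": 1, "likes": 10, "sales": 0},
]

def engagement_score(piece):
    """Weighted sum of the engagement metrics recorded against a piece of content."""
    return sum(weight * piece.get(metric, 0) for metric, weight in WEIGHTS.items())

for piece in sorted(content_pieces, key=engagement_score, reverse=True):
    print(piece["title"], engagement_score(piece))
```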

 

5. FULL USE OF PERSONAL DATA - RETURNING CUSTOMERS BECOME RANKING FACTOR

Britt Soeder, SEO Campaign Director

Google says in its Policies & Principles overview that it improves its services for all users by using the data shared with Google, for example through a Google Account. This ranges from your name and photo to your hardware and operating system, as well as how you use Google's services. The latter is the interesting part.

Amongst other things, Google knows:

- What brands or people you like and interact with

- What online shops you use

- What you like to read and what you dislike

Google officially says that it uses this data to "provide, maintain, protect and improve […] and develop" its products and services. This, of course, includes a better search experience.

So, since Google officially uses our data, what are the main criteria it looks at to determine how high a page should rank?

- Bounce rate - already used

- Time on site - already used

- G+ signals - already used

What do you do if you really like a site? You come back, don't you? You might not like sharing things on social networks, but you will come back to a site you appreciate. Google Chrome bookmarks are certainly noted, but not yet used to any great extent when it comes to rankings - and not all users want to use bookmarks, either because they suspect they are sharing important data with Google (they are!) or because they are simply not a sharing generation.

So in addition, could Google look at all returning traffic to a site, not only when bookmarks are used? Why would Google make the effort?

It would be logical, because a website with lots of shares and links - both of which can be influenced to some degree regardless of the site's quality - shouldn't be worth ranking if visitors never return. Surely, if it were any good and worth the ranking position, visitors would come back. And it would be a fantastic strategy for retargeting and thus increasing Paid Media spend - a nice side effect, to say the least.
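The arithmetic itself would be trivial for anyone sitting on the data. As a hedged sketch - assuming nothing more than a flat list of (visitor, site) visit records, which is invented here and is not anything Google actually exposes - a site's returning-visitor share could be computed like this:

```python
from collections import Counter, defaultdict

# Illustrative only: from a flat list of (visitor, site) visit records, work out
# what share of each site's visitors came back at least once. The data is invented.
visits = [
    ("alice", "example-shop.com"), ("alice", "example-shop.com"),
    ("bob", "example-shop.com"),
    ("carol", "spun-links-blog.com"), ("dave", "spun-links-blog.com"),
]

visits_per_site = defaultdict(Counter)
for visitor, site in visits:
    visits_per_site[site][visitor] += 1

for site, visitor_counts in visits_per_site.items():
    returning = sum(1 for count in visitor_counts.values() if count > 1)
    share = 100.0 * returning / len(visitor_counts)
    print("%s: %.0f%% of visitors returned" % (site, share))
```

A signal like that would reward sites people genuinely come back to, however few links or shares they attract.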