
Search Intent: Is the arms race for links finally over?


Despite mounting evidence, businesses of all sizes cling to an outdated understanding of search engines; in particular, there is a strongly held belief about how Google decides which sites to present in search results.

Over the past few years, Google has been placing more weight on machine learning than on simpler measures such as PageRank, yet a fuller appreciation of how this affects positions in search results, and the subsequent traffic flows, is not being explained by the industry, even when we know that what we are doing has dubious benefit at best. Our industry continues to mislead businesses by promising to ‘build links’ and seek out ‘high authority domains’, and generally wastes clients’ budgets by engaging in outdated practices that are highly unlikely to benefit the client in the long term. I am not referring to spammy automation (we have all moved on from those days); I am referring to continuing to ‘build’ links at all.

Are links really that important in 2013?

The most prevalent belief is that sites are ‘rewarded’ simply for having more links from high authority sites than their competitors. This belief rests in turn on the belief that links correlate with rank, and that rank can somehow be ‘cemented’ by building links. Consequently, agencies respond to competitive RFPs by agreeing to build links rather than challenging the client’s firmly held beliefs.

This post will examine one page of results for a query, and I’ll attempt to show how far Google has come from simply matching keywords within the query to on-page content, then using links as a measure of trust to determine rank…

For this discussion, I will use the three-term search ‘list pax man’ on Google.co.uk

SERPs ordering for ‘list pax man’ query

Notice that Google suggests ‘pax man’ may be a mistake: I am presented with ‘paxman’ as a single keyword suggestion.

Notice also that Google includes a link to Wikipedia, for a list of ‘Pac-Man’ video games – even though I did not type Pac-Man.

The first result is the most relevant to my query:

‘pax’ is a Unix command, ‘man’ refers to the manual pages and ‘list’ is a common option

Both second and third results relate to Jeremy Paxman.

The second result contains the word ‘list’ once on the page (Amazon wish list).

The third result does not contain the word ‘list’ at all.

The fourth result shows how hard Google is working to provide all the possible options for my initial query.

Now let’s look at the number of inbound links for each result… if links were an important factor in this results set, then it would follow that the #1 result would have a handsome quantity of inbound links.

The stats (from MajesticSEO) for each URL in the SERPs are:

#1: 21 referring domains, 36 backlinks

#2: 1 referring domain, 4 backlinks

#3: 723 referring domains, 5,789 backlinks

#4: 25 referring domains, 156 backlinks
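Treating these figures as data makes the dissociation easy to quantify. A minimal sketch in standard-library Python (using the referring-domain counts above; there are no tied values, so a naive ranking is safe) computes the Spearman rank correlation between SERP position and link popularity:

```python
# Spearman rank correlation between SERP position (1 = best) and
# referring-domain counts for the 'list pax man' results above.

def rank_desc(values):
    """Rank values in descending order (1 = largest). Assumes no ties."""
    ordered = sorted(values, reverse=True)
    return [ordered.index(v) + 1 for v in values]

positions = [1, 2, 3, 4]            # SERP order
ref_domains = [21, 1, 723, 25]      # MajesticSEO referring domains per result

domain_ranks = rank_desc(ref_domains)   # [3, 4, 1, 2]

n = len(positions)
d_squared = sum((p, r) == () or (p - r) ** 2
                for p, r in zip(positions, domain_ranks))
d_squared = sum((p - r) ** 2 for p, r in zip(positions, domain_ranks))
rho = 1 - (6 * d_squared) / (n * (n ** 2 - 1))

print(round(rho, 2))  # -0.6: more referring domains loosely predicts a WORSE position here
```

With only four data points this is anecdotal rather than statistical, but a negative coefficient is the opposite of what a links-determine-rank model would predict.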

OK, so what? Perhaps the ordering of the SERPs still has something to do with links, but the quantity and volume of links are less important than the relevance of the page content…

Let’s make it more obvious what we are searching for, by repeating the ‘pax’ keyword in our query:

SERPs for the query ‘list pax pax man’

Now Google understands. We have achieved ‘entity disambiguation’ – a search engine nirvana.

We don’t mean ‘Pac-Man’ and we don’t mean ‘Paxman’. The repetition of the keyword ‘pax’ has ensured a full page of results about the Unix command ‘pax’.

Notice that the previous ‘best result’ for the Unix command has gone from #1.

A URL from the Apple developer site takes its place.

In fact, the top results are now dominated by well-frequented sites such as Apple, SourceForge, IBM and Wikipedia. The domain ‘manpagez.com’ is not a competitor in this new, well-defined search space – which is odd, as it was the best possible match just one query ago!

Looking at links to pages:

#1: 6,994 referring domains, 91,196 backlinks

#2: 64 referring domains, 156 backlinks

#3: no links reported

#4: 28 referring domains, 154 backlinks

If we assume that links convey power and authority, and Google had a more ‘authoritative’ match for the original query, then why wasn’t the result from Apple presented in that first search test?

Look again at result #3 above, from IBM. There are NO backlinks reported to this page by MajesticSEO’s historic index.

The results above reflect the ‘importance’ of the results in relation to the query based on machine learning – not simply accumulations of links.

To be presented in SERPs, considerable effort must go into ensuring that the client’s site serves the needs of its audience.

People looking for man pages on the pax command are the sort of people who visit developer.apple.com. Once Google has clear signals that the query is for the Unix command, and not ‘Pac-Man’ or ‘Paxman’, the SERPs are populated with the domains that serve this audience best. Unfortunately for manpagez.com, it does not attract and retain attention sufficiently well to be returned for the second search.

I encourage you to try this for yourself, and further to demonstrate to clients the dissociation of links from position in SERPs. There is simply little to be gained by adding links to a site: the most important challenge is to give visitors what they want on the site and be clear about the content offered.

Growing an audience that uses the site regularly is more important than just adding links. Work on building advocates and the links will follow to support that effort. If you work on building links, then you are only increasing risk for the future.

Let me know what you think matters most by commenting below 😉



Google disavow tool for links released

At last, Google has released a disavow links tool. This will really help to clean up years of linkbuilding abuse that gave businesses unfair ranking advantages for a while, until Google Penguin was released.

The disavow tool is available in Google Webmaster Tools, and the news of the release was announced on the Google Webmaster Central blog.

Google disavow links tool

When selecting the ‘Disavow links’ option, you are given a warning page:

Disavow links warning

Proceeding with the option, and ignoring with some trepidation Google’s warning that this is an advanced feature, we get an upload lightbox:

Upload disavow links file

Couldn’t resist an image test on the way 😉

Upload disavowed links in google


The text file must be smaller than 2 MB, with one entry per line. Comments can be included in the file if prefixed with a #.

# Sitewide links impossible to remove
# Links still remain on these pages

A "domain:" prefix tells Google to disavow links from all pages on the named site, e.g. the "blognetwork.com" site.
Specific pages can be disavowed by listing their complete URLs.
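Putting these pieces together, a complete disavow file might look like the following (the domain is the one mentioned above; the URLs are hypothetical placeholders, as the original example entries were not preserved):

```
# Sitewide links impossible to remove
domain:blognetwork.com

# Links still remain on these pages
http://example-spam-site.com/paid-links.html
http://example-spam-site.com/link-directory/page2.html
```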

Only one disavow file per site is allowed and it is shared among site owners in Webmaster Tools.

If you want to update the disavow file, you must download the existing disavow file, modify it, and re-upload it.

Don’t expect overnight results: Google suggests that it may take up to two weeks for the submitted URLs to be re-crawled before the disavowed links have any impact. Also, a manual reconsideration request may still be necessary after allowing a period of time to pass following the disavow links file upload.

So this tool may not fix everything, but at least it helps with backlink cleanup following the Penguin algorithm release and updates.

This is a major step forward by Google in promoting best practice, and I hope the tool doesn’t break!

Rank improvement by changing URLs

Back in December 2011, almost one year ago, friendly URLs were introduced in order to improve CTR. The old parameterized URLs were indexed, but even when ranking highly, the CTR was far from impressive.

The switch to URLs with clear and appropriate keywords to provide a logical structure has helped encourage additional visitors to click, and there has been a corresponding increase in rank. Of course, whether this is a causal relationship is up for debate – but either way, the statistics are interesting…
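As an illustration of the kind of rewrite involved (the URLs, page title and mapping below are hypothetical, not the actual site’s scheme), a short sketch of turning a parameterized URL into a keyword-rich friendly one:

```python
import re

def slugify(title):
    """Turn a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics into hyphens
    return slug.strip("-")

# Hypothetical mapping from an old parameterized URL to a friendly one.
old_url = "/product?id=123"
title = "Blue Widget 2000"
new_url = "/products/" + slugify(title)

print(new_url)  # /products/blue-widget-2000
```

The keywords in the new path give both the searcher and the search engine a clear signal of what the page is about, which is exactly what the snippet in the SERPs displays.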

The following graph shows the steadily improving position in SERPs for the friendly URL since their introduction at the start of the year:

Rank improvement from friendly URLs

The chart shows a starting rank of around position 50 in Google back when the URLs were changed. As of last week, the URL is in the top 20 and has featured in 5th position regularly in recent weeks.

This positional change is not due to link building to the page:

No new backlinks

The backlink chart above suggests that no new links are being built in easily discoverable parts of the internet. The total number of backlinks reported has remained zero throughout the period of the rank improvement.

Similar improvement in rank is seen across other friendly URLs.

One explanation is that Google tests URLs for CTR. URLs exhibiting favourable metrics (e.g. a low bounce rate, or a long time on site before the visitor returns to search again) could be taken to indicate the results most valuable to searchers, and those results promoted to better positions. This promotion in search results takes time.
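As a toy sketch of that hypothesis (entirely speculative; the expected-CTR curve and thresholds below are invented for illustration, not anything Google has published), a re-ranker could compare a URL’s observed CTR with the CTR typically seen at its position:

```python
# Toy model of CTR-based re-ranking. All figures are invented for illustration.
# Rough expected CTR for each SERP position (a steeply decaying curve).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def ctr_signal(position, impressions, clicks):
    """Return 'promote', 'demote' or 'hold' based on observed vs expected CTR."""
    observed = clicks / impressions
    expected = EXPECTED_CTR[position]
    if observed > expected * 1.2:   # clearly better than typical for the slot
        return "promote"
    if observed < expected * 0.8:   # clearly worse than typical
        return "demote"
    return "hold"

print(ctr_signal(5, 1000, 90))   # observed 0.09 vs 0.05 expected -> promote
print(ctr_signal(2, 1000, 100))  # observed 0.10 vs 0.15 expected -> demote
```

Under a model like this, an attractive, relevant snippet lifts a URL over time without a single new link, which is consistent with the charts above.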

Even before Penguin, there have been alternative approaches (beyond simply building keyword rich anchor links) to improve rank. Improving the appearance of snippets to increase CTR is one of those techniques.


Is link building dead?

The appearance of fresh links may provide a useful signal to Google – to test a URL higher up in the SERPs – but if that URL doesn’t engage the visitor, then other metrics will be poor and the URL is more likely to slip in the ranks than climb.

Heavy linkbuilding may simply hasten the decision to demote a poor-performing site, based on the engagement metrics of the targeted URL.

What do you think?

Please leave a comment below.


