Category Archives: Expert SEO

Quality Issues and Content Hijacking Messages

SEOs around the world have been following DejanSEO’s recent hijacking experiments. Some may have been tempted to try the content scraping technique for themselves, perhaps with a meta refresh or JavaScript location switch, in order to send a visitor to the preferred target. Hijacking has been around for years; however, this technique exposed interesting side benefits and insights available through Webmaster Tools.
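As an illustration of the two redirect variants mentioned above, a scraped page might bounce human visitors on to the preferred target with either of the following. This is a minimal sketch only – the target URL is a hypothetical example:

```html
<!-- Option 1: meta refresh – redirects the browser immediately (delay of 0 seconds) -->
<meta http-equiv="refresh" content="0; url=https://example.com/preferred-target/">

<!-- Option 2: JavaScript location switch – replace() also removes the scraped
     page from session history, so the back button doesn't loop -->
<script>
  window.location.replace("https://example.com/preferred-target/");
</script>
```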

However, it seems Google has caught up with the content scraper hijack today (Saturday 17th November 2012) and issued DejanSEO with a warning message:

Quality issues message from Google Webmaster tools

The red circle highlights the crux of DejanSEO’s hijacking test. Content from the target was copied verbatim to a new domain, then that new domain was given an authority link (to give it higher PageRank than the source), thereby displacing the original source in SERPs. A really interesting side effect of the hijacking was that it also gave visibility (through Webmaster Tools) of the backlinks to the original source page – very handy as an additional source alongside MajesticSEO, Ahrefs, Blekko et al.

Quality Issues Message in Webmaster Tools – Copied Content

According to Matt Cutts (July 27th 2012), only about 10 sites a day are sent unnatural link warnings on average. So I’d suspect even fewer can expect to receive the low quality issue message specifically mentioning copied content. It may be that DejanSEO raised awareness to the extent that he has received a personal message from Google…

Please comment on this post if you have seen or received a message worded in exactly the same way from the Google Search Quality Team as the one shown above. It’d be interesting to gauge the extent of these messages.

Quality Messages from Google – Hidden Text

The evolution of warning messages is fascinating. When hidden text was the ranking technique of choice, up until July 2007, Google’s first messages to be delivered in Webmaster Tools looked like this:

Quality issues message for hidden text in 2007
Back in 2007 – Hidden text warnings

Notice that Google is very open about the impact of using hidden text – stating removal from search results for at least 30 days. Google is less open now about the penalties applied to sites and pages found in violation of the guidelines, so SEOs must rely on knowledge and experience.


Search intent: Tom Cruise Height 1.70m

Google takes your search term and works hard to discover the underlying intent behind it. Bringing to bear the vast historic and current search landscape, as well as trending queries, Google interprets your search term very early in the decision process. The results ‘most likely to match your intent’ are then presented in SERPs.

This is not really news to the SEO community, but we often choose to ‘skip over’ the deeper implications and what this means to how we optimise sites and advise our clients.

In a recent interview with Emily Moxley, an example was given for how a short tail search term was augmented in order to ensure the ‘most likely’ search intent was represented in the results:


Emily explains that searches for ‘Tom Cruise’ often include a reference to Tom’s height. Google decides that searches for ‘Tom Cruise’ alone are likely to be related to his height, and should therefore include results that confirm Tom Cruise’s height is 1.70m – even though the original search didn’t include the height request. Google knows that in the majority of cases, anyone looking for Tom Cruise wants to know how tall he is.

Therefore, Universal SERPs are made up of a variety of ‘best guesses’ at the meaning or intent behind the search. The results are not simply a mix of news, videos and images related to the query term; they are a mix carefully selected to match the search intent behind it.

Try the search intent tool to gain a more complete understanding of the intention of the searcher in Google.

For example, a search at the time of writing for the term ‘sandy’ will produce a page of results relating to Hurricane Sandy. It is assumed that the intent behind the query is to discover the latest news on the hurricane rather than the definition of the word ‘sandy’ or a search about beaches.

Sandy search from Carlisle

Notice that the knowledge graph result is for a local businessman ‘Warwick Sandy’, living in Longtown, Carlisle.

The same search from Inverness produces a Google+ page for a local photographer called Fea Sandy as well as the hurricane related results:

Sandy search from Inverness

The same search ‘sandy’ from London produces a different map and a different set of results (including the Guardian in second place):

Sandy search from London

If the location is changed to ‘Bedford’, then the search results for the term ‘sandy’ include the Wikipedia entry for Sandy, the small town in Bedfordshire:

Sandy search from Bedfordshire

It is clear that the intent behind the query is determined from multiple sources:

  • Trending and historic search terms related to query e.g. sandy storm, sandy hurricane, tom cruise height
  • Search location to produce relevant results e.g. Local business, or local information
  • Previous queries and sites visited, i.e. search history

Typically, Google ensures that results include associated terms when the original intent is not clear.

More than ever before, it is important to understand the intent behind queries, then ensure your site content satisfies that intent. Google prefers to show sites that best match the intent behind the query, not simply those that match the query against the title or on-page content. Matching keywords to content isn’t enough anymore.

As Google gets better at determining what a query actually means rather than simply matching the keywords themselves, matching the intent behind the query to on-site content is more likely to produce results.

So if you are trying to rank well for ‘Tom Cruise’ then be sure you mention his height…



Google disavow tool for links released

At last, Google has released a disavow links tool. This will really help to clean up years of link building abuse that gave businesses unfair ranking advantages for a while, until Google Penguin was released.

The disavow tool is available at:

and the news of the release can be found here:

Google disavow links tool

When selecting the ‘Disavow links’ option, you are given a warning page:

Disavow links warning

Proceeding with the option, and ignoring (with some trepidation) Google’s warning that this is an advanced feature, we get an upload lightbox:

Upload disavow links file

Couldn’t resist an image test on the way 😉

Upload disavowed links in google


The text file must be smaller than 2MB, with one URL or domain per line. Comments can be included in the file if prefixed with a #.

# Sitewide links impossible to remove
# Links still remain on these pages


A line beginning with “domain:” tells Google to disavow links from all pages on the named site.
Specific pages can be disavowed by listing their complete URLs.
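Putting the format together, a complete disavow file might look like the sketch below – all of the domains and URLs shown are hypothetical examples, not real spam sources:

```
# Sitewide links impossible to remove
domain:spammy-directory.example
domain:link-farm.example

# Links still remain on these pages
http://blog.example/paid-links-post.html
http://forum.example/thread/12345
```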

Only one disavow file per site is allowed and it is shared among site owners in Webmaster Tools.

If you want to update the disavow file, you must download the existing disavow file, modify it, and re-upload it.

Don’t expect overnight results: Google suggests that it may take up to 2 weeks for the submitted URLs to be re-crawled before the disavowed links have any impact. A manual reconsideration request may also still be necessary, after allowing a period of time to pass following the disavow file upload.

So this tool may not fix everything, but at least it helps with backlink cleanup following the Penguin algorithm release and updates.

This is a major step forward in promoting best practice by Google, and I hope the tool doesn’t break!