Matt Cutts announced on Twitter that the latest iteration of the Penguin algorithm was pushed live yesterday. He states it will affect around 1% of search results.
Back in May, Matt released a video outlining what to expect in summer 2013 – it looks like the domain clustering update and the Panda ‘softening’ have combined in Penguin 2.1:
Early investigations suggest that the Penguin 2.1 update has helped clean search results of pages dominated by hundreds of deep links from the same domain – known as the clustering effect.
The reduction of clustering of search results seems most noticeable in the property sector where sites such as RightMove and Zoopla previously dominated results on pages two onwards for queries such as:
“3 bed house for sale” – Page 1
“3 bed house for sale” – Page 2
Following Penguin 2.1, the SERPs contain a maximum of two adjacent results from the same domain. This is much better for users than before and suggests Google has tackled the issue of deep-linked pages and their relevance to the query.
The UK property sites have thousands of ‘deep’ pages, all potentially relevant to a search for a 3-bedroom house, but the latest Penguin update has stopped nearly every one of these deeper pages from being shown, as was the case before.
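The adjacency rule described above can be sketched in code. The snippet below is a hypothetical illustration, not Google's actual implementation: given an ordered list of ranked results, it drops any result that would create a run of more than two consecutive entries from the same domain (the `max_run` parameter and `cap_adjacent_domain` name are my own).

```python
def cap_adjacent_domain(results, max_run=2):
    """Filter an ordered list of (domain, url) results so that no more
    than `max_run` consecutive entries share the same domain.

    Excess entries are simply dropped in this sketch; a real ranker
    would more likely demote or interleave them further down the page.
    """
    filtered = []
    run = 0            # length of the current same-domain run
    prev_domain = None
    for domain, url in results:
        if domain == prev_domain:
            run += 1
        else:
            run = 1
            prev_domain = domain
        if run <= max_run:
            filtered.append((domain, url))
    return filtered


# Example: three Zoopla pages in a row get capped at two.
serp = [
    ("zoopla.co.uk", "/3-bed/1"),
    ("zoopla.co.uk", "/3-bed/2"),
    ("zoopla.co.uk", "/3-bed/3"),
    ("rightmove.co.uk", "/3-bed/a"),
]
print(cap_adjacent_domain(serp))
```

Note that dropped entries still count towards the run, so a domain only reappears once a different domain has broken the sequence.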
This has introduced a new problem for searchers looking for variety in their results, however!
Notice that the ‘network’ of news sites shown above is based around a common template. Each result has the same boilerplate title and page layout, yet they are all considered non-spam and perfectly relevant results to return in response to the ambiguous original query.
This suggests that duplicating a site's theme and structure does not prevent it from being returned in search, so long as the content is sufficiently different. In this case, each site is dedicated to a local region, and every site from the network is shown.