What We Know About Penguin 2.0 So Far

Released on May 22nd 2013, Penguin 2.0 has so far continued the work of the original Google Penguin algorithm in tackling web spam and poor linking practices. When Penguin was first launched in April 2012, Google framed it as a way to punish sites engaged in ‘black hat’ SEO practices, from buying up links through web directories to stuffing pages and meta tags with keywords.

Penguin 2.0 aims to build on this approach, and aligns with Google Panda in making it harder for sites to inflate their organic visibility without having quality content and organic linking from high quality sites to back up their position. What, then, do we know about Penguin 2.0 so far, and what can sites do to improve their ranking if they’ve been punished by the algorithm change?

What Penguin 2.0 Does

It’s perhaps best to start with Google’s Matt Cutts, the head of the webspam team behind Penguin, to explain what some of the major features of version 2.0 are. As with the original Penguin, 2.0 is all about tackling web spam, but it goes deeper in identifying links from low quality sites. Google have also updated their Panda algorithm, which is now in a state of ‘everflux’, so gaming Google is harder than ever.

Cutts suggests that Penguin 2.0 will affect about 2.3 per cent of English-language US search queries, and will also be rolled out to other languages. The algorithm will work to promote good SEO practices from webmasters, which means generating relevant, well written content and building up a search profile based on links from high quality sites, rather than from pages identified as participating in spam or openly selling links.

Some of the areas where Penguin 2.0 will be particularly tough include advertorials, whereby sites pay to have content promoting their brands appear on other sites. Pages that don’t include ‘nofollow’ attributes on paid links, and that don’t clearly state that the promotional content is not editorial, will be punished. Cutts gives the example of payday loan queries in the UK as being particularly problematic in this respect, and lays out how Google are trying to make it easier to detect where advertorials are being misused.
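For reference, marking a paid or advertorial link is done by adding the `rel="nofollow"` attribute to the anchor tag, which tells Google the link should not pass ranking credit. A minimal sketch (the URL and anchor text below are purely illustrative):

```html
<!-- A paid/advertorial link flagged so it passes no ranking credit -->
<a href="https://example.com/product" rel="nofollow">Example Product</a>

<!-- Without rel="nofollow", Google treats the same link as an editorial vote -->
<a href="https://example.com/product">Example Product</a>
```

This is the mechanism Penguin 2.0 expects advertorial publishers to use, alongside a visible disclosure that the content is promotional.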

Other areas being tackled by Penguin 2.0 include sites that may have been hacked and are unknowingly selling links. Thankfully, Google has updated Webmaster Tools to notify webmasters whose sites appear to have been hit by malware or hacked.

Ultimately, Penguin 2.0 is all about promoting good content and earning organic backlinks – areas that have been a focus for SEO agencies since 2010, like guest blogging, author biographies and social bookmarking, will become even more important. Lingering ‘black hat’ SEO tactics, like stuffing content with irrelevant keywords or failing to clean up backlinks from low quality sites, will also become harder to sustain if Penguin recognises that your links are coming from a less than reputable source.

Sites will have to become better at organising content through good internal linking, and at becoming authorities on certain subjects; this means not accepting guest posts that are only loosely linked to the subject of a page, while also being more careful about how content is shared on social media. In practice, this will mean an even greater emphasis on quality content over flooding search results with low quality material.

In terms of data clustering and the same domain appearing over and over again in results, Penguin 2.0 and the host crowding update aim to clean up search results dominated by a single domain. Conductor has demonstrated how one domain – Bed Bath and Beyond – went from having 82 of the top 100 results for specific searches to just 7 of the top 100, as a result of the host crowding update reducing the importance of individual product pages. Webmasters will have to be more careful about assigning specific category level URLs to product pages, so that these appear in the top pages of search results as valuable results, rather than content being spread over multiple pages.

Of the sites already hit by Penguin 2.0 in May and June, the majority have been ones that were too reliant on exact match keywords and keyword stuffing, or that hadn’t tightened up their backlink profiles – more examples can be found here, but the results arguably show that some sites still haven’t learned their lesson in terms of cleaning up their backlinks, and will likely see more damage to their PageRank.

What to Do

As with Penguin 1.0 last year, the most important thing is not to panic over another algorithm update; as Matt Cutts and others have been keen to emphasise, webmasters that are producing good content, earning organic links and keeping on top of their backlink profile will generally not notice a change. There will, though, be more pressure to focus sites on boosting social signals and on getting a core of high quality links from trusted sites.

Jayson DeMers has recently recommended re-evaluating the quality of your backlinks in as much detail as possible, while making sure that site URLs and meta tags contain relevant variations on your high scoring keywords. Disavowing links should also be a regular part of an SEO review, with an emphasis on submitting reconsideration requests to Google until it’s clear that you’re no longer being punished for suspicious links.
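Google’s disavow tool accepts a plain text file listing the individual URLs or whole domains you want it to ignore when assessing your backlink profile. A minimal sketch of the format (the domains and URLs shown are placeholders, not real examples):

```text
# Low quality directory that refused to remove our links
domain:spammy-directory.example.com

# Individual pages we were unable to get taken down
http://low-quality-blog.example.com/paid-links-page.html
http://link-network.example.net/widget-footer-links.html
```

Lines beginning with # are comments; a `domain:` prefix disavows every link from that domain, while a bare URL disavows only that page. The file is uploaded through the disavow links tool in Webmaster Tools.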

What’s going to become more and more important over the next few months is finding ways to improve the authority of your site within a specific niche; this means looking at everything from blog content to whether or not your landing and product pages are properly indexed and set up with internal links. Contextual and thematic links from other pages will also have to be checked to ensure that there’s a natural relationship between you and another site, rather than it appearing like you’ve paid or exchanged links.

Making use of social networks and creating genuine interest in your product will continue to be crucial. Don’t neglect Google Plus: it makes sense to satisfy Google by building up your profile on its own network, even if the size of its user base compared to other networks makes it a less obvious choice. Similarly, you need to identify ways to strengthen your local links, which means working harder to build relationships with other brands and news sites that can offer you organic, high value links.

In this context, Penguin 2.0 doesn’t represent a radical change from the version that webmasters have been dealing with since 2012, but it does reinforce how seriously Google is taking web spam. Hanging on to tactics like paying SEO agencies to generate thousands of backlinks on low quality sites and directories shouldn’t even be considered. While it’s going to be harder to stand out and take shortcuts with PageRank, building your site’s visibility around Google’s guidelines is going to be unavoidable.

Google Releases “Help for Hacked Sites” to Combat Compromised Websites

In an attempt to reduce the number of hacked or compromised sites, Google has released an online tutorial series for webmasters dubbed “Help for hacked sites”.

There are 8 different steps in the recovery process, each of which is targeted towards a different level of user. If you are unfamiliar with any of the steps outlined in the guide, then Google recommends contacting your hosting provider or support team to let them handle the issue.


Even the most tech savvy webmasters occasionally fall prey to vulnerabilities in popular scripts. This is particularly true of CMS platforms like WordPress or Joomla, with one of the more memorable issues in recent history being the heavily exploited vulnerability in TimThumb.

If you’ve recently seen any of the following messages in Google’s SERPs for your website:

  • This site may be compromised
  • This site may harm your computer
  • Reported attack page!
  • Phishing (web forgery) suspected
  • Notice of suspected hacking

then your website may be compromised.


Head on over to http://www.google.com/webmasters/hacked/ to learn more about the removal process or start by watching the video below.

Does Guest Blogging Really Work? – A Live Case Study

Now that guest blogging has become an SEO’s weapon of choice, there has been a substantial increase in the level of outreach being undertaken on a day to day basis (and subsequently a decrease in the quality of outreach and content). The question at hand today is: does guest blogging really work, and if done correctly, how effective is it?

The Challenge & The Benchmark

Starting from the 7th of January 2012, I will attempt to write 30 guest posts (although 20 will probably be my limit) over the course of 60-90 days, and have them all published by the 7th of April. All content will be unique and anchor texts will be diversified. I will be completely transparent about all the posts, and they will be available to read below.

Continue reading “Does Guest Blogging Really Work? – A Live Case Study”

Deciphering The Google Reconsideration Request Responses

For those who feel they may have been hit by a penalty, Matt Cutts has released a video explaining the various responses you can expect to receive. In the wake of the numerous Panda/Penguin updates, Google also released the disavow links tool (16th October 2012). As a result, the response time for a reconsideration request may have increased while webmasters purge their low-quality backlinks and disavow the links they were unable to remove.

Continue reading “Deciphering The Google Reconsideration Request Responses”

EMD, Panda & Penguin – The “Trilogy of Search Terror”

Over the past few weeks Google have released an onslaught of new updates dedicated to reducing spam and increasing the quality of their organic results. As if releasing the EMD and Panda updates back to back wasn’t bad enough, Google decided to release their latest data refresh of Penguin yesterday (5th October 2012). Today I’m going to be covering what these new updates mean for you and discussing the close proximity of all three updates dubbed “The Trilogy of Search Terror” by Search Engine Watch.

The Timeline

  • Google Panda “20” – 27th September 2012
  • The EMD Update – 28th September 2012
  • Google Penguin 3 – 5th October 2012

Continue reading “EMD, Panda & Penguin – The “Trilogy of Search Terror””