Released on May 22nd 2013, Penguin 2.0 has so far continued the work of the original Google Penguin algorithm in attempting to tackle web spam and poor linking practices. When Penguin was first launched on April 24th 2012, Google framed it as a way to punish sites that were engaged in ‘black hat’ SEO practices, from buying up links through web directories to stuffing pages and meta tags with keywords.

Penguin 2.0 aims to build on this approach, and aligns with Google Panda in making it harder for sites to inflate their organic visibility without having quality content and organic linking from high quality sites to back up their position. What, then, do we know about Penguin 2.0 so far, and what can sites do to improve their ranking if they’ve been punished by the algorithm change?

What Penguin 2.0 Does

It’s perhaps best to start with Google’s Matt Cutts, head of the company’s webspam team, to explain the major features of version 2.0. As with the original Penguin, 2.0 is all about tackling web spam, but it goes into more detail over what counts as a link from a low quality site. Google have also updated their Panda algorithm, which is now in a state of ‘everflux’, so gaming Google is harder than ever.

Cutts suggests that Penguin 2.0 will affect about 2.3 per cent of English-US search queries, and will also be rolled out to other languages. The algorithm works to promote good SEO practices from webmasters: generating relevant, well written content, and building up a search profile based on links from high quality sites rather than from pages that have been identified as participating in spam or openly selling links.

Some of the areas where Penguin 2.0 will be particularly tough include advertorials, whereby sites pay to have content promoting their brands appear on other sites. Pages whose paid links don’t carry the ‘nofollow’ attribute, and that don’t clearly state that the promotional content is not editorial, will be punished. Cutts gives the example of payday loan queries in the UK as being particularly problematic in this respect, and lays out how Google are trying to make it easier to detect where advertorials are being used in the wrong ways.
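As a rough illustration of the markup involved (the URL below is invented), a paid or advertorial link can carry the ‘nofollow’ value in its rel attribute, which tells Google not to pass ranking credit through it:

    <!-- Paid placement: rel="nofollow" stops PageRank flowing through the link -->
    <a href="http://example-brand.com/new-product" rel="nofollow">Example Brand’s new product</a>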

Other areas being tackled by Penguin 2.0 include sites that have been hacked and are unknowingly selling links. Thankfully, Google has updated Webmaster Tools to notify webmasters whose sites appear to have been hacked or infected with malware.

Ultimately, Penguin 2.0 is all about promoting good content and achieving organic backlinks – areas that SEO agencies have focused on since 2010, such as guest blogging, author biographies, and social bookmarking, will become even more important. Lingering ‘black hat’ SEO tactics, like stuffing content with irrelevant or unnatural keywords, or failing to clean up backlinks from low quality sites, will become harder to sustain if Penguin recognises that your links are coming from a less than reputable source.

Sites will have to become better at organising content through good internal linking, and at becoming stronger authorities on particular subjects; this means not accepting guest posts that are only loosely related to the subject of a page, and being more careful about how content is shared on social media. In practice, this means an even greater emphasis on quality content over flooding search results with low quality material.

Penguin 2.0 also arrived alongside Google’s host crowding update, which tackles data clustering – the same domain appearing over and over again in a set of results. Conductor has demonstrated how one domain, Bed Bath and Beyond, went from holding 82 of the top 100 results for specific searches to just 7 of the top 100 as a result of the host crowding update reducing the weight of individual product pages. Webmasters will have to be more careful about assigning specific category level URLs to product pages, so that these appear in the top pages of search results as valuable results, rather than content being spread over many near-duplicate pages.
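To make the distinction concrete (these URLs are invented for illustration), a category level structure groups products under a descriptive path that can rank as one strong result, whereas a flat structure leaves every product competing as a separate, anonymous page:

    A flat structure, hard to consolidate:
    http://example-store.com/product?id=48291

    A category level structure, easier to surface as a single valuable result:
    http://example-store.com/bedding/sheets/
    http://example-store.com/bedding/sheets/percale-queen-set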

Of the sites already hit by Penguin 2.0 in May and June, the majority have been ones that were too reliant on exact match keywords and keyword stuffing, or that hadn’t tightened up their backlink profiles. More examples can be found here, but the results arguably show that some sites still haven’t learned their lesson when it comes to cleaning up their backlinks, and will likely see further damage to their PageRank.

What to Do

As with Penguin 1.0 last year, the most important thing is not to panic over another algorithm update; as Matt Cutts and others have been keen to emphasise, webmasters who are producing good content, earning organic links, and keeping on top of their backlink profiles will generally not notice a change. There will be more pressure, though, to focus sites on boosting social signals and building a core of high quality links from trusted sites.

Jayson DeMers has recently recommended re-evaluating the quality of your backlinks in as much detail as possible, while making sure that site URLs and meta tags contain relevant variations on your high scoring keywords. Disavowing links should also be a regular part of an SEO review, with an emphasis on filing reconsideration requests with Google until it’s clear that you’re no longer being punished for suspicious links.
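For reference, the file uploaded through Google’s Disavow Links tool is plain text: one URL or domain per line, with a ‘domain:’ prefix to disavow a whole site and ‘#’ marking comment lines. The domains below are invented examples:

    # Contacted these sites in June 2013 with no response
    domain:spammy-directory-example.com
    domain:paid-links-example.net
    # Disavow a single offending page rather than the whole domain
    http://low-quality-example.org/links/page1.html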

What’s going to become more and more important over the coming months is finding ways to improve the authority of your site within a specific niche; this means looking at everything from blog content to whether your landing and product pages are properly indexed and set up with internal links. Contextual and thematic links from other pages will also have to be checked, to make sure there’s a natural relationship between your site and the linking site rather than the appearance of paid or exchanged links.

Making use of social networks and creating genuine interest in your product will continue to be crucial. Don’t neglect Google+: it makes sense to satisfy Google by building up a profile on its own network, even if its number of users doesn’t yet compare to that of other networks. Similarly, you need to identify ways to strengthen your local links, which means working harder to build relationships with other brands and news sites that can offer you organic, high value links.

In this context, Penguin 2.0 doesn’t represent a radical change from the version webmasters have been dealing with since 2012, but it does reinforce how seriously Google is taking web spam. Hanging on to tactics like paying SEO agencies to generate thousands of backlinks on low quality sites and directories shouldn’t even be considered. It’s going to be harder to stand out or take shortcuts with PageRank, and building your site’s visibility around Google’s guidelines is going to be unavoidable.