Down to business: the recent updates have been a win for Google, as many blackhat sites were penalized. Google’s continued dedication to penalizing sites they believe are violating their TOS is starting to pay off, if you discount the huge collateral damage. It was quite a sight to observe such widespread destruction; I’ve not seen an update hit this hard before. It was the web equivalent of the atomic bomb.
So why am I so excited?
Well, I’ll let Jeff Goldblum explain…
In many parts of the world, including California, there are pine trees such as the Monterey Pine. These pines have evolved not only to cope with forest fires, but to thrive on them, relying on fire to spread their seeds and germinate. A forest fire which seemingly wipes out life within an area becomes an opportunity to grow and multiply.
Like the Monterey Pine during a forest fire, the Google landscape changes by destroying the authority sites, but this brings new opportunity to grow new and better-evolved websites which can adapt to the new environment. Forgive my lecture on evolution; what I’m trying to say is that there are a lot of very open SERPs to be dominated right now, and it has never been this uncompetitive.
The space available in which to grow your new website is immense and those able to adapt to change will find Google’s updates extremely profitable.
My thoughts and findings on the update, ready?
Google is a sly dog. They know we’re watching, cross-referencing and looking for correlations between sites that were hit and differences in sites that survived their updates. Google has a history of not disclosing details, and as such I believe they often release more than one update on the same day in order to make it much harder for us to understand each update. I think this is why we often can’t see similarities between some sites hit by a Google update: they were hit by a different spam filter released at the same time.
As always, we can only speculate from observations and gut feelings, but if I were to place a bet on how things have gone down, these are some of the directions I think Google has taken with this update. I’m not saying all of these observations are correct, but I’m certain some of them are.
But first, to point you guys in the right direction, let’s look at the history of Penguin.
Penguin works on a page-by-page basis; if you’ve been hit by Penguin, you may notice that only part of your rankings were hit. If you separate your keywords by landing page, you’ll likely notice that your site was hit only on specific pages, while other pages have their rankings intact. Penguin 1.0 only targeted your homepage and the backlinks pointing to it; Penguin 2.0 dove deeper and started to look at your inner pages too. Then, back in May, Google released this video in which Matt Cutts talks about going “upstream” and examining even further into the backlink profile; what he is talking about is tiered linkbuilding.
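To make the page-by-page diagnosis concrete, here is a minimal sketch of how you might group rank changes by landing page from a rank-tracker export. The data, page paths and the rank-drop threshold are all hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical rank-tracker export: (keyword, landing page, rank before, rank after).
rankings = [
    ("blue widgets", "/", 3, 45),
    ("buy blue widgets", "/", 5, 60),
    ("widget reviews", "/reviews/", 4, 4),
    ("best widgets 2013", "/reviews/", 7, 8),
]

# Group the rank change (positive = dropped) by landing page.
deltas_by_page = defaultdict(list)
for keyword, page, before, after in rankings:
    deltas_by_page[page].append(after - before)

# Pages whose keywords all dropped sharply are the likely Penguin targets;
# pages with stable averages were probably untouched.
for page, deltas in sorted(deltas_by_page.items()):
    avg = sum(deltas) / len(deltas)
    status = "likely hit" if avg > 10 else "intact"
    print(f"{page}: average rank change {avg:+.1f} ({status})")
```

If one path shows a large average drop while others sit near zero, that page-level split is the Penguin signature described above.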
Tiered linkbuilding is the next logical step
One of the most prolific linking techniques right now is tiered linkbuilding and blog networks. Many blog networks are supported by tiered linkbuilding, and by taking Penguin that step further into backlink profiles we can see how Penguin could have such a devastating effect on many sites. This isn’t to say tiered linkbuilding is dead; I’m a firm believer in testing and actually trying things before drawing any conclusions, but my gut tells me your tiered linkbuilding will need to be squeaky clean to be effective.
The disavow database is now active
Another hit on tiered linkbuilding and automated webspam: I believe Google has actioned the disavow database they’ve been collecting. This database will consist of all those domains on the GSA, Ultimate Demon and Scrapebox lists you’ve been spamming to. All of those lists and sites are now likely devalued to a large degree, and your site may even be penalized if a high number of them link to it. Personally, I won’t be using automated spam tools anymore, partly because I feel they hurt my sites and partly because there are now better ways to achieve rankings.
Anchor text is still a key area for Penguin
Many of my sites were hit; however, many of them were from my automated spam tool days, when for many of the sites I hadn’t managed anchor text properly, so I knew something would catch up with them eventually. A few were spared, and after digging into them I found that one of the most glaring differences was how well I’d worked the anchor text. One of the sites which increased in ranking by a large amount after Penguin had only 6% anchor text targeting my main keyword, 34% was Other, and most of the anchor text was unrelated to my keywords or was variations of the raw URL. It was the only domain with this much diversity, which brings me onto my next section.
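Checking your own exposure is a simple counting exercise. Below is a sketch of measuring the exact-match share of an anchor text profile; the anchor list and target keyword are invented examples, not real data:

```python
from collections import Counter

# Hypothetical backlink export: one anchor text per inbound link.
anchors = [
    "blue widgets", "example.com", "http://example.com/", "click here",
    "www.example.com", "read more", "Example", "blue widgets",
    "great resource", "here", "http://example.com/", "this site",
    "example.com", "visit website", "homepage", "useful post",
]
target = "blue widgets"  # the money keyword under scrutiny

# Share of links using the exact target keyword as anchor text.
counts = Counter(anchors)
exact_share = 100 * counts[target] / len(anchors)
print(f"exact-match share for '{target}': {exact_share:.1f}%")
```

A profile dominated by brand names, raw URLs and generic phrases, with exact match in the single digits to low teens, matches the diversity that appears to have survived the update.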
Partial match anchor text
Moz recently published this article on ranking factors for 2013; unsurprisingly, linking root domains with partial-match anchor text was quite high up on the list. Google seems to have turned up Penguin’s sensitivity a notch, and I still believe anchor text is playing a role in penalizing sites. From now on, your anchor text needs to be squeaky clean.
Hummingbird affected rankings as well as Penguin
I also noticed that while some sites passed Penguin, they had already been hit when Hummingbird was released. Hummingbird expanded Google’s Knowledge Graph and understanding of queries, so it seems logical this had an effect on how Google’s algorithm understands websites and webspam. Hummingbird in combination with Penguin seems like a great one-two combo for identifying spam extremely well. While Hummingbird is not a spam filter, it makes sense that it will still affect sites’ rankings.
Link velocity seems to be a big factor
The most obvious and glaring factor I’m currently observing is how sites with very few links and low link velocity are now ranking in the top 10 across all niches, including the most competitive. It seems that gaining 20 powerful links can now get you ranked for keywords that would have needed 30,000 links just a few years ago. So slow, steady, powerful, natural-looking links are key to top rankings in the current environment.
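Link velocity is just new links per unit of time, so it is easy to chart from crawl data. A minimal sketch, assuming you have the first-seen date for each backlink (the dates below are made up for illustration):

```python
from collections import Counter
from datetime import date

# Hypothetical crawl data: the date each backlink was first seen.
first_seen = [
    date(2013, 7, 3), date(2013, 7, 18),
    date(2013, 8, 2), date(2013, 8, 21), date(2013, 8, 30),
    date(2013, 9, 9), date(2013, 9, 25),
]

# Count new links per calendar month.
links_per_month = Counter((d.year, d.month) for d in first_seen)
for (year, month), n in sorted(links_per_month.items()):
    print(f"{year}-{month:02d}: {n} new links")
```

A flat curve of a handful of links per month reads as natural acquisition; a spike of hundreds in a single month is the velocity pattern this update appears to punish.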
An attack on blog networks and SAPE
Google likely also had their manual raters out in force to identify blog networks and SAPE sites around the web in preparation for Penguin 2.1’s release date. This type of manual work by the webspam team seems to be on the increase as more and more highly competitive niches show signs of being manually filtered for blackhat sites.
This update overall looks like a quadruple whammy of factors
…made to leave a brown stain in your pants. But as I said before, I’m not worried or concerned. If you gave me the choice between today’s environment and what Google was when you needed 100,000 links to rank highly, I’d always choose what we have now. It sucks if you have SEO clients and authority sites which have now been destroyed, but think of it this way: you now have the opportunity to build something that capitalizes on your competitors’ mistakes and opens you up to new possibilities. Oh, and last but not least, this is where blackhat thrives. Nothing ranks faster than a blackhat site when authority sites have been destroyed, so you have the opportunity to rank and bank very easily right now.
Stay tuned for an upcoming post on methods of ranking post-Penguin 2.1
I’ll be covering how I’ll be approaching linkbuilding for authority sites in an upcoming blog post aimed at helping you settle into this new environment. I’d also like to say I’m not claiming to know the secret to Penguin 2.1, but what I will claim is that there are definite kinks in Penguin’s armor, and by simply playing smart you have a very good chance of achieving and exceeding the rankings you may have once had.