The Google webmaster blog contains the following quote:
If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site’s backlinks and disavow links that are the result of link schemes that violate Google’s guidelines.
That’s a little vague, isn’t it? If you *believe* your site was affected because of spammy or low-quality links then sure, give the tool a try. But the quote doesn’t give us a lot of information on how effective this tactic would actually be.
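For anyone who does decide to try it, the disavow file itself is simple: a plain UTF-8 text file uploaded through Webmaster Tools, with one URL or one `domain:` entry per line, and `#` lines treated as comments. A minimal sketch (the domains here are placeholders, not real examples of spam):

```text
# Contacted the webmaster of spam.example.com on 10/1/2012
# asking for link removal; no response. Disavowing the whole domain.
domain:spam.example.com

# Disavowing a single paid link on an otherwise fine site
http://blog.example.org/some-sponsored-post.html
```

Using the `domain:` prefix is usually safer than listing individual URLs when a site is wholly spammy, since sitewide links can live on thousands of pages you’d otherwise have to enumerate.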
I recently came across this video of a Google Webmaster hangout held on October 19, 2012 by John Mueller. Starting at the 12:30 mark in the video, John answers a question about whether the disavow tool could be used to help a site affected by Penguin. (I have transcribed it below.)
Question:…Is the disavow tool good to remove [a] Penguin penalty from a domain?
John Mueller: So, generally speaking, the Penguin algorithm is a webspam algorithm that can also take into account links like that, so if you use the disavow tool that would be similar to adding a nofollow to those links which would take those out of the way that we process those links from an algorithmic point of view. So, with that in mind if the Penguin algorithm has been picking up those links for your site and kind of using those appropriately, then using the disavow tool will take those out of the use by the Penguin algorithm.
So, that should work.
But again, this is something that is a little bit, I guess, advanced, where you really have to watch what you are doing there. Most websites shouldn’t need to use this tool and you should also keep in mind that it takes quite a bit of time for this to update. So on the one hand we have to recrawl those links and on the other hand we have to update the Penguin algorithm data as well…so, it’s something where you wouldn’t see a change from one day to the next.
Here is what I thought was interesting after watching this hangout video:
- The Penguin algorithm takes links into account. This is not new information, as we all know that Penguin primarily targets self-made links with gross abuse of keywords as anchor text. But this statement from John implies that several other factors contribute to Penguin as well.
- Disavowing the bad links to a Penguin-affected site should help that site recover.
- A site will not recover until Penguin runs again.
If it is true that removing bad links to a site can cause it to escape Penguin, then I wonder why we did not see credible case reports of Penguin recovery after the last refresh. Surely there are webmasters out there who managed to get the majority of their spammy links removed. Perhaps you need to remove ALL of your spammy backlinks and no one did that? Perhaps sites that had bad backlinks were previously ranking on the power of those links and will never see a return to those rankings now that the links are removed or disavowed? But surely there have to be some sites out there with a mixture of good links and “penguinized” links that would have seen some sort of recovery after the refresh.
This makes me wonder what the other factors in the Penguin algorithm are. What do you think?
Penguin photo by wwarby, flickr.