Google Penalty Recovery

What do you do when your Google organic traffic drops overnight? This story is a real-world example of a drop and the subsequent recovery.

The site, a main Stihl dealer in Ireland, had been doing reasonably well for the previous year. There had been no changes to the coding of the site in well over 6 months. The owners realised that they were not getting as many enquiries as they had been. A look at Google Analytics revealed this:

Google organic data

A 75% drop in organic traffic overnight. Every online business's worst nightmare.

Cue a call to yours truly.

How to diagnose a traffic drop

Whenever I see a huge drop in traffic like that, my first port of call, after asking whether any recent changes have been made to the site structure, is to cross-reference it with the Moz Google Algorithm Change History.

It was clear that the drop in traffic occurred on 19th August.

Update History Screenshot

No relevant changes showing around the correct dates.

So there is no useful information there. Does that mean there was no Google penalty? We can't rule one out yet.

Next stop: Google Webmaster Tools.

Google WMT screenshot

No manual penalty in place.

Here we can check whether a manual Google penalty has been applied. Manual penalties are where a site has been reviewed and punished by an actual human being. They are often, but not always, preceded by warnings.

In our case here, there is no such action. While this may seem like a frustrating dead end in some respects, it is certainly not a bad thing!

What do we know now?

We know that whatever caused the drop in traffic was algorithmic but did not coincide with a major algorithm update from Google. A word of warning here, though, to site owners in Ireland and the UK: our change dates often don't exactly match what you will see in the Moz change history. That said, it's still about the best resource available for tracking changes.

There are over 500 changes made to the Google algorithm every year, but only a few major ones. Of course, if your site is negatively affected, then a minor change isn't minor to you. On top of this, updates do not roll out globally at the same time, and some only target specific geographic areas or specific market verticals.

As it happens, in this case I noticed some fluctuation in quite a few sites I manage on or around the same dates, but nothing that stuck out the way this site did. Of course, if you just have one or two sites, it's not possible to notice general fluctuation like that.

Back to Webmaster Tools.

The next place to look is in "Crawl -> Crawl Errors". Bingo! More than 500 HTTP 500 server errors showing. A 500 error means that something went wrong with the page load on the server side and, in short, the page is broken.

Webmaster Tools will show you the problem pages, but not tell you why they are broken. In this case the main problem was 500 errors, but in your case you may have a large number of 404 errors (page not found), where you have links to pages that no longer exist on your site. Soft 404s can also be a problem. This is usually where a page shows as "not found" (or "no products available in this category" etc.) to a visitor, but returns an HTTP status code of 200 (OK) instead of the correct 404.
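If you want to sanity-check what Webmaster Tools is reporting, or test a page it doesn't list, you can request a few URLs yourself and look at the status codes. Here is a minimal sketch in Python, assuming the third-party requests library; the example.com URLs are placeholders, not real pages from the site:

    # Minimal sketch: check what status codes a few pages actually return.
    # Assumes the third-party "requests" library; the URLs below are placeholders.
    import requests

    urls = [
        "http://www.example.com/some-removed-product",   # hypothetical removed product
        "http://www.example.com/a-live-product",          # hypothetical live product
    ]

    for url in urls:
        response = requests.get(url, allow_redirects=False, timeout=10)
        status = response.status_code
        # A "soft 404": the visitor sees a "not found" style message,
        # but the server still answers 200 (OK).
        text = response.text.lower()
        looks_empty = "no products available" in text or "not found" in text

        if status == 500:
            print(f"{url} -> 500 server error (broken page)")
        elif status == 404:
            print(f"{url} -> 404 not found (correct for removed pages)")
        elif status == 200 and looks_empty:
            print(f"{url} -> 200 but looks like a soft 404")
        else:
            print(f"{url} -> {status}")

The allow_redirects=False part matters: it lets you see the first response the server gives, rather than wherever a redirect eventually lands.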

Why is an HTTP 500 error bad news?

Google wants to give its users a good experience. If I search for a Stihl MS 391 chainsaw (to keep with the current example) then I want to find a reliable site. If I don't, then I think Google is crap and consider using Bing instead (OK, OK, so I'm only kidding about Bing, but you get the idea). Quality is important to Google to keep them at the top of their game. If a site has broken pages, then the potential for a bad experience increases. It's in Google's interest to bury those sites… and that's just what's happened here.

That Stihl chainsaw used to rank at position 3; it fell to position 12 overnight, and then further to position 36! This was repeated across the whole site for just about every product.

Diagnosing with Moz

I like Moz. Not everything about it, and it's not entirely free from glitches all the time, but it's still a fine tool. It is a paid tool though, and at a minimum of $100/month, you need to be able to justify it. If you don't use, or are unlikely to use, Moz, feel free to skip this section.

Now that we have been given a direction from Webmaster Tools (WMT), we can look at rankings in Moz, which confirms that individual pages have dropped and that it wasn't just that everybody stopped using the internet for a few days. More importantly, we can look at the "Crawl Diagnostics" section.

In this case it was showing the same 500+ HTTP 500 server errors and also 6,700 302 redirects. A 302 redirect is a temporary redirect. There were no 404 errors showing. Moz will show you which pages are affected, but I prefer to use…
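You can run the same kind of manual spot check on a redirect before reaching for a crawler: request the page without following redirects and look at the status code and the Location header. Again, this is just a rough Python sketch (not part of Moz), assuming the requests library and a placeholder URL:

    # Minimal sketch: see whether a URL redirects, how (301 vs 302), and where to.
    # Assumes the third-party "requests" library; the URL is a placeholder.
    import requests

    url = "http://www.example.com/checkout"  # hypothetical page that redirects to login
    response = requests.get(url, allow_redirects=False, timeout=10)

    if response.status_code in (301, 302):
        kind = "permanent (301)" if response.status_code == 301 else "temporary (302)"
        print(f"{url} redirects ({kind}) to {response.headers.get('Location')}")
    else:
        print(f"{url} returned {response.status_code}, no redirect")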

Screaming Frog SEO Spider

This is one of my favourite tools. There is a free version, but I highly recommend spending the £99 sterling yearly licence fee for the extra capability. Screaming Frog allows you to crawl your entire site and filter for response codes (200, 301, 302, 404, 500, etc.). It will also allow you to see inbound links to problem pages and outbound links from them. Fantastic!

Screaming Frog can do a lot more too, but that's the relevant part for this particular job.
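If you are curious what "crawl your entire site and filter for response codes" boils down to, the core idea is simple enough to approximate yourself: walk the site link by link, record each URL's status code, and remember which page linked to it. The sketch below is a deliberately small Python approximation of that idea, not Screaming Frog itself; it assumes the requests and beautifulsoup4 libraries and a placeholder start URL:

    # Rough approximation of what a site crawler does: walk the site,
    # record each URL's status code, and remember which page linked to it.
    # Assumes "requests" and "beautifulsoup4"; the start URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    start_url = "http://www.example.com/"           # hypothetical site
    domain = urlparse(start_url).netloc

    to_visit = [(start_url, None)]                  # (url, page that linked to it)
    seen = set()
    results = {}                                    # url -> (status code, referrer)

    while to_visit and len(seen) < 200:             # small cap, since this is a sketch
        url, referrer = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)

        response = requests.get(url, allow_redirects=False, timeout=10)
        results[url] = (response.status_code, referrer)

        # Only parse links from pages that actually returned HTML.
        if response.status_code == 200 and "text/html" in response.headers.get("Content-Type", ""):
            soup = BeautifulSoup(response.text, "html.parser")
            for a in soup.find_all("a", href=True):
                link = urljoin(url, a["href"])
                if urlparse(link).netloc == domain:
                    to_visit.append((link, url))

    # "Filter for response codes": list the problem pages and who links to them.
    for url, (status, referrer) in results.items():
        if status >= 400 or status in (301, 302):
            print(f"{status}  {url}  (linked from {referrer})")

A real crawler also handles robots.txt, rate limiting, redirect chains and far larger sites, which is exactly why the licence fee is worth paying; the sketch just shows the underlying idea.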

Analysis And Fixes

Using Screaming Frog, it is easy to see which pages are causing the 500 errors. It turns out, in this case, that these are almost entirely pages where a product has been removed. These pages should be returning a 404 (not found) page.

The site runs on OSCommerce. It also has a plugin to make "SEO friendly URLs" called "Ultimate SEO URLs". This appears to be the problem, since turning off the plugin fixes it (confirmed by a re-crawl with Screaming Frog). I could have spent a lot of time working out why the plugin was causing problems (it may have been interacting badly with another plugin, may not have been coded well in the first place, or may have been badly installed on this site). Too much work for too little reward! In any case, URLs stuffed with keywords are a bad thing. Even nicely formed URLs with the product name and no spammy elements do not significantly out-perform the standard OSCommerce URLs, which, although messy looking, work just fine.

The analysis also shows the huge number of 302 redirects. These come from a number of places. Firstly, there is no default 404 for a removed product in OSCommerce; instead, there is a 302 redirect to a "No product found" page, which returns a 200. This is bad form, so I made the "no product found" pages return a 404.

Next, trying to make a purchase redirects with a 302 to the login page. So does leaving a review. There is no need for search engines to even see those pages. Adding a nofollow to the buttons and a noindex tag to the login, shopping cart and leave-a-review pages, etc., fixes that issue. All of a sudden we go from 6,700 302 redirects to 0!
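The actual changes live inside OSCommerce itself, but once they were in place the behaviour was easy to confirm from outside. A small Python check, again assuming the requests library and placeholder URLs:

    # Minimal sketch: confirm the two fixes from outside the site.
    # Assumes the third-party "requests" library; both URLs are placeholders.
    import requests

    # 1. A removed product should now answer with a real 404,
    #    not a 302 redirect to a page that returns 200.
    removed_product = "http://www.example.com/some-removed-product"
    response = requests.get(removed_product, allow_redirects=False, timeout=10)
    print("removed product status:", response.status_code)            # expect 404

    # 2. Login, cart and review pages should now carry a noindex robots tag.
    login_page = "http://www.example.com/login.php"
    response = requests.get(login_page, timeout=10)
    print("login page has noindex:", "noindex" in response.text.lower())  # expect True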

The result

With zero 500 errors and zero 302 redirects, a look at Analytics showed us this:

Google analytics graph

A complete recovery

As the site got re-crawled by Google it made a complete recovery. Actually, it improved slightly. There is still a bit more tuning to go, but the major problem has been resolved.

Key Points to Ponder

  • Getting on the wrong side of Google can happen at any time – not just when a major update is released.
  • Even if your site has been performing well, there may be problems with its architecture that will catch up with you.
  • Site architecture means more than just the “filing system” you use on your site. It also applies to things like redirects, canonical URLs, and of course major problems like 500 errors.
  • Google doesn’t generally drop / penalise a site without good reason. That reason may be hard to find, but it will be there.
  • Just because most of your site behaves perfectly well and looks good when you view it does not mean that it is safe. Most users of the site would never have seen a 500 page, or been aware that a 302 redirect happened when they went to make a purchase. The site worked. However, it did not work correctly, and there was the potential for people to land directly from Google on pages with a server error and see a blank screen. It used to be that “problem pages” on a site did not majorly affect the performance of the site as a whole. That is no longer the case. If you have weak areas on your site, they have the potential to damage your entire site.

About Ian Wortley


Ian is one of Ireland's leading SEO consultants, with over 8 years' experience in the Irish market. Ian discovered his love of all things internet-related in 2005 and has been living nearly exclusively online since then, causing much worry to friends and family, but getting great results for clients. Contact Ian: Phone: 01 6854806 Mobile: 086 3817149

One thought on “Google Penalty Recovery”

  1. Great article. Website owners need to keep a close eye on this, as their business could be wiped out overnight. The Webmaster Tools manual action section is a must for everyone to look at, even if you are not a techie person but own an online business. My suggestion is to set up a Webmaster Tools profile and get your developer to add the verification code for you so you can keep track of this. At the end of the day, it is your business that will suffer.
