What would you say if I told you that your current salary was going to be cut in half? What if I didn’t even bother telling you why?
Well, that’s exactly what happens when website owners get hit with one Google search algorithm update or another. You’ll be rolling along with wonderful, organic search traffic for months, and then suddenly, in a single day, your incoming traffic from Google drops by nearly fifty percent. That income you’ve been getting used to – the advertising revenue that’s been helping you put food on the table and pay the bills? That gets cut by nearly as much. You feel doomed and helpless, with nowhere to turn.
You certainly can’t ask Google, because Google isn’t a person. In fact, there isn’t even a help line or tech support. Google creates and modifies its search algorithm however it likes, using whatever rules or bright ideas its algorithm teams come up with to catch those nasty spammers. Unfortunately, more often than not, they catch well-meaning, hard-working website owners in the process. It’s a tragedy, but it happens. And since Google’s search engine remains the biggest source of traffic for most websites out there, this is likely to keep happening for the foreseeable future.
So, what do you do when you’ve been hit by an algorithm update, be it Panda, Hummingbird, or some future animal of the search engine variety? In this article, I’m going to show you how to run a quick investigation of your search traffic to identify the problem areas of your site in Google’s eyes, so that you can get to work fixing whatever may be wrong with those pages and recover your traffic as soon as possible.
Investigating Traffic Drops With Google Analytics
It’s that sickening feeling when the traffic falls out from under you. You thought you were doing everything right, but then out of the blue you lose 20-30% of your traffic without warning. No one wants to check their Analytics account to find something like this waiting for them.
However, if you’ve owned a website long enough, then you’ve experienced it. It’s no fun, but it’s not the end of the world. It’s just a matter of hunting down the clues to figure out what it is about certain pages on your website that Google finds so horrible. Of course, before you can do that, you need to find out exactly what pages Google doesn’t like.
To do this, you need to track which pages lost the most traffic after the Google algorithm change. The data you need starts with a date just before and a date just after the major drop. In Google Analytics, hover your mouse along the traffic graph somewhere before the drop, and make a note of that date.
Next, hover the mouse on the line somewhere after the drop and make a note of that date. In the graph above, the huge drop came on November 17th, so the slice of time before would be Nov 14-16, and the slice of time after would be November 18-20.
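The window math above is simple enough to do in your head, but if you script any part of this workflow, a minimal Python sketch of the same bookkeeping looks like this (the three-day window size is just this example’s choice, not a Google Analytics requirement):

```python
from datetime import date, timedelta

def comparison_windows(drop_date, days=3):
    """Return (before, after) date ranges bracketing a traffic drop.

    Each range is a (start, end) tuple of dates. The drop day itself
    is excluded from both windows so partial-day data doesn't skew
    the comparison.
    """
    before = (drop_date - timedelta(days=days), drop_date - timedelta(days=1))
    after = (drop_date + timedelta(days=1), drop_date + timedelta(days=days))
    return before, after

# The example from the article: drop on November 17th
before, after = comparison_windows(date(2013, 11, 17))
# before spans Nov 14-16, after spans Nov 18-20
```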
Use Landing Pages From Google Traffic
The next step is to figure out which pages on your site shifted the most — but only in terms of traffic from Google. You want to ignore everything else. To do this, create a custom report and make sure that you at least have Pageviews or Visits as a metric, and use Landing Page as a dimension.
Finally, you want to filter for only traffic with a Source of “google” and a Medium of “organic”. That filter is set up as shown above. Save this custom report.
Next, at the top of the chart itself, you want to set the date range of the graph to the 2-3 days just after the traffic drop from the algorithm change. Then, click the checkbox next to “Compare to”, change the dropdown box to “Custom”, and set the comparison date to the period of days just before the major traffic drop.
Above, I’ve set the chart to compare traffic 3 days before the algorithm drop to 3 days after. This custom report will show me the change in traffic between those two time frames. What we’re looking for are the largest negative values for % Change.
The problem is that if you have a large website, you’re going to have hundreds of pages or more that turn up in these results, and the results aren’t sorted by % Change. So, you need an easy way to sort the results by the change in traffic and find your biggest problem pages. To start this process, change the number of results (at the bottom of the results table) from 10 to something like 1000. Then export the Google Analytics data to a CSV report.
Calculate And Find The Largest % Change In Visits
The CSV file lets you use Excel (or any spreadsheet) for your data analysis. The exported data arrives as two time ranges for each landing page, along with the traffic figures for each range.
What you want to do is create an extra column, name it “Change”, and subtract the earlier period’s traffic from the later period’s. A drop in traffic will show up as a negative number.
Copy this same calculation down through your sheet. Once you have all the values, you may want to delete the rows with the smaller numbers, just to clean things up and make your analysis easier. Then, when you’re ready, use Excel’s Sort feature to sort by the Change column, and make sure to sort from smallest to largest value.
That’s all there is to it. Now you have a listing of the landing pages on your website that received the biggest drop in organic traffic due to the algorithm update. This will give you a clear indication of what Google felt were your biggest problem pages — and likely the pages that lost the most rank position in search results.
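If you’d rather skip the spreadsheet, the same calculation is easy to script. Here’s a minimal Python sketch using only the standard library; the column names (“Landing Page”, “Visits (before)”, “Visits (after)”) are placeholders you’d adjust to match whatever headers your actual Analytics export contains:

```python
import csv

def biggest_losers(csv_path, top_n=50):
    """Rank landing pages by their raw drop in organic visits.

    Assumes the exported CSV has columns named 'Landing Page',
    'Visits (before)' and 'Visits (after)' -- rename these to match
    your real export before running.
    """
    rows = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            before = int(row["Visits (before)"])
            after = int(row["Visits (after)"])
            change = after - before  # negative means traffic lost
            pct = (change / before * 100) if before else 0.0
            rows.append((row["Landing Page"], before, after, change, pct))
    # Smallest (most negative) change first, like sorting the
    # Change column in Excel from smallest to largest
    rows.sort(key=lambda r: r[3])
    return rows[:top_n]
```

The pages at the top of the returned list are the same “biggest problem pages” the Excel sort surfaces.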
Fix Problems and Recover Your Traffic!
Now that you have the 50 to 100 (or more) problem pages on your site, your next task is to open up those pages and give them a good, hard look. Are they useful to your readers? Are they well written, well formatted, and something a visitor would want to share or bookmark?
Some simple things to check: is the primary keyword from the title used too many times throughout the article? Don’t forget to view the page source and check how many times the keywords show up in the code. You may be surprised to discover you’ve used a word over 100 times in a short article, something that will surely get you in trouble. Also check that the links in the article go only to credible, authoritative, and useful sites. Make sure your images are clear and make sense, and that you haven’t made any other common SEO mistakes.
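Counting keyword occurrences by eyeballing the page source gets tedious fast. Here’s a small Python sketch of an automated count; it does a case-insensitive, whole-word tally over the raw HTML (tags and attributes included, since crawlers see those too):

```python
import re
from collections import Counter

def keyword_counts(html, keywords):
    """Count case-insensitive, whole-word occurrences of each keyword
    in a page's raw source. `keywords` is a list of single words.
    """
    # Split the lowercased source into word-like tokens
    words = re.findall(r"[a-z0-9']+", html.lower())
    counts = Counter(words)
    return {kw: counts[kw.lower()] for kw in keywords}

keyword_counts(
    "<title>Best Widgets</title><p>widgets and more widgets</p>",
    ["widgets"],
)
# → {'widgets': 3}
```

If a keyword’s count is wildly out of proportion to the article’s length, that page is a good candidate for a rewrite.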
The truth is that Google’s rules are fluid and change often. Things you did years ago to comply with what Google wanted may no longer apply, and may actually get you in trouble, so going back and fixing those pages can help you stay in the good graces of the Google Gods.