Guides to determine if a client's website has been penalized?
-
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized?
I'm also not talking about an obvious drop in traffic/rankings, but I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data. One that also helps you take into account all the different variables that may not be related to the engines.
Thanks!
-
Good point about the Change History - at least that will catch things like new filters.
Understood about the external doc being easier for some clients not used to working in Analytics. When going that route, I at least try to get them working in a shared document - either in Google Docs or at least a shared file in Dropbox or something.
That way there are fewer issues with trying to figure out who has the most current version, and it's available to you when you need it - like when you are doing a normal monthly review but want to quickly check what you think might be an anomaly - without having to make a request for the doc.
P.
-
Thanks Paul. Maybe now that Analytics will be showing change history, that will save some time too.
I like your idea about an external document. That seems more approachable than asking clients who aren't so comfortable in Google Analytics to make their annotations there.
-
Glad it was helpful, Ellen, and thanks for the kind words.
And you've hit exactly the challenge so many of us encounter with clients - we're just not kept in the loop on all the things they're doing (and this critically includes changes to site code and analytics, as I mentioned, not just marketing efforts). The only effective way I've found is to make the client responsible for enforcing that their devs and marketers keep the annotations (or if necessary some external file, blech) up to date.
Otherwise you can easily waste hours chasing an issue, only to later find out someone goofed up changing a filter in analytics, for example. Not that that has ever happened to me...
That's how I pitch this to clients - either work out a way to make certain it's kept up to date on an ongoing basis, or risk having to pay me for many hours of extra work every time I'm asked to try to track down an issue or to assess overall traffic health. Or maybe even waste big money with a wrong diagnosis because not enough info was provided.
More and more, effective analysis needs to take into account many more cross-channel aspects. The only way to do this effectively is to have those cross-channel efforts recorded in some way right in the Analytics.
Good luck!
Paul
-
It's definitely not a helluva lot to throw at me, but instead is exactly the sort of thing I was looking for! Thank you so much Paul for such a thorough answer. This is definitely the direction I've been moving toward.
The most difficult thing with my clients is there are so many hands in the pot, and they often don't let us know what's going on with their external marketing efforts. Additionally, some of our clients have multiple marketing agencies making changes to their websites, and there are multiple admins on the Analytics.
Holding the client responsible for the updates is a great idea, and it would just take a lot more prodding on our end.
This discussion was spurred by a client that saw a major decline in non-paid search traffic (Google only) over the last quarter of 2012. There were no penalties in GWT, so this leads me to believe, as you said, that it was an algo update. I'm going to use your tips to try and further isolate the affected areas.
I really appreciate the time you took to answer my question. Thanks again.
-
Ellen, there are just far too many reasons why a site's traffic might fluctuate for any single guide to account for them all and then detect the ones that are "harmful".
This is why SEO is referred to as both art and science, unfortunately.
First - to be clear... If a site has actually been penalized by the search engines, they will send you a notification through Bing and Google Webmaster Tools. So you must be certain that both those tools have an up-to-date, monitored email address in their notification settings.
Once you've discounted actual penalties, you're left with fluctuations due to changes in the algorithm. These are not "penalties" in the search engines' eyes, just corrections to the ranking algorithm that happen to affect you.
The best method I've found for spotting these is a combination of segmenting data, keeping accurate records about your own site marketing activities, and monitoring the dates of announced algorithm changes.
The idea here is to try to eliminate as many variables as possible for why traffic might have changed, making it easier to spot changes attributable specifically to algorithm changes. That would then point you to tactics you might need to use to mitigate the effect of the algo change.
In order to do this, you'll need to do the following:
Track Marketing Efforts & Site Changes
Keep records of your site marketing and of structural and coding changes. Anytime you do marketing that could affect site traffic, enter the date and info as an Annotation in your Google Analytics. This includes online and offline things like launching magazine or radio advertising, adding new banner or PPC ads, getting coverage in the media, etc. - anything that could conceivably cause more people to become aware of you and search for your site. (Remember, just because your ad gives your website address doesn't mean people will remember it. Many will remember your company name or service and Google it later.)
Also keep track of ANY changes made to your website structure - changes in code, robots.txt, .htaccess, canonicals, Analytics configuration, etc.
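If the client just won't work in Analytics Annotations, even a dead-simple shared change log is better than nothing. Here's a minimal sketch of what I mean - the file name, columns, and categories are purely hypothetical, so adapt them to whatever your team will actually keep up to date:

```python
# Minimal sketch of a shared change log. The file name, columns, and categories
# are hypothetical - use whatever your team will actually maintain.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("site-change-log.csv")  # e.g. a file living in a shared Dropbox folder

def log_change(category, description, author):
    """Append one dated entry; category might be 'marketing', 'code', or 'analytics-config'."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "category", "description", "author"])
        writer.writerow([date.today().isoformat(), category, description, author])

log_change("analytics-config", "Added filter excluding office IP range", "dev team")
log_change("marketing", "Radio campaign ended", "agency X")
```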
Track Announced Algorithm Changes
Use this page http://www.seomoz.org/google-algorithm-change to constantly add dates and info about algo updates into your site's Annotations.
Segment Data
Ensure you're only looking at organic search data.
This may seem obvious, but a lot of people miss it. Algorithmic changes are only going to affect your organic search data. So you must ensure you are only looking at non-paid search traffic in your analysis. Fortunately, there's a pre-built Advanced segment for that in Google Analytics. If you're trying to track Google changes specifically, you can further segment to show only Google traffic (i.e. exclude Bing and Yahoo.)
Bonus Tip - Non-Sampled Data Only
Make sure you've asked Analytics to show your reports using as little data-sampling as possible. This will make the data vastly more accurate, although the reports will be a little slower. Definitely a worthwhile tradeoff.
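If you'd rather pull that segmented, minimally-sampled data programmatically instead of clicking through the interface, here's a rough sketch using the v3 Core Reporting API. It assumes you've already set up API credentials, it uses a placeholder view ID, and the API details change over time, so double-check the parameter names against the current docs before relying on it:

```python
# Rough sketch: pull daily Google non-paid (organic) sessions with minimal sampling
# via the v3 Core Reporting API. The credentials file and view ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "analytics-key.json",  # hypothetical path to your service-account key
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=creds)

response = analytics.data().ga().get(
    ids="ga:12345678",                                # your view (profile) ID
    start_date="2013-01-01",
    end_date="2013-02-15",
    metrics="ga:sessions",
    dimensions="ga:date",
    filters="ga:medium==organic;ga:source==google",   # Google non-paid traffic only
    samplingLevel="HIGHER_PRECISION",                 # request as little sampling as possible
).execute()

for row in response.get("rows", []):
    print(row[0], row[1])  # date, sessions
```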
Here's my method:
- pick a date range you're concerned about - the shorter the range, the easier it is to spot anomalies in the graph. Six to eight weeks at a time may be best. Or pick a short range from before and after a date you think you encountered a problem.
- segment your data specifically to Google unpaid search, and select as little data-sampling as possible
- look at the traffic line in your graph. Anywhere you see an unexpected drop in traffic (allowing for weekly fluctuations), look for an annotation below that date that might explain it. (If you prefer to script that check, there's a rough sketch just below.)
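Here's a minimal sketch of that drop-spotting step done in code, assuming you've exported daily Google non-paid traffic and your change/annotation log to CSV files (the file names, column names, and thresholds are hypothetical):

```python
# Rough sketch: flag days where Google non-paid traffic drops well below the average
# of the same weekday over the previous four weeks, then list any logged changes or
# algo updates near each flagged date. File names, columns, and the 30% threshold
# are hypothetical - tune them to your own data.
import pandas as pd

traffic = pd.read_csv("google-organic-daily.csv", parse_dates=["date"])  # columns: date, sessions
changes = pd.read_csv("site-change-log.csv", parse_dates=["date"])       # columns: date, category, description

traffic = traffic.sort_values("date").set_index("date")
sessions = traffic["sessions"]

# Average of the same weekday over the previous four weeks, to allow for weekly ebb and flow.
baseline = (sessions.shift(7) + sessions.shift(14) + sessions.shift(21) + sessions.shift(28)) / 4
drops = sessions[sessions < 0.7 * baseline]  # more than 30% below the weekday baseline

for day in drops.index:
    nearby = changes[(changes["date"] >= day - pd.Timedelta(days=3)) & (changes["date"] <= day)]
    print(day.date(), "- unexpected drop; entries logged in the preceding few days:")
    if nearby.empty:
        print("   (nothing logged - dig deeper)")
    else:
        for _, entry in nearby.iterrows():
            print("  ", entry["date"].date(), entry["category"], "-", entry["description"])
```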
So in practice, the process might look like this: I'm worried about a traffic drop toward the end of January. I select a date range in Analytics of Jan 1 to Feb 15, and I segment my data to show just Google non-paid traffic for that range. I hit the little checkerboard icon in the top right under the date range and move the slider to Highest Precision.
After doing this, I look at the general pattern of organic traffic to the site. There seems to be the usual ebb and flow of lower weekend traffic, with a small spike of traffic on Jan 15th. When I look just below that date, I notice that I've entered an Annotation. (It'll show up as a tiny clickable comment bubble.) When I read the Annotation I created, it tells me we got a mention in the newspaper that day. So I now remember where that spike came from. The traffic then pretty much settles back to normal a day or two later, as expected.
Then I notice an unusual drop in traffic around January 23. When I again check for Annotations I've created, I realize there was a Panda update on Jan 22. Since there was no other marketing activity mentioned around that date (like a radio ad ending, for example) I can be pretty sure the sole cause of that drop was the Panda change. And since I know Panda is mostly about devaluing thin or low-value content, I now have somewhere to start looking.
I would then start looking at the non-paid search traffic from specific keywords to see if any group of keywords suffered most heavily. If I can find a pattern to the search terms that dropped, I know I've found the topic area of my site where I need to build some better content to help recover the traffic.
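If you want to run that keyword comparison outside the Analytics interface, here's a minimal sketch, assuming you've exported Google non-paid keyword data for equal-length "before" and "after" windows to CSV (again, the file and column names are hypothetical):

```python
# Rough sketch: compare Google non-paid sessions per keyword before vs. after a
# suspected algo-update date and list the keywords that lost the most traffic.
# File and column names are hypothetical - adjust to match your own exports.
import pandas as pd

before = pd.read_csv("keywords-before.csv")  # columns: keyword, sessions (e.g. Jan 8-21)
after = pd.read_csv("keywords-after.csv")    # same columns for an equal-length window (e.g. Jan 22 - Feb 4)

merged = before.merge(after, on="keyword", how="outer", suffixes=("_before", "_after")).fillna(0)
merged["change"] = merged["sessions_after"] - merged["sessions_before"]

# Biggest losers first - look for a topical pattern in these terms.
losers = merged.sort_values("change").head(25)
print(losers[["keyword", "sessions_before", "sessions_after", "change"]].to_string(index=False))
```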
The reason I need to be tracking the marketing efforts as well as the algo updates is that I don't want to mis-attribute the problem (correlation instead of causation). For example, if I had a major marketing campaign wrap up on the 20th or 21st of January, that could very well have accounted for the traffic drop, and the Panda update was merely a coincidence.
I wouldn't want to have wasted a whole lot of time chasing the Panda problem when in fact it was a normal drop due to the end of the marketing campaign. But I would have missed that if I hadn't been tracking the marketing. (For clients' sites, you'll need to make the client responsible for keeping the marketing Annotations up to date.)
As you can see, this isn't easy, and it takes laying some groundwork, but it goes a long way toward helping you figure out whether you've been affected by an algo update, and therefore where to spend your energy on fixes.
I know this is a helluva lot to throw at you, but the question you asked doesn't have an easy answer, and I didn't want to shortchange you with an overly simplistic response. Be sure to ask follow-up questions about the stuff I haven't explained clearly enough.
Hope that helps.
Paul
-
Hi,
I am not sure there is a guide that can tell you exactly why your ranking went down. There are many factors, as you mentioned, that can cause this, such as a competitor doing better SEO, getting more quality links, etc. It would be nice to have a guide to learn about what is happening. I may be wrong about this - hope someone can shed more light on it.
-
Thanks TommyTan.
I am definitely referring to the search engines.
I do use Google Webmaster Tools, and haven't seen any email notifications from GWT regarding any spammy link profiles, etc.
I'm more concerned with finding out if it's just normal fluctuation in keyword ranking and traffic, or if there is something else going on.
Sometimes there are so many factors outside of any penalties that could all be playing a role; it would be great to find a guide to help you diagnose whether it's just seasonal traffic, keyword rank fluctuation, or something more serious.
-
Hi,
I may be way off the mark here, but when we talk about websites being penalized, I believe it is mostly related to the search engines. I am not sure what would be penalized that is not related to the engines. The best tool that connects to a user's website is Webmaster Tools. If anything is detected, such as unnatural linking or duplicate HTML tags, the webmaster will be notified and a notice will also be available in Google Webmaster Tools.
Hope this helps =]