Guides to determine if a client's website has been penalized?
-
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized?
I'm also not talking about an obvious drop in traffic/rankings, but I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data. One that also helps you take into account all the different variables that may not be related to the engines.
Thanks!
-
Good point about the Change History - at least that will catch things like new filters.
Understood about the external doc being easier for some clients not used to working in Analytics. When going that route, I at least try to get them working in a shared document - either in Google Docs or at least a shared doc in Dropbox or something.
That way there are fewer issues with trying to figure out who has the most current version, and it's available to you when you need it - like when you are doing a normal monthly review but want to quickly check what you think might be an anomaly - without having to make a request for the doc.
P.
-
Thanks Paul. Maybe now that Analytics will be showing change history that will also save time too.
I like your idea about an external document. That seems more approachable than having our clients who are not so comfortable make their annotations in Google Analytics.
-
Glad it was helpful, Ellen, and thanks for the kind words.
And you've hit exactly the challenge so many of us encounter with clients - we're just not kept in the loop on all the things they're doing (and this critically includes changes to site code and analytics, as I mentioned, not just marketing efforts). The only effective way I've found is to make the client responsible for enforcing that their devs and marketers keep the annotations (or if necessary some external file, blech) up to date.
Otherwise you can easily waste hours chasing an issue, only to later find out someone goofed up changing a filter in analytics, for example. Not that that has ever happened to me...
That's how I pitch this to clients - either work out a way to make certain it's kept up to date on an ongoing basis, or risk having to pay me for many hours of extra work every time I'm asked to try to track down an issue or to assess overall traffic health. Or maybe even waste big money with a wrong diagnosis because not enough info was provided.
More and more, effective analysis needs to take into account many more cross-channel aspects. The only way to do this effectively is to have those cross-channel efforts recorded in some way right in the Analytics.
Good luck!
Paul
-
It's definitely not a helluva lot to throw at me, but instead is exactly the sort of thing I was looking for! Thank you so much Paul for such a thorough answer. This is definitely the direction I've been moving toward.
The most difficult thing with my clients is there are so many hands in the pot, and they often don't let us know what's going on with their external marketing efforts. Additionally, some of our clients have multiple marketing agencies making changes to their websites, and there are multiple admins on the Analytics.
Holding the client responsible for the updates is a great idea, and it would just take a lot more prodding on our end.
This discussion was spurred by a client that saw a major decline in non-paid search traffic (Google only) over the last quarter of 2012. There were no penalties in GWT, so this leads me to believe, as you said, it was an algo update. I'm going to use your tips to try and further isolate the affected areas.
I really appreciate the time you took to answer my question. Thanks again.
-
Ellen, there are just far too many reasons why a site's traffic might fluctuate for any guide to be able to account for them all and then detect the ones that are "harmful".
This is why SEO is referred to as both art and science, unfortunately.
First - to be clear... If a site has actually been penalized by the search engines, they will send you a notification through Bing and Google Webmaster Tools. So you must be certain that both of those tools have an up-to-date, monitored email address in their notification settings.
Once you've discounted actual penalties, you're left with fluctuations due to changes in the algorithm. These are not "penalties" in the search engines' eyes, just corrections to the ranking algorithm that happen to affect you.
The best method I've found for spotting these is a combination of segmenting data, keeping accurate records about your own site marketing activities, and monitoring the dates of announced algorithm changes.
The idea here is to try to eliminate as many variables as possible for why traffic might have changed, making it easier to spot changes attributable specifically to algorithm changes. That would then point you to tactics you might need to use to mitigate the effect of the algo change.
In order to do this, you'll need to do the following:
Track Marketing Efforts & Site Changes
Keep records about your site marketing and structural/coding changes. Any time you do marketing that could affect site traffic, enter the date and info as an Annotation in your Google Analytics. This includes on- and offline things like launching magazine or radio advertising, adding new banner or PPC ads, getting coverage in the media, etc. - anything that could conceivably be causing more people to become aware of you and search for your site. (Remember, just because your ad gives your website address doesn't mean people will remember it. Many will remember your company name or service and will Google it later.)
Also keep track of ANY changes made to your website structure - changes in code, robots.txt, .htaccess, canonicals, Analytics configuration, etc.
Track Announced Algorithm Changes
Use this page http://www.seomoz.org/google-algorithm-change to continually add dates and info about algo updates into your site's Annotations.
Segment Data
Ensure you're only looking at organic search data.
This may seem obvious, but a lot of people miss it. Algorithmic changes are only going to affect your organic search data. So you must ensure you are only looking at non-paid search traffic in your analysis. Fortunately, there's a pre-built Advanced segment for that in Google Analytics. If you're trying to track Google changes specifically, you can further segment to show only Google traffic (i.e. exclude Bing and Yahoo.)
Bonus Tip - Non-Sampled Data Only
Make sure you've asked Analytics to show your reports using as little data-sampling as possible. This will make the data vastly more accurate, although the reports will be a little slower. Definitely a worthwhile tradeoff.
Here's my method:
- pick a date range you're concerned about - the shorter the range, the easier it is to spot anomalies in the graph. Six to eight weeks at a time may be best. Or pick a short range from before and after a date you think you encountered a problem
- segment your data specifically to Google unpaid search, and select as little data-sampling as possible
- look at the traffic line in your graph. Anywhere you see an unexpected drop in traffic (allowing for weekly fluctuations) look for an annotation below that date that might explain it.
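If you wanted to automate that eyeball check, the logic might look something like this rough Python sketch. To be clear, the session counts and annotation entries here are entirely made up for illustration - in practice you'd export your real daily Google organic sessions and your own Annotation dates from Analytics:

```python
from datetime import date, timedelta

# Hypothetical daily Google non-paid sessions for Jan 1-28 (invented numbers).
sessions = {date(2013, 1, 1) + timedelta(days=i): n for i, n in enumerate([
    400, 410, 395, 405, 390, 300, 310,   # week 1 (lower weekend traffic)
    415, 420, 400, 410, 395, 305, 315,   # week 2
    600, 450, 405, 410, 400, 310, 320,   # week 3 (spike on Jan 15)
    280, 270, 265, 275, 260, 190, 195,   # week 4 (drop from Jan 22)
])}
# Hypothetical Annotations you'd have entered in Analytics.
annotations = {
    date(2013, 1, 15): "Mentioned in the newspaper",
    date(2013, 1, 22): "Panda update announced",
}

# Compare each day to the same weekday one week earlier, so the normal
# weekly ebb and flow doesn't trigger false alarms.
alerts = []
for day, count in sorted(sessions.items()):
    prior = sessions.get(day - timedelta(days=7))
    if prior and (count - prior) / prior < -0.25:  # >25% week-over-week drop
        # Look for an annotation on that day or the day before.
        note = annotations.get(day) or annotations.get(day - timedelta(days=1))
        alerts.append((day, note))

for day, note in alerts:
    print(f"{day}: unexpected drop -- {note or 'no annotation found'}")
```

The 25% threshold and the "same weekday last week" comparison are just one reasonable choice - the point is simply to filter out the weekly rhythm before you start blaming an algo update.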
So in practice the process might look like this. I'm worried about a traffic drop toward the end of January. I select a date range in Analytics of Jan 1 to Feb 15, and I segment my data to show just Google non-paid traffic for that range. I hit the little checkerboard icon in the top right under the date range and move the slider for Highest Precision.
After doing this, I look at the general pattern of organic traffic to the site. There seems to be the usual ebb and flow of lower weekend traffic, with a small spike of traffic on Jan 15th. When I look just below that date, I notice that I've entered an Annotation. (It'll show up as a tiny clickable comment bubble.) When I read the Annotation I created, it tells me we got a mention in the newspaper that day. So I now remember where that spike came from. The traffic then pretty much settles back to normal a day or two later, as expected.
Then I notice an unusual drop in traffic around January 23. When I again check for Annotations I've created, I realize there was a Panda update on Jan 22. Since there was no other marketing activity mentioned around that date (like a radio ad ending, for example) I can be pretty sure the sole cause of that drop was the Panda change. And since I know Panda is mostly about devaluing thin or low-value content, I now have somewhere to start looking.
I would then start looking at the non-paid search traffic from specific keywords to see if any group of keywords suffered most heavily. If I can find a pattern to the search terms that dropped, I know I've found the topic area of my site where I need to build some better content to help recover the traffic.
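That keyword comparison is easy to do in a spreadsheet, or in a few lines of Python like the sketch below. The keyword names and visit counts are invented purely for illustration - you'd export your own before/after organic keyword data from Analytics:

```python
# Hypothetical Google non-paid visits per keyword, for equal-length
# periods before and after the suspected algorithm-update date.
before = {"blue widgets": 900, "widget reviews": 850, "buy widgets": 400, "widget blog": 60}
after  = {"blue widgets": 880, "widget reviews": 250, "buy widgets": 390, "widget blog": 15}

# Rank keywords by percentage change, worst losses first, to surface the
# topic areas that were hit hardest.
losses = sorted(
    ((kw, (after[kw] - n) / n) for kw, n in before.items()),
    key=lambda item: item[1],
)
for kw, change in losses:
    print(f"{kw}: {change:+.0%}")
```

If the hardest-hit terms cluster around one topic, that's the section of the site where you'd start improving content.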
The reason I need to be tracking the marketing efforts as well as the algo updates is that I don't want to misattribute the problem (correlation instead of causation). For example, if I had a major marketing campaign wrap up on the 20th or 21st of January, that could very well have accounted for the traffic drop, and the Panda update was merely a coincidence.
I wouldn't want to have wasted a whole lot of time chasing the Panda problem when in fact it was a normal drop due to the end of the marketing campaign. But I would have missed that if I hadn't been tracking the marketing. (For client sites, you'll need to make the client responsible for keeping the marketing Annotations up to date.)
As you can see, this isn't easy, and it takes laying some groundwork, but it goes a long way to helping you figure out where to focus when you start trying to figure out whether you've been affected by an algo update, and therefore where to spend your energy on fixes.
I know this is a helluva lot to throw at you, but the question you asked doesn't have an easy answer and I didn't want to shortchange you with an overly simplistic response. Be sure to ask followup questions about the stuff I haven't explained clearly enough.
Hope that helps.
Paul
-
Hi,
I am not sure there is a guide that can tell you exactly why your ranking went down. There are many factors, as you mentioned, that can cause this, such as a competitor doing better SEO, getting more quality links, etc. It would be nice to have a guide to learn about what is happening. I may be wrong about this. Hope someone can shed light on this.
-
Thanks TommyTan.
I am definitely referring to the search engines.
I do use Google Webmaster Tools, and haven't seen any email notifications from GWT regarding any spammy link profiles, etc.
I'm more concerned with finding out if it's just normal fluctuation in keyword ranking and traffic, or if there is something else going on.
Sometimes there are so many factors that could all be playing a role outside of any penalties that it would be great to find a guide to help you diagnose whether it's just seasonal traffic, keyword rank fluctuation, or something more serious.
-
Hi,
I may be way off the chart here, but when we talk about websites being penalized, I believe it is mostly related to the search engines. I am not sure what would penalize a site that is not related to the engines. The best tool that connects to a user's website is Webmaster Tools. If anything is detected, such as unnatural linking or duplicate HTML tags, the webmaster will be notified and a notice will also be available in Google Webmaster Tools.
Hope this helps =]