Guides to determine if a client's website has been penalized?
-
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized?
I'm not talking about an obvious drop in traffic/rankings; I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data - one that also helps you take into account all the different variables that may not be related to the engines.
Thanks!
-
Good point about the Change History - at least that will catch things like new filters.
Understood about the external doc being easier for some clients not used to working in Analytics. When going that route, I at least try to get them working in a shared document - either in Google Docs or at least a shared doc in Dropbox or something.
That way there are fewer issues with trying to figure out who has the most current version, and it's available to you when you need it - like when you are doing a normal monthly review but want to quickly check what you think might be an anomaly - without having to make a request for the doc.
P.
-
Thanks Paul. Maybe now that Analytics will be showing change history, that will save some time too.
I like your idea about an external document. That seems more approachable than having clients who aren't so comfortable in Google Analytics make their annotations there.
-
Glad it was helpful, Ellen, and thanks for the kind words.
And you've hit exactly the challenge so many of us encounter with clients - we're just not kept in the loop on all the things they're doing (and that critically includes changes to site code and analytics, as I mentioned, not just marketing efforts). The only effective way I've found is to make the client responsible for ensuring that their devs and marketers keep the annotations (or, if necessary, some external file, blech) up to date.
Otherwise you can easily waste hours chasing an issue, only to later find out someone goofed up changing a filter in Analytics, for example. Not that that has ever happened to me...
That's how I pitch this to clients - either work out a way to make certain it's kept up to date on an ongoing basis, or risk having to pay me for many hours of extra work every time I'm asked to try to track down an issue or to assess overall traffic health. Or maybe even waste big money with a wrong diagnosis because not enough info was provided.
More and more, effective analysis needs to take into account many more cross-channel aspects. The only way to do this effectively is to have those cross-channel efforts recorded in some way right in the Analytics.
Good luck!
Paul
-
It's definitely not a helluva lot to throw at me, but instead is exactly the sort of thing I was looking for! Thank you so much Paul for such a thorough answer. This is definitely the direction I've been moving toward.
The most difficult thing with my clients is there are so many hands in the pot, and they often don't let us know what's going on with their external marketing efforts. Additionally, some of our clients have multiple marketing agencies making changes to their websites, and there are multiple admins on the Analytics.
Holding the client responsible for the updates is a great idea, and it would just take a lot more prodding on our end.
This discussion was spurred by a client that saw a major decline in non-paid search traffic (Google only) over the last quarter of 2012. There were no penalties in GWT, so this leads me to believe, as you said, that it was an algo update. I'm going to use your tips to try to further isolate the affected areas.
I really appreciate the time you took to answer my question. Thanks again.
-
Ellen, there are just far too many reasons why a site's traffic might fluctuate for any guide to ever allow for them all, let alone detect the ones that are "harmful".
This is why SEO is referred to as both art and science, unfortunately.
First - to be clear... If a site has actually been penalized by the search engines, they will send you a notification through Bing and Google Webmaster Tools. So you must be certain that both those tools have an up-to-date, monitored email address in their notification settings.
Once you've discounted actual penalties, you're left with fluctuations due to changes in the algorithm. These are not "penalties" in the search engines' eyes, just corrections to the ranking algorithm that happen to affect you.
The best method I've found for spotting these is a combination of segmenting data, keeping accurate records about your own site marketing activities, and monitoring the dates of announced algorithm changes.
The idea here is to try to eliminate as many variables as possible for why traffic might have changed, making it easier to spot changes attributable specifically to algorithm changes - which would then point you to tactics you might need to use to mitigate the effect of the algo change.
In order to do this, you'll need to do the following:
Track Marketing Efforts & Site Changes
Keep records about your site marketing and structural/coding changes. Anytime you do marketing that could affect site traffic, enter the date and info as an Annotation in your Google Analytics. This includes on- and offline things like launching magazine or radio advertising, adding new banner or PPC ads, getting coverage in the media, etc. - anything that could conceivably be causing more people to become aware of you and search for your site. (Remember, just because your ad gives your website address doesn't mean people will remember it. Many will remember your company name or service and will Google it later.)
Also keep track of ANY changes made to your website structure - changes in code, robots.txt, .htaccess, canonicals, Analytics configuration, etc.
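For what it's worth, if the client won't work in Analytics Annotations at all, even the shared external doc can be a simple dated log that you can later line up against traffic data. Here's a minimal sketch in Python - the file name, columns, and sample entries are purely illustrative assumptions, not any kind of standard:

```python
import csv
import os
from datetime import date

# Hypothetical change-log format: one row per marketing or site change.
LOG_FILE = "site_change_log.csv"
FIELDS = ["date", "type", "description"]  # type: marketing | site-change | algo-update

def log_change(change_date: date, change_type: str, description: str) -> None:
    """Append one dated entry to the shared change log."""
    write_header = not os.path.exists(LOG_FILE) or os.path.getsize(LOG_FILE) == 0
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "date": change_date.isoformat(),
            "type": change_type,
            "description": description,
        })

# Example entries (made up for illustration)
log_change(date(2013, 1, 15), "marketing", "Newspaper coverage of product launch")
log_change(date(2013, 1, 22), "algo-update", "Google Panda refresh (per Moz change log)")
log_change(date(2013, 2, 1), "site-change", "robots.txt updated to block /staging/")
```

The point isn't the format - it's that every entry has a date, so it can be cross-referenced against a traffic graph later.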
Track Announced Algorithm Changes
Use this page http://www.seomoz.org/google-algorithm-change to constantly add dates and info about algo updates into your site's Annotations.
Segment Data
Ensure you're only looking at organic search data.
This may seem obvious, but a lot of people miss it. Algorithmic changes are only going to affect your organic search data. So you must ensure you are only looking at non-paid search traffic in your analysis. Fortunately, there's a pre-built Advanced segment for that in Google Analytics. If you're trying to track Google changes specifically, you can further segment to show only Google traffic (i.e. exclude Bing and Yahoo.)
Bonus tip - non-sampled data only
Make sure you've asked Analytics to show your reports using as little data-sampling as possible. This will make the data vastly more accurate, although the reports will be a little slower. Definitely a worthwhile tradeoff.
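As a side note, if you or the client can export the raw traffic data to CSV, you can reproduce that same segment outside the interface, which makes it easier to compare date ranges programmatically. A quick sketch in Python - the file name and the date/source/medium/visits columns are assumptions about your export, not a fixed Analytics format:

```python
import pandas as pd

# Assumed export format: one row per day per source/medium.
# Adjust the file and column names to your actual export.
traffic = pd.read_csv("traffic_export.csv", parse_dates=["date"])

# The equivalent of the "Non-paid Search Traffic" segment, narrowed to Google.
google_organic = traffic[
    (traffic["source"].str.lower() == "google")
    & (traffic["medium"].str.lower() == "organic")
]

# One row per day of Google organic visits, ready to graph or scan for drops.
daily = google_organic.groupby("date")["visits"].sum().sort_index()
print(daily.tail())
```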
Here's my method:
- pick a date range you're concerned about - the shorter the range, the easier it is to spot anomalies in the graph. Six to eight weeks at a time may be best, or pick a short range from before and after a date you think you encountered a problem
- segment your data specifically to Google unpaid search, and select as little data-sampling as possible
- look at the traffic line in your graph. Anywhere you see an unexpected drop in traffic (allowing for weekly fluctuations), look for an annotation below that date that might explain it. (A rough scripted version of this check is sketched below.)
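For anyone comfortable scripting, here's that last step as a rough sketch, using the illustrative traffic export and change-log CSVs from the earlier snippets. It flags days that fall well below a trailing baseline and prints any logged change nearby - the 0.7 threshold, the 3-day lookback, and all file/column names are assumptions, and it's no substitute for actually looking at the graph:

```python
import pandas as pd

# Illustrative inputs from the earlier sketches.
traffic = pd.read_csv("traffic_export.csv", parse_dates=["date"])
changes = pd.read_csv("site_change_log.csv", parse_dates=["date"])

google_organic = traffic[(traffic["source"].str.lower() == "google")
                         & (traffic["medium"].str.lower() == "organic")]
daily = google_organic.groupby("date")["visits"].sum().sort_index()

# Baseline: trailing 4-week average, shifted so a day never includes itself.
baseline = daily.rolling(window=28, min_periods=14).mean().shift(1)
drop_ratio = daily / baseline

# Shortlist days well below baseline; normal weekend dips may still show up,
# so treat this as a list of dates to eyeball, not a verdict.
suspect_days = drop_ratio[drop_ratio < 0.7]

for day in suspect_days.index:
    nearby = changes[(changes["date"] >= day - pd.Timedelta(days=3))
                     & (changes["date"] <= day)]
    notes = "; ".join(nearby["description"]) if not nearby.empty else "no logged change nearby"
    print(f"{day.date()}: {drop_ratio[day]:.0%} of baseline - {notes}")
```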
So in practice, the process might look like this: I'm worried about a traffic drop toward the end of January. I select a date range in Analytics of Jan 1 to Feb 15, and I segment my data to show just Google non-paid traffic for that range. I hit the little checkerboard icon in the top right under the date range and move the slider to Highest Precision.
After doing this, I look at the general pattern of organic traffic to the site. There seems to be the usual ebb and flow of lower weekend traffic, with a small spike of traffic on Jan 15th. When I look just below that date, I notice that I've entered an Annotation. (It'll show up as a tiny clickable comment bubble.) When I read the Annotation I created, it tells me we got a mention in the newspaper that day. So I now remember where that spike came from. The traffic then pretty much settles back to normal a day or two later, as expected.
Then I notice an unusual drop in traffic around January 23. When I again check for Annotations I've created, I realize there was a Panda update on Jan 22. Since there was no other marketing activity mentioned around that date (like a radio ad ending, for example) I can be pretty sure the sole cause of that drop was the Panda change. And since I know Panda is mostly about devaluing thin or low-value content, I now have somewhere to start looking.
I would then start looking at the non-paid search traffic from specific keywords to see if any group of keywords suffered most heavily. If I can find a pattern to the search terms that dropped, I know I've found the topic area of my site where I need to build some better content to help recover the traffic.
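If you export keyword-level data, a rough script can surface the terms that dropped hardest around the suspected date. Again, just a sketch - the export file, its date/keyword/visits columns, and the example date are assumptions:

```python
import pandas as pd

# Assumed export: one row per day per keyword.
kw = pd.read_csv("keyword_export.csv", parse_dates=["date"])

algo_date = pd.Timestamp("2013-01-22")  # the suspected algo-update date

before = kw[kw["date"] < algo_date].groupby("keyword")["visits"].mean()
after = kw[kw["date"] >= algo_date].groupby("keyword")["visits"].mean()

before = before[before > 0]                        # ignore terms with no prior traffic
after = after.reindex(before.index, fill_value=0)  # terms that vanished count as a 100% drop

change = (after - before) / before
# The steepest declines come first - look for a common topic among these terms.
print(change.sort_values().head(20))
```

If the biggest losers cluster around one topic area, that's usually the section of the site to examine first.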
The reason I need to be tracking the marketing efforts as well as the algo updates is that I don't want to misattribute the problem (correlation instead of causality). For example, if I had a major marketing campaign wrap up on the 20th or 21st of January, that could very well have accounted for the traffic drop, and the Panda update was merely a coincidence.
I wouldn't want to have wasted a whole lot of time chasing the Panda problem when in fact it was a normal drop due to the marketing campaign ending. But I would have missed that if I hadn't been tracking the marketing. (For clients' sites, you'll need to make the client responsible for keeping the marketing Annotations up to date.)
As you can see, this isn't easy, and it takes laying some groundwork, but it goes a long way toward helping you figure out where to focus when you start trying to determine whether you've been affected by an algo update, and therefore where to spend your energy on fixes.
I know this is a helluva lot to throw at you, but the question you asked doesn't have an easy answer and I didn't want to shortchange you with an overly simplistic response. Be sure to ask follow-up questions about anything I haven't explained clearly enough.
Hope that helps.
Paul
-
Hi,
I am not sure if there is a guide that can tell you exactly why your ranking went down. There are many factors, as you mentioned, that can cause this, such as a competitor doing better SEO, getting more quality links, etc. It would be nice to have a guide to learn about what is happening, but I may be wrong about this. Hope someone can shed light on this.
-
Thanks TommyTan.
I am definitely referring to the search engines.
I do use Google Webmaster Tools, and haven't seen any email notifications from GWT regarding any spammy link profiles, etc.
I'm more concerned with finding out if it's just normal fluctuation in keyword ranking and traffic, or if there is something else going on.
Sometimes there are so many factors that could all be playing a role outside of any penalties that it would be great to find a guide to help you diagnose whether it's just seasonal traffic, keyword rank fluctuation, or something more serious.
-
Hi,
I may be way off the mark here, but when we talk about websites being penalized, I believe it is mostly related to the search engines. I am not sure what would be penalized that may not be related to the engines. The best tool that connects to a user's website is Webmaster Tools. If anything is detected, such as unnatural linking or duplicate HTML tags, the webmaster will be notified and a notice will also be available in Google Webmaster Tools.
Hope this helps =]