Guides to determine if a client's website has been penalized?
-
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized?
I'm also not talking about an obvious drop in traffic/rankings, but I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data. One that also helps you take into account all the different variables that may not be related to the engines.
Thanks!
-
Good point about the Change History - at least that will catch things like new filters.
Understood about the external doc being easier for some clients not used to working in Analytics. When going that route, I at least try to get them working in a shared document - either in Google Docs or at least a shared doc in Dropbox or something.
That way there are fewer issues with trying to figure out who has the most current version, and it's available to you when you need it - like when you are doing a normal monthly review but want to quickly check what you think might be an anomaly - without having to make a request for the doc.
P.
-
Thanks Paul. Maybe now that Analytics will be showing change history, that will save time as well.
I like your idea about an external document. That seems more approachable than having our clients who are not so comfortable make their annotations in Google Analytics.
-
Glad it was helpful, Ellen, and thanks for the kind words.
And you've hit exactly the challenge so many of us encounter with clients - we're just not kept in the loop on all the things they're doing. (And this critically includes changes to site code and analytics, as I mentioned, not just marketing efforts.) The only effective way I've found is to make the client responsible for enforcing that their devs and marketers keep the annotations (or, if necessary, some external file, blech) up to date.
Otherwise you can easily waste hours chasing an issue, only to later find out someone goofed up changing a filter in analytics, for example. Not that that has ever happened to me...
That's how I pitch this to clients - either work out a way to make certain it's kept up to date on an ongoing basis, or risk having to pay me for many hours of extra work every time I'm asked to try to track down an issue or to assess overall traffic health. Or maybe even waste big money with a wrong diagnosis because not enough info was provided.
More and more, effective analysis needs to take into account many more cross-channel aspects. The only way to do this effectively is to have those cross-channel efforts recorded in some way right in the Analytics.
Good luck!
Paul
-
It's definitely not a helluva lot to throw at me, but instead is exactly the sort of thing I was looking for! Thank you so much Paul for such a thorough answer. This is definitely the direction I've been moving toward.
The most difficult thing with my clients is there are so many hands in the pot, and they often don't let us know what's going on with their external marketing efforts. Additionally, some of our clients have multiple marketing agencies making changes to their websites, and there are multiple admins on the Analytics.
Holding the client responsible for the updates is a great idea, and it would just take a lot more prodding on our end.
This discussion was spurred by a client that saw a major decline in non-paid search traffic (Google only) over the last quarter of 2012. There were no penalties in GWT, so, as you said, this leads me to believe it was an algo update. I'm going to use your tips to try and further isolate the affected areas.
I really appreciate the time you took to answer my question. Thanks again.
-
Ellen, there are just far too many reasons why a site's traffic might fluctuate for any guide to ever account for them all and then detect the ones that are "harmful".
This is why SEO is referred to as both art and science, unfortunately.
First - to be clear... If a site has actually been penalized by the search engines, they will send you a notification through Bing and Google Webmaster Tools. So you must be certain that both of those tools have an up-to-date, monitored email address in their notification settings.
Once you've discounted actual penalties, you're left with fluctuations due to changes in the algorithm. These are not "penalties" in the search engines' eyes, just corrections to the ranking algorithm that happen to affect you.
The best method I've found for spotting these is a combination of segmenting data, keeping accurate records about your own site marketing activities, and monitoring the dates of announced algorithm changes.
The idea here is to try to eliminate as many variables as possible for why traffic might have changed, making it easier to spot changes attributable specifically to algorithm changes. Which would then point you to tactics you might need to use to mitigate the effect of the algo change.
In order to do this, you'll need to do the following:
Track Marketing Efforts & Site Changes
Keep records about your site marketing and structural/coding changes. Anytime you do marketing that could affect site traffic, enter the date and info as an Annotation in your Google Analytics. This includes on- and offline things like launching magazine or radio advertising, adding new banner or PPC ads, getting coverage in the media, etc. Anything that could conceivably be causing more people to become aware of you and search for your site. (Remember, just because your ad gives your website address doesn't mean people will remember it. Many will remember your company name or service and will Google it later.)
Also keep track of ANY changes made to your website structure - changes in code, robots.txt, .htaccess, canonicals, Analytics configuration, etc.
Track Announced Algorithm Changes
Use this page http://www.seomoz.org/google-algorithm-change to constantly add dates and info about algo updates into your site's Annotations.
Segment Data
Ensure you're only looking at organic search data.
This may seem obvious, but a lot of people miss it. Algorithmic changes are only going to affect your organic search data. So you must ensure you are only looking at non-paid search traffic in your analysis. Fortunately, there's a pre-built Advanced segment for that in Google Analytics. If you're trying to track Google changes specifically, you can further segment to show only Google traffic (i.e. exclude Bing and Yahoo.)
Bonus Tip - Non-Sampled Data Only
Make sure you've asked Analytics to show your reports using as little data-sampling as possible. This will make the data vastly more accurate, although the reports will be a little slower. Definitely a worthwhile tradeoff.
Here's my method:
- pick a date range you're concerned about - the shorter the range, the easier it is to spot anomalies in the graph. Six to eight weeks at a time may be best. Or pick a short range from before and after a date you think you encountered a problem
- segment your data specifically to Google unpaid search, and select as little data-sampling as possible
- look at the traffic line in your graph. Anywhere you see an unexpected drop in traffic (allowing for weekly fluctuations) look for an annotation below that date that might explain it.
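If you want to sanity-check the graph reading, the steps above can also be sketched programmatically. This is just an illustration with made-up numbers, not anything Analytics provides out of the box - you'd export daily non-paid Google sessions and your annotation dates first. The data values, dates, and the `flag_drops` helper are all hypothetical; the idea is simply to flag days that fall well below the trailing weekly average and match them against nearby annotations:

```python
from datetime import date, timedelta

# Hypothetical data: daily Google non-paid sessions, Jan 1 - Feb 15,
# with a simulated drop starting Jan 23.
sessions = {date(2013, 1, 1) + timedelta(days=i): 1000 for i in range(46)}
for i in range(22, 46):
    sessions[date(2013, 1, 1) + timedelta(days=i)] = 700

# Hypothetical annotations, like the ones you'd keep in Analytics.
annotations = [(date(2013, 1, 22), "Panda update"),
               (date(2013, 1, 15), "Newspaper mention")]

def flag_drops(sessions, annotations, window=7, threshold=0.8):
    """Flag days below threshold * trailing weekly average,
    attaching any annotations within 2 days of the drop."""
    days = sorted(sessions)
    flags = []
    for i, day in enumerate(days[window:], start=window):
        trailing_avg = sum(sessions[d] for d in days[i - window:i]) / window
        if sessions[day] < threshold * trailing_avg:
            nearby = [note for d, note in annotations
                      if abs((day - d).days) <= 2]
            flags.append((day, nearby))
    return flags

drops = flag_drops(sessions, annotations)
# First flagged day is Jan 23, with the Panda annotation attached.
```

The 7-day trailing window is there to allow for the normal weekend ebb and flow mentioned above; the 20% threshold is an arbitrary starting point you'd tune per site.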
So in practice the process might look like this. I'm worried about a traffic drop toward the end of January. I select a date range in Analytics of Jan 1 to Feb 15, and I segment my data to show just Google non-paid traffic for that range. I hit the little checkerboard icon in the top right under the date range and move the slider for Highest Precision.
After doing this, I look at the general pattern of organic traffic to the site. There seems to be the usual ebb and flow of lower weekend traffic, with a small spike of traffic on Jan 15th. When I look just below that date, I notice that I've entered an Annotation. (It'll show up as a tiny clickable comment bubble.) When I read the Annotation I created, it tells me we got a mention in the newspaper that day. So I now remember where that spike came from. The traffic then pretty much settles back to normal a day or two after, as expected.
Then I notice an unusual drop in traffic around January 23. When I again check for Annotations I've created, I realize there was a Panda update on Jan 22. Since there was no other marketing activity mentioned around that date (like a radio ad ending, for example) I can be pretty sure the sole cause of that drop was the Panda change. And since I know Panda is mostly about devaluing thin or low-value content, I now have somewhere to start looking.
I would then start looking at the non-paid search traffic from specific keywords to see if any group of keywords suffered most heavily. If I can find a pattern to the search terms that dropped, I know I've found the topic area of my site where I need to build some better content to help recover the traffic.
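That keyword-level comparison can be sketched the same way. Again, the keywords and numbers below are invented for illustration, and `biggest_losers` is a hypothetical helper, not a Moz or Analytics API - the point is just to compare each term's traffic before and after the update date and surface the ones that fell hardest:

```python
# Hypothetical non-paid search sessions per keyword, before and
# after the suspected algo update date.
before = {"blue widgets": 500, "widget reviews": 400,
          "thin article topic": 300, "buy widgets": 250}
after = {"blue widgets": 480, "widget reviews": 390,
         "thin article topic": 90, "buy widgets": 240}

def biggest_losers(before, after, min_drop=0.3):
    """Return (keyword, fractional drop) for terms that lost more
    than min_drop of their traffic, worst first."""
    losers = []
    for kw, prev in before.items():
        drop = (prev - after.get(kw, 0)) / prev
        if drop > min_drop:
            losers.append((kw, round(drop, 2)))
    return sorted(losers, key=lambda pair: pair[1], reverse=True)

hit_terms = biggest_losers(before, after)
# Only "thin article topic" lost a large share of its traffic.
```

If the terms that surface share a topic area, that's the section of the site where thin content is most likely dragging you down.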
The reason I need to be tracking the marketing efforts as well as the algo updates is that I don't want to misattribute the problem (correlation instead of causation). For example, if I had a major marketing campaign wrap up on the 20th or 21st of January, that could very well have accounted for the traffic drop, and the Panda update was merely a coincidence.
I wouldn't want to have wasted a whole lot of time chasing the Panda problem when in fact it was a normal drop due to a cutoff in the marketing campaign. But I would have missed that if I hadn't been tracking the marketing. (For client sites, you'll need to make the client responsible for keeping the marketing Annotations up to date.)
As you can see, this isn't easy, and it takes laying some groundwork, but it goes a long way to helping you figure out where to focus when you start trying to figure out whether you've been affected by an algo update, and therefore where to spend your energy on fixes.
I know this is a helluva lot to throw at you, but the question you asked doesn't have an easy answer and I didn't want to shortchange you with an overly simplistic response. Be sure to ask followup questions for the stuff I haven't explained clearly enough.
Hope that helps.
Paul
-
Hi,
I am not sure if there is a guide that can tell you exactly why your ranking went down. There are many factors, as you mentioned, that can cause this, such as a competitor doing better SEO, getting more quality links, etc. It would be nice to have a guide to learn about what is happening. I may be wrong about this. Hope someone can shed light on this.
-
Thanks TommyTan.
I am definitely referring to the search engines.
I do use Google Webmaster Tools, and haven't seen any email notifications from GWT regarding any spammy link profiles, etc.
I'm more concerned with finding out if it's just normal fluctuation in keyword ranking and traffic, or if there is something else going on.
Sometimes there are so many factors that could all be playing a role outside of any penalties, it would be great to find a guide to help you diagnose whether it's just seasonal traffic, keyword rank fluctuation, or something more serious.
-
Hi,
I may be way off the chart here, but when we talk about websites being penalized, I believe it is mostly related to the search engine. I am not sure what would be penalized that may not be related to the engines. The best tool that connects to a user's website is Webmaster Tools. If anything is detected, the webmaster will be notified and a notice will also be available in Google Webmaster Tools, such as unnatural linking, duplicate HTML tags, etc.
Hope this helps =]