When rankings dip, what's the best diagnostic procedure?
-
Buongiorno from a 10°C, lightly raining Wetherby, UK.
Every so often SEO feels like a game of snakes and ladders. One minute your rankings go up, then within the click of a mouse they drop back down. Like a Greek play, you begin to feel that our mortal lives as SEO pundits are controlled by the Google gods.
A case in point is illustrated in this graph:
http://i216.photobucket.com/albums/cc53/zymurgy_bucket/lincoln-drop_zpseeb04690.jpg
Now, if I want to explain why the rapid dip has occurred for the target term "Lincoln Solicitors", here's what I'd do:
1. Go to Webmaster Tools and check for crawl errors.
2. See if a Google algorithm change has changed the rules of engagement.
3. Check that another site administrator hasn't tinkered with the original layout.
But I wonder: what process do other SEO practitioners follow to explain to a disgruntled client, "Why have the rankings that I pay you to look after nose-dived?"
Any insights welcome:-)
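The crawl-error step of the checklist above can be sketched as a small script. This is a rough sketch, not a Moz or Google API: the functions, sample URLs, and status codes below are invented for illustration, and a real run would feed in live responses.

```python
# Rough sketch of crawl-error triage: fetch a status code per URL, then
# bucket results so 4xx/5xx pages stand out. Sample data is invented.
from urllib import request, error

def check_url(url, timeout=10):
    """Return the HTTP status code for url, including error codes."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

def bucket_statuses(statuses):
    """Group {url: status} into ok (2xx/3xx), client (4xx), server (5xx)."""
    buckets = {"ok": [], "client": [], "server": []}
    for url, code in sorted(statuses.items()):
        if code >= 500:
            buckets["server"].append(url)
        elif code >= 400:
            buckets["client"].append(url)
        else:
            buckets["ok"].append(url)
    return buckets

# Offline demo with made-up results rather than live requests:
sample = {"/": 200, "/about-us/": 500, "/old-page/": 404}
print(bucket_statuses(sample))
```

Anything landing in the "server" bucket is worth chasing first, since 5xx responses block crawling outright.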
-
I would check where those 500 errors originate from. Your website does not handle errors well - e.g. the link to "/About-Us/Partner-Profiles/Partner-Profiles/Anna-Mosey.aspx" throws a 500 when it should really return a 404 or 410.
When I do the search (from South Africa), the search term is on page 1, 4th position.
I would perhaps have a look at validating the HTML as well - I found it quite strange that the anchor texts have so much trailing whitespace.
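The whitespace check mentioned above can be automated with the standard-library HTML parser. A minimal sketch - the sample markup and the `AnchorAudit` class are invented, not part of any existing tool:

```python
# Flag <a> elements whose visible text carries leading/trailing whitespace.
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_anchor = False
        self._text = []
        self.flagged = []  # anchor texts with stray whitespace

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True
            self._text = []

    def handle_data(self, data):
        if self._in_anchor:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_anchor:
            text = "".join(self._text)
            if text != text.strip():
                self.flagged.append(text)
            self._in_anchor = False

# Invented sample markup: the first anchor has trailing spaces.
html = '<nav><a href="/">Home   </a><a href="/about">About Us</a></nav>'
audit = AnchorAudit()
audit.feed(html)
print(audit.flagged)
```

Running this against the rendered page source would list every anchor that needs its text trimmed.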
-
Well, firstly we can do a simple check against dates of known algorithm updates and see if that matches the drop.
So, you had a good rank on 17th July and had dropped by the next rank check on 26th July.
Panda 3.9 hit on 24th July, so there is every chance the site was flagged in this update; that would be my first port of call to see if it seems a likely cause.
It's very hard without a link, but start with the dates and if you find something that seems like it could be the case then review the page to see if it is weak, or a near internal duplicate or some such.
I have checked the 14th result for that search term and the page is pretty weak. I am not sure if this is your client, but if it is (the first two letters of the URL are www.bm), then these are very weak pages with little to no unique content beyond the address. That makes them pages created pretty much entirely for search engines - typical Panda fodder.
So, to resolve? Make this page better, along with other similar ones, and if Panda is indeed the cause, that should help resolve matters.
Hope that helps!
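The date cross-check Marcus describes - did a known update land between the last good rank check and the first bad one? - can be sketched in a few lines. The update list here is a small invented sample, not a complete history; a real version would use a maintained list of algorithm update dates.

```python
# Given the window between two rank checks, list which known algorithm
# updates fall inside it. KNOWN_UPDATES is a tiny sample, not exhaustive.
from datetime import date

KNOWN_UPDATES = {
    "Panda 3.9": date(2012, 7, 24),
    "Penguin 1.1": date(2012, 5, 25),
}

def updates_between(last_good, first_bad, updates=KNOWN_UPDATES):
    """Return names of updates dated inside the window when the drop happened."""
    return [name for name, when in updates.items()
            if last_good <= when <= first_bad]

# The rank-check dates from the thread: good on 17th July, dropped by 26th.
print(updates_between(date(2012, 7, 17), date(2012, 7, 26)))
```

For the dates in this thread, only Panda 3.9 falls inside the window, which is exactly Marcus's reasoning.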
Marcus -
N.B. - Looked into Webmaster Tools and found this:
http://i216.photobucket.com/albums/cc53/zymurgy_bucket/server-errors-langleys_zps10c62870.jpg
Would I be right in suggesting this has played a significant part in the ranking demotion?
Related Questions
-
Spam URLs in search results
We built a new website for a client. When I do 'site:clientswebsite.com' in Google it shows some of the real, recently submitted pages. But it also shows many pages of spam URL results, like 'clientswebsite.com/gockumamaso/22753.htm' - all of which then go to the site's 404 page. They have page titles and meta descriptions in Chinese or Japanese too. Some of the URLs are of real pages, and link to the correct page, despite having the same Chinese page titles and descriptions in the SERPs. When I went to remove all the spammy URLs in Search Console (it only allowed me to temporarily hide them), a whole load of new ones popped up in the SERPs after a day or two. The site files themselves are all fine, with no errors in the server logs. All the usual stuff - robots.txt, sitemap etc. - seems OK, and the proper pages have all been requested for indexing and are slowly appearing. The spammy ones continue, though. What is going on and how can I fix it?
Technical SEO | | Digital-Murph0 -
Need advice for new site's structure
Hi everyone, I need to update the structure of my site www.chedonna.it. Basically I have two main problems: 1. I have 61,000 indexed tag pages (many with no posts). 2. The categories of my site are noindex. I thought to fix my problem by making the categories index and the tags noindex, but I'm not sure if this is the best solution because I have a great number of tags indexed by Google for a long time. Maybe it is correct just to make the categories index, link them from the posts, and leave the tags index. Could you please let me know your opinion? Regards.
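If the tags do end up noindexed, one common mechanism is a robots meta tag emitted only on the tag archive templates. A sketch, assuming the templates can be edited directly:

```html
<!-- On tag archive pages only; "follow" keeps link equity flowing -->
<meta name="robots" content="noindex,follow">
```

Using `noindex,follow` rather than `noindex,nofollow` lets crawlers still follow the links on those pages while dropping the pages themselves from the index.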
Technical SEO | | salvyy0 -
Should I add 'nofollow' to site-wide internal links?
I am trying to improve the internal linking structure on my site and ensure that the most important pages have the most internal links pointing to them (which I believe is the best strategy from Google's perspective!). I have a number of internal links in the page footer going to pages such as 'Terms and Conditions', 'Testimonials', 'About Us' etc. These pages, therefore, have a very large number of links going to them compared with the most important pages on my site. Should I add 'nofollow' to these links?
Technical SEO | | Pete40 -
What's the best way to handle overly dynamic URLs?
So my question is: what's the best way to handle overly dynamic URLs? I am working on a real estate agency website. They are selling/buying properties and the URL is as follows: http://www.------.com/index.php?action=calculator&popup=yes&price=195000
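One common approach for parameterised pages like the calculator URL above is a canonical link pointing the dynamic variants at a single clean URL. A sketch - the canonical path here is invented for illustration:

```html
<!-- In the <head> of every calculator variant; the target path is hypothetical -->
<link rel="canonical" href="http://www.example.com/calculator" />
```

That way the `popup` and `price` variations consolidate onto one indexable page instead of competing as near-duplicates.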
Technical SEO | | Angelos_Savvaidis0 -
Blocking URLs with specific parameters from Googlebot
Hi, I've discovered that Googlebots are voting on products listed on our website, and as a result are creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using Javascript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages.
DON'T want to block: http://www.mysite.com/product.php?productid=1234
WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2
Javascript button code: onclick="javascript: document.voteform.submit();"
Thanks in advance for any advice given. Regards,
Technical SEO | | aethereal
Asim0 -
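A robots.txt pattern that targets only the voting parameter might look like this - a sketch relying on Googlebot's wildcard support, and assuming `mode=vote` always appears in the vote URLs:

```
User-agent: Googlebot
Disallow: /*mode=vote
```

The plain product URLs (e.g. /product.php?productid=1234) stay crawlable because they never contain `mode=vote`, while any URL carrying that parameter is blocked.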
Domain Transfer Process / Bulk 301's Using IIS
Hi guys - I am getting ready to do a complete domain transfer from one domain to another completely different domain for a client due to a branding/name change. Two things - first, I wanted to lay out a summary of my process and see if everyone agrees that it's a good approach, and second, my client is using IIS, so I wanted to see if anyone out there knows a bulk tool that can be used to implement 301s on the hundreds of pages that the site contains? I have found the process to redirect each individual page, but over hundreds it's a daunting task. The nice thing about the domain transfer is that it is going to be a literal 1:1 transfer, with the only things changing being the logo and the name mentions. Everything else is going to stay exactly the same, for the most part. I will use dummy domain names in the explanation to keep things easy to follow: www.old-domain.com and www.new-domain.com. The client's existing home page has a 5/10 GPR, so of course, transferring mojo is very important. The process:
1. Clean up existing site 404s, duplicate tags and titles, etc. (a good time to clean house).
2. Create an identical domain structure tree, changing all URLs (for instance) from www.old-domain.com/freestuff to www.new-domain.com/freestuff.
3. Push several pages to a dev environment to test (dev.new-domain.com). Also, replace all instances of the old brand name (images and text) with the new brand name.
4. Set up 301 redirects (here is where my IIS question comes in below). Each page will be set up to redirect to the new permanent destination with a 301. Test a few.
5. Choose the lowest-traffic time of the week (from analytics data) to make the transfer all at once, including pushing the new content live to the server for www.new-domain.com and implementing the 301s. As opposed to moving over parts of the site in chunks, moving the site over in one swoop avoids potential duplicate content issues, since the content on the new domain is essentially exactly the same as on the old domain. Of course, all of the steps so far would apply to the existing sub-domains as well, e.g. video.new-domain.com.
6. Check for errors and problems with resolution issues. Check again. Check again.
7. Write to as many link partners as possible, inform them of the new domain, and ask for existing links to be switched and future links updated to the new domain. Even though 301s will redirect link juice, a direct link to the new domain page without the redirect is preferred.
8. Track rank of targeted keywords, overall domain importance, and GPR over time to ensure that you re-establish your mojo quickly.
That's it! OK, so everyone, please give me your feedback on that process! Secondly, as you can see in the middle of that process, the "implement 301s" step seems easier said than done, especially when you are redirecting each page individually (it would take days). So, the question here is: does anyone know of a way to implement bulk 301s for each individual page using IIS? From what I understand, in an Apache environment .htaccess can be used, but I really have not been able to find any info regarding how to do this in bulk using IIS. Any help here would be GREATLY APPRECIATED!!
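For the bulk-301 part of the question: because the transfer is a literal 1:1 move, the IIS URL Rewrite module can express all of it as a single host-based rule in web.config rather than hundreds of per-page redirects. A sketch, assuming the URL Rewrite module is installed and using the dummy domain names from the question:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Permanently redirect every path on the old host 1:1 to the new host -->
        <rule name="OldToNewDomain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^(www\.)?old-domain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.new-domain.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

`{R:1}` carries the matched path across, so /freestuff on the old domain 301s to /freestuff on the new one without listing pages individually.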
Technical SEO | | Bandicoot0 -
How to handle URLs from removed products?
Hi All, I have a question about a fashion-related webshop. Every month about 100 articles are removed and about the same amount is added to the site. Most of the products are indexed on brand name and type (e.g. MyBrand t-shirt blue). My question is what to do with the URL / page after the product is removed. I'm thinking about a couple of solutions:
1. 301 the page to the brand category page
2. build a script which shows related articles on the old URL (and try to keep it indexed)
3. a 404 page optimized for the search term, with links to the brand category
Any other suggestions? Thanks in advance, Sam
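If the 301-to-brand-category option is chosen, each removed product needs one mapping. A sketch in Apache .htaccess terms with invented paths (on other servers the same mapping can be generated from the product database):

```apache
# Hypothetical paths: removed product -> its brand category page
Redirect 301 /products/mybrand-tshirt-blue /brands/mybrand
Redirect 301 /products/mybrand-hoodie-red  /brands/mybrand
```

A monthly script that appends one line per removed product keeps this manageable at ~100 removals a month.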
Technical SEO | | U-Digital0 -
How do I 301 URLs with numbers in them?
I have a number of 404 error pages showing in Webmaster Tools, and some of the URLs have numbers, % symbols, and some are PDFs. My usual 301 redirect in my htaccess file does NOT redirect these pages where the URLs have special characters. What am I doing wrong?
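One common gotcha with % symbols: mod_rewrite matches RewriteRule patterns against the URL-decoded path, and plain Redirect directives never see the query string at all. A sketch with invented paths:

```apache
RewriteEngine On
# "%20" in the requested URL arrives decoded, so match a literal space:
RewriteRule "^docs/old\ report\.pdf$" /docs/new-report.pdf [R=301,L]
# Query strings need a RewriteCond; %1 carries the captured id,
# and the trailing "?" drops the old query string from the target:
RewriteCond %{QUERY_STRING} ^id=(\d+)$
RewriteRule ^page\.php$ /products/%1? [R=301,L]
```

If a redirect "doesn't fire", the usual culprit is matching the encoded form (%20) instead of the decoded character, or trying to match a query string inside the RewriteRule pattern.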
Technical SEO | | BradBorst0