Huge increase in server errors and robots.txt
-
Hi Moz community!
Wondering if someone can help? One of my clients (an online fashion retailer) has seen a huge increase in server errors (500s and 503s) over the last six weeks, and it has got to the point where people cannot access the site because of server errors.
The client has recently changed hosting companies to deal with this. They have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again.
These errors also correlate with a huge decrease in the number of pages blocked by the robots.txt file, which makes me think someone has perhaps changed it and not told anyone...
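A quick sanity check worth running first: fetch the robots.txt file directly and record its HTTP status, because Google generally treats a robots.txt that returns a 5xx error as a signal to stop crawling, which on its own could explain strange swings in the "blocked by robots.txt" numbers. A minimal sketch (the domain is a placeholder, not the client's real site):

```python
# Minimal check of the live robots.txt and homepage status codes.
# "example-client.com" is a placeholder domain for illustration.
import urllib.request
import urllib.error

def check(url):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url} -> HTTP {resp.status}")
            return resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the status code is what matters here.
        print(f"{url} -> HTTP {e.code}")
    except urllib.error.URLError as e:
        print(f"{url} -> connection failed: {e.reason}")
    return None

robots = check("https://example-client.com/robots.txt")
check("https://example-client.com/")

if robots is not None:
    # Count Disallow rules so you can diff against a known-good copy.
    rules = [line for line in robots.splitlines()
             if line.strip().lower().startswith("disallow")]
    print(f"{len(rules)} Disallow rules found")
```

Diffing the fetched rules against a copy from version control (if one exists) would quickly confirm whether anyone actually edited the file.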
Anyone have any ideas here? It would be greatly appreciated! I've been chasing this up with the dev agency and the hosting company for weeks, to no avail.
Massive thanks in advance
-
Thank you EGOL, it all makes perfect sense and I appreciate your reply. I suspect the problems are mostly centered on the hosting issues, with secondary potential robots.txt issues as well.
-
....it has got to the point where people cannot access the site because of server errors.
As soon as I saw this, I would go straight to someone who knows a lot more about servers than I do. I would start with the host, and if I got no help from them within a few hours, I would get someone who knows about servers to dig into this and be ready to quickly move the website to a new host. If the host does not know how to solve it, and I don't know how to solve it, then it is time for bigger guns and possibly a new host - right away.
....they have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again.
So, the website is now in the hands of a new host. It is likely that the problem will be solved here if the old host was the cause of the problem. Today, DNS propagates quickly. Say the switch happens while I am having my morning coffee... if I don't see progress by the time I return from lunch, then I am calling a pro.
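For anyone who wants to verify propagation rather than just wait, a rough sketch is to resolve the domain from your own machine and compare the answer with the IP address the new host gave you (both values below are placeholders, not real records):

```python
# Rough propagation check: resolve the domain with this machine's resolver
# and compare it to the IP supplied by the new host.
import socket

DOMAIN = "example-client.com"      # placeholder domain
EXPECTED_IP = "203.0.113.10"       # hypothetical IP from the new host

hostname, aliases, addresses = socket.gethostbyname_ex(DOMAIN)
print(f"{DOMAIN} currently resolves to: {addresses}")

if EXPECTED_IP in addresses:
    print("Your resolver already sees the new host - propagation has reached you.")
else:
    print("Still resolving to the old records - keep waiting or check the TTL.")
```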
I think that it is a good idea for anyone who has clients or an important website to have a person or a company that they can call straightaway for a quick couple of hours of investigation or advice. Two hours of consulting is cheaper than seeing a business throttled for two days.
Also, I have learned to stay away from hosts who offer unlimited bandwidth and similar claims. When you start to become successful you become unprofitable for them so they either have to limit your resources or confess that their claim of unlimited is an absolute lie.
All of my sites are with hosts who charge me for every bit of resource that I use. The more I use, the more money they make and when I have a problem they are motivated to get it fixed immediately - because when my biz is dragging they are making less money. They want me to make money because our interests are in alignment - not the opposite.
Cheap hosts are just as bad as the unlimited guys. If they have a problem with your website, it is cheaper to let you go and lose the few bucks a month that you are paying them than it is to pay their staff to fix things. (But they will not tell you to go to a new host - they will just allow you to have crap service until you decide to move.) I make sure that the hosts I use have a number of high-profile sites under their care who will not tolerate one minute of BS. These hosts are not cheap, but I am not interested in cheap; I want reliable.
Related Questions
-
Question about Syntax in Robots.txt
So if I want to block any URL that contains a particular parameter from being indexed, what is the best way to put this in the robots.txt file? Currently I have Disallow: /attachment_id, where "attachment_id" is the parameter. The problem is I still see these URLs indexed, and this has been in the robots.txt for over a month now. I am wondering if I should just do Disallow: attachment_id or Disallow: attachment_id= but figured I would ask you guys first. Thanks!
Intermediate & Advanced SEO | DRSearchEngOpt
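Worth noting on the question above: standard Disallow rules are prefix matches against the URL, so Disallow: /attachment_id only blocks URLs that begin with /attachment_id; a URL like /some-post?attachment_id=9 is not matched at all. Google additionally supports * wildcards (for example Disallow: /*?attachment_id=), which is usually what a parameter rule needs. A small sketch using Python's standard-library parser, which implements the plain prefix behaviour:

```python
# Why "Disallow: /attachment_id" misses parameterised URLs: the stdlib parser
# (like the original robots.txt spec) does plain prefix matching on the URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /attachment_id",
])

tests = [
    "https://example.com/attachment_id/123",          # path starts with the rule
    "https://example.com/some-post?attachment_id=9",  # parameter on another path
]
for url in tests:
    print(url, "crawlable:", rp.can_fetch("*", url))
# -> the first is blocked, the second is still crawlable
```

Also bear in mind that robots.txt only stops crawling, not indexing: URLs that are already indexed can linger until they are recrawled or explicitly noindexed, which may be why the rule appeared to do nothing for a month.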
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company. They have a main site on the top-level domain (TLD) and 400+ agency subdomains: company.com, agency1.company.com, agency2.company.com... I recently found that the web development team have a demo domain per site, found on a subdomain of the original domain and mirroring it. The problem is that they have all been found and indexed by Google: demo.company.com, demo.agency1.company.com, demo.agency2.company.com... Obviously this is a problem as it is duplicate content and so on, so my question is: what is the best way to remove the demo domains/subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution; within this file I had intended to disallow all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain/subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | iam-sold
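One caution on the plan in the question above: combining a noindex tag with a disallow-all robots.txt works against itself, because once crawling is blocked Google can never recrawl the pages to see the noindex. A common alternative is to serve the noindex as an HTTP response header on the demo hosts and leave crawling open. A hypothetical Apache snippet, assuming the demo subdomains run Apache with mod_headers enabled:

```apache
# Hypothetical config for the demo.* vhosts only. Sends noindex on every
# response so Google can recrawl and drop the pages. Do NOT also block
# crawling in robots.txt, or the crawler will never see this header.
Header set X-Robots-Tag "noindex, nofollow"
```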
Should all pages on a site be included in either your sitemap or robots.txt?
I don't have any specific scenario here, but I am just curious, as I fairly often come across sites that have, for example, 20,000 pages but only 1,000 in their sitemap. If they only want 1,000 of their URLs included in their sitemap and indexed, should the others be excluded using robots.txt or a page-level exclusion? Is there a point to having pages that are included in neither and leaving it up to Google to decide?
Intermediate & Advanced SEO | RossFruin
Google showing high volume of URLs blocked by robots.txt in the index - should we be concerned?
If we search site:domain.com vs. site:www.domain.com, we see 130,000 vs. 15,000 results. When reviewing the site:domain.com results, we're finding that the majority of the URLs showing are blocked by robots.txt. They are subdomains that we use as production environments (and contain similar content to the rest of our site). We also find the message "In order to show you the most relevant results, we have omitted some entries very similar to the 541 already displayed." SEER Interactive mentions that this is one way to gauge a Panda penalty: http://www.seerinteractive.com/blog/100-panda-recovery-what-we-learned-to-identify-issues-get-your-traffic-back We were hit by Panda some time back - is this an issue we should address? Should we unblock the subdomains and add noindex, follow?
Intermediate & Advanced SEO | nicole.healthline
Increasing Search Queries
Recently I had a drop in the overall number of search queries my website was ranking for (about 50%) on October 5th. I did not lose rankings for my target keywords. How can I regain these lost opportunities?
Intermediate & Advanced SEO | raph3988
Custom Error and page not found responses
When there is a 500 Internal Server Error, is it better to return an HTTP 500 response and a custom error page from the requested URL, or is it better to return a 302 redirect? The redirect would send the browser to the custom error page, which would return the HTTP 500 result. We tell Google not to index or follow our error pages, so if Google sees an error at a URL, we don't necessarily want Google to think that the URL should be ignored. That's why the alternative would be to redirect to a custom error page with its own URL. Similarly, what's the best approach if the response is a 404? Return HTTP 404 and a custom 404 page from the requested URL, or redirect? Thanks.
Intermediate & Advanced SEO | dbuckles
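The usual recommendation on the question above is to return the real status code and the custom page from the requested URL itself, not a 302 to a separate error URL: a redirect chain that ends in a 200 error page looks like a soft 404 (or soft 500) to crawlers. A minimal sketch of that pattern; Flask and the template names are assumptions for illustration, not the poster's actual stack:

```python
# Serve the custom page *at the requested URL* with the real status code,
# instead of 302-redirecting to a dedicated /error URL.
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def not_found(e):
    # Same URL the visitor asked for; friendly body; honest 404 status.
    return render_template("404.html"), 404

@app.errorhandler(500)
def server_error(e):
    return render_template("500.html"), 500
```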
Generating 404 Errors but the Pages Exist
Hey, I have recently come across an issue with several of a site's URLs being seen as 404s by bots such as Xenu, SEOMoz, and Google Webmaster Tools. The funny thing is, the pages exist and display fine. This happens on many of the pages which use the Modx CMS, but the index is fine. The WordPress blog in /blog/ all works fine. The only thing I can think of is that I have a conflict in the .htaccess, but troubleshooting this is difficult; any tool I have found online seems useless. I have tried to roll back to previous versions but it still does not work. Has anyone had any experience of similar issues? Many thanks, K.
Intermediate & Advanced SEO | Found
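A likely explanation for the case above: browsers render whatever body the server sends, so a page can look perfectly normal while the status line says 404, and crawlers only trust the status line. A quick check along these lines (the URLs are placeholders) would confirm it:

```python
# Does the server actually send 200 for pages that "display fine"?
# A misconfigured .htaccess can return a 404 status with a normal-looking
# body, which browsers render happily but crawlers treat as missing.
import urllib.request
import urllib.error

URLS = [
    "https://example.com/",           # placeholder URLs
    "https://example.com/some-page",
]

for url in URLS:
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(url, "->", resp.status)
    except urllib.error.HTTPError as e:
        print(url, "->", e.code, "(this is what the bots see)")
```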
How do I set up a 301 redirect if the default settings for our web servers create multiple URLs for the same page?
How do I set up a 301 redirect if the default settings for our web servers create multiple URLs for the same page, but the server treats them as one page?
Intermediate & Advanced SEO | ibex
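A typical fix, assuming Apache with mod_rewrite, is one 301 rule per duplicate pattern that collapses everything onto the canonical form. A sketch for the common non-www-to-www case (example.com is a placeholder):

```apache
# Hypothetical .htaccess sketch: collapse the non-www duplicate onto the
# canonical www host with a single permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```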