Huge increase in server errors and robots.txt
-
Hi Moz community!
Wondering if someone can help? One of my clients (an online fashion retailer) has seen a huge increase in server errors (500s and 503s) over the last six weeks, and it has got to the point where people cannot access the site because of server errors.
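For anyone trying to quantify an issue like this, here is a minimal sketch that polls the homepage once a minute and logs any 5xx responses (assuming Python with the third-party requests library; the URL and interval are placeholders):

```python
# Minimal sketch: poll the homepage and log any 5xx responses over time.
# Assumes the third-party `requests` library; URL/interval are placeholders.
import time
import requests

URL = "https://www.example-client-site.com/"  # placeholder domain

while True:
    status = None
    try:
        status = requests.get(URL, timeout=10).status_code
    except requests.RequestException as exc:
        print(time.strftime("%Y-%m-%d %H:%M:%S"), "no response:", exc)
    if status is not None and status >= 500:
        print(time.strftime("%Y-%m-%d %H:%M:%S"), "server error:", status)
    time.sleep(60)  # check once a minute
```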
The client has recently changed hosting companies to deal with this, and they have just told us they removed the DNS records once the name servers were changed, and they have now fixed this and are waiting for the name servers to propagate again.
These errors also correlate with a huge decrease in pages blocked by the robots.txt file, which makes me think someone has perhaps changed it and not told anyone...
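One quick way to check whether robots.txt is the culprit is to test a few key URLs against the live file. A minimal sketch using only Python's standard library (the domain and paths are placeholders):

```python
# Minimal sketch: test whether key URLs are currently blocked by robots.txt.
# Standard library only; the domain and paths are placeholders.
import urllib.robotparser

SITE = "https://www.example-client-site.com"
PATHS = ["/", "/category/dresses/", "/product/example-item/"]

rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for path in PATHS:
    allowed = rp.can_fetch("Googlebot", SITE + path)
    print("allowed" if allowed else "BLOCKED", path)
```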
Anyone have any ideas here? It would be greatly appreciated! I've been chasing this up with the dev agency and the hosting company for weeks, to no avail.
Massive thanks in advance
-
Thank you EGOL, it all makes perfect sense and I appreciate your reply. I suspect the problems are mostly centered on the hosting issues, with potential secondary robots.txt issues as well.
-
....it has got to the point where people cannot access the site because of server errors.
As soon as I saw this, I would go straight to someone who knows a lot more about servers than I do. I would start with the host, and if I got no help from them within a few hours, I would get someone who knows about servers to dig into this and be ready to quickly move the website to a new host. If the host does not know how to solve it and I don't know how to solve it, then it is time for bigger guns and possibly a new host - right away.
....they have just told us they removed the DNS records once the name servers were changed, and they have now fixed this and are waiting for the name servers to propagate again.
So, the website is now in the hands of a new host. It is likely that the problem will be solved here if the old host was the cause of the problem. Today, DNS propagates quickly. Say I am having my morning coffee when the change is made... if I don't see progress by the time I return from lunch, then I am calling a pro.
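If you want to check propagation yourself, you can ask a few public resolvers directly. A minimal sketch, assuming the third-party dnspython package (the domain is a placeholder):

```python
# Minimal sketch: ask several public resolvers for the domain's A record to
# see whether the new host's IP has propagated. Assumes the third-party
# `dnspython` package (pip install dnspython); the domain is a placeholder.
import dns.resolver

DOMAIN = "www.example-client-site.com"  # placeholder domain
PUBLIC_RESOLVERS = {
    "Google": "8.8.8.8",
    "Cloudflare": "1.1.1.1",
    "Quad9": "9.9.9.9",
}

for name, ip in PUBLIC_RESOLVERS.items():
    resolver = dns.resolver.Resolver()
    resolver.nameservers = [ip]  # query this resolver only
    try:
        answers = resolver.resolve(DOMAIN, "A")
        print(name, "->", sorted(a.address for a in answers))
    except Exception as exc:
        print(name, "-> lookup failed:", exc)
```

If the resolvers disagree, propagation is still in progress; once they all return the new host's IP, the DNS side is done.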
I think that it is a good idea for anyone who has clients or an important website to have a person or a company that they can call straightaway for a quick couple of hours of investigation or advice. Two hours of consulting is cheaper than seeing a business throttled for two days.
Also, I have learned to stay away from hosts who offer unlimited bandwidth and similar claims. When you start to become successful, you become unprofitable for them, so they either have to limit your resources or confess that their claim of unlimited is an absolute lie.
All of my sites are with hosts who charge me for every bit of resource that I use. The more I use, the more money they make and when I have a problem they are motivated to get it fixed immediately - because when my biz is dragging they are making less money. They want me to make money because our interests are in alignment - not the opposite.
Cheap hosts are just as bad as the unlimited guys. If they have a problem with your website, it is cheaper to let you go and lose the few bucks a month that you are paying them than it is to pay their staff to fix things. (But they will not tell you to go to a new host - they will just allow you to have crap service until you decide to move.) I make sure that the hosts I use have a number of high-profile sites under their care who will not tolerate one minute of BS. These hosts are not cheap, but I am not interested in cheap; I want reliable.
-
Related Questions
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard-and-fast direction about the following issue: it looks like the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains URLs which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked, right? Thank you!!
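For a warning like "Sitemap contains URLs which are blocked by robots.txt", one way to see exactly which URLs conflict is to cross-check the sitemap against the live robots.txt. A minimal sketch using only Python's standard library (the domain is a placeholder):

```python
# Minimal sketch: list sitemap URLs that robots.txt currently blocks.
# Standard library only; the domain is a placeholder.
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()

with urllib.request.urlopen(SITE + "/sitemap.xml") as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if not rp.can_fetch("Googlebot", url):
        print("blocked:", url)
```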
Intermediate & Advanced SEO | jamiegriz
-
Will Reducing the Number of Low Page Authority Pages Increase Domain Authority?
Our commercial real estate site (www.nyc-officespace-leader.com) contains about 800 URLs. Since 2012 the domain authority has dropped from 35 to about 20, and ranking and traffic have dropped significantly since then. The site has about 791 URLs, many of which are set to noindex, and a large percentage of these pages have a Moz page authority of only 1. It is puzzling that some pages with content similar to the PA 1 pages rank much better, in some cases at PA 15. If we remove or consolidate the poorly ranked pages, will the overall page authority and ranking of the site improve? Would taking the following steps help?
1. Remove or consolidate poorly ranking unnecessary URLs?
2. Update content on poorly ranking URLs that are important?
3. Create internal text links (as opposed to links from menus) to critical pages?
A Moz crawl of our site's URLs is visible at the link below. I am wondering if the structure of the site is just not optimized for ranking and what can be done to improve it. THANKS. https://www.dropbox.com/s/oqchfqveelm1q11/CRAWL www.nyc-officespace-leader.com (1).csv?dl=0
Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
How to increase Page authority of old blog posts
Hi, how can I increase the page authority of old blog posts? There are many posts that are ranking well (page 1, lower positions, or page 2), but I want to make them rank higher by making the posts more usable: better UI, design, content relaunches, etc. All of these would inherently mean improving page authority eventually. What are some concrete steps I can take to improve the page authority of blog pages?
Intermediate & Advanced SEO | pks333
-
Productontology URLs are 404 erroring, are there alternatives to denote new schema categories?
Our team QA specialist recently noticed that the class identifier URLs via productontology are 404ing, saying that "There is no Wikipedia article for (particular property)". They are even 404ing for productontology URLs that are used as examples on the productontology.org website! Example: http://www.productontology.org/id/Apple. The 404 page says that the wiki entry for "Apple" doesn't exist (lol). Does anybody know what is going on with this website? This service was extremely helpful for creating additionalType categories for schema categories that don't exist on schema.org. Are there any alternatives to productontology now that these class identifier URLs are 404ing? Thanks
Intermediate & Advanced SEO | RosemaryB
-
Should You Use 301 Redirects When Switching To A Secure SSL Server?
Hi, our client has switched from a non-secure server to a secure (SSL) server, but the non-secure pages still exist, i.e.:
http://www.stainlesshandrailsystems.co.uk/balustrade-systems.html (non-secure)
https://www.stainlesshandrailsystems.co.uk/balustrade-systems.html (secure)
We assumed that we should 301 redirect the HTTP pages to the new HTTPS pages using the following .htaccess rule:
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.yoursite.com/$1 [R,L]
HOWEVER! Both of the above pages show the same Page Authority (PA) and PageRank (PR). Does this mean that they are being seen as the same page? Do we really need to employ 301 redirects? Many thanks in advance, much appreciated. 🙂 Lee
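Worth noting: in mod_rewrite, a bare [R] flag issues a 302 temporary redirect by default, so [R=301,L] is needed for a permanent redirect. A minimal sketch to check what the old HTTP URL actually returns (assuming the third-party requests library):

```python
# Minimal sketch: inspect the raw redirect the server sends for the old
# HTTP URL. Assumes the third-party `requests` library.
import requests

url = "http://www.stainlesshandrailsystems.co.uk/balustrade-systems.html"
resp = requests.head(url, allow_redirects=False)  # don't follow the redirect

print(resp.status_code)               # 301 = permanent, 302 = temporary
print(resp.headers.get("Location"))   # where the redirect points, if anywhere
```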
Intermediate & Advanced SEO | Webpresence
-
Do you add the 404 page to the robots file or just add a noindex tag?
Hi, I've had different opinions on this, so I wanted to double-check what your comments are. We've got a /404.html page, and I was wondering if you would add this page to the robots text file so it wouldn't be indexed, or would you just add a noindex tag? What would be the best approach? Thanks!
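One thing worth remembering: a page blocked by robots.txt cannot be crawled, so Google will never see a noindex tag on it. A minimal sketch to check how a /404.html page is currently handled (assuming the third-party requests library; the domain is a placeholder):

```python
# Minimal sketch: is /404.html blocked by robots.txt, and does it send any
# noindex signal? Assumes `requests`; the domain is a placeholder.
import urllib.robotparser
import requests

SITE = "https://www.example.com"  # placeholder domain

rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
print("Blocked by robots.txt:", not rp.can_fetch("Googlebot", SITE + "/404.html"))

resp = requests.get(SITE + "/404.html")
print("HTTP status:", resp.status_code)                        # ideally 404
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
print("Meta noindex in HTML:", "noindex" in resp.text.lower())
```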
Intermediate & Advanced SEO | Rubix
-
WordPress error
In Google Webmaster Tools I'm getting a Severe Health Warning regarding our robots.txt file, which reads:
User-agent: *
Crawl-delay: 20
User-agent: 008
Disallow: /
I'm wondering how I can fix this and stop it happening again. The site was hacked about 4 months ago, but I thought we'd managed to clear things up. Colin
Intermediate & Advanced SEO | NileCruises
-
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | Interesting.com