Huge increase in server errors and robots.txt
-
Hi Moz community!
Wondering if someone can help? One of my clients (an online fashion retailer) has been seeing a huge increase in server errors (500s and 503s) over the last six weeks, and it has got to the point where people cannot access the site because of server errors.
The client has recently changed hosting companies to deal with this, and they have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again.
These errors also correlate with a huge decrease in pages blocked by the robots.txt file, which makes me think someone has perhaps changed it and not told anyone...
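In case it helps with diagnosis, this is the kind of quick check I'm planning to run myself - it just pulls the live robots.txt and spot-checks a few key templates for 5xx responses (a rough sketch; the domain and paths below are placeholders, not the client's real URLs):

```python
import urllib.request
from urllib.error import HTTPError, URLError

SITE = "https://www.example-fashion-retailer.com"  # placeholder, not the client's real domain

def status_of(url):
    """Return the HTTP status code for a URL, even when the response is an error."""
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            return resp.status
    except HTTPError as err:
        return err.code                      # 4xx/5xx responses still carry a status code
    except URLError as err:
        return f"no response ({err.reason})"

# 1. What does the live robots.txt actually say right now?
try:
    with urllib.request.urlopen(SITE + "/robots.txt", timeout=15) as resp:
        print(resp.read().decode("utf-8", errors="replace"))
except (HTTPError, URLError) as exc:
    print("Could not fetch robots.txt:", exc)

# 2. Are the key templates currently answering with 500/503?
for path in ["/", "/collections/new-in", "/products/sample-item"]:  # placeholder paths
    print(path, "->", status_of(SITE + path))
```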
Anyone have any ideas here? It would be greatly appreciated! I've been chasing this up with the dev agency and the hosting company for weeks, to no avail.
Massive thanks in advance
-
Thank you EGOL, it all makes perfect sense and I appreciate your reply. I suspect the problems are mostly centered on the hosting issues, with secondary potential robots.txt issues as well.
-
....it has got to the point where people cannot access the site because of server errors.
As soon as I saw this I would go straight to someone who knows a lot more about servers than I do. I would start with the host, and if I got no help from them within a few hours I would get someone who knows about servers to dig into this and be ready to quickly move the website to a new host. If the host does not know how to solve it, and I don't know how to solve it, then it is time for bigger guns and possibly a new host - right away.
....they have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again.
So, the website is now in the hands of a new host. It is likely that the problem will be solved here if the old host was the cause of the problem. Today, DNS propagates quickly. I am having my morning coffee... if I don't see progress by the time I return from lunch, then I am calling a pro.
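If you would rather watch the propagation than wait for it, a minimal sketch like this (it assumes the third-party dnspython package and uses a placeholder domain) will show whether the big public resolvers are already handing out the new host's A record:

```python
import dns.resolver  # third-party: pip install dnspython

DOMAIN = "example-client-site.com"                      # placeholder domain
PUBLIC_RESOLVERS = {"Google": "8.8.8.8", "Cloudflare": "1.1.1.1"}

for label, ip in PUBLIC_RESOLVERS.items():
    resolver = dns.resolver.Resolver(configure=False)   # ignore the local resolver config
    resolver.nameservers = [ip]
    try:
        answer = resolver.resolve(DOMAIN, "A")          # dnspython >= 2.0 (older versions use .query)
        print(label, "->", ", ".join(rdata.to_text() for rdata in answer))
    except Exception as exc:
        print(label, "-> lookup failed:", exc)
```

When both resolvers start returning the new host's IP, propagation has effectively caught up.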
I think that it is a good idea for anyone who has clients or an important website to have a person or a company that they can call straightaway for a quick couple of hours of investigation or advice. Two hours of consulting is cheaper than seeing a business throttled for two days.
Also, I have learned to stay away from hosts who offer unlimited bandwidth and similar claims. When you start to become successful you become unprofitable for them so they either have to limit your resources or confess that their claim of unlimited is an absolute lie.
All of my sites are with hosts who charge me for every bit of resource that I use. The more I use, the more money they make and when I have a problem they are motivated to get it fixed immediately - because when my biz is dragging they are making less money. They want me to make money because our interests are in alignment - not the opposite.
Cheap hosts are just as bad as the unlimited guys. If they have a problem with your website, it is cheaper to let you go and lose the few bucks a month that you are paying them than it is to pay their staff to fix things. (But they will not tell you to go to a new host - they will just allow you to have crap service until you decide to move.) I make sure that the hosts I use have a number of high-profile sites under their care whose owners will not tolerate one minute of BS. These hosts are not cheap, but I am not interested in cheap, I want reliable.
Related Questions
-
Shopify: AggregateRating Schema Error
Hi lovely community, I know Google made some schema changes in Sept 2019, and I got an AggregateRating error: "One of offers or review or aggregateRating should be provided." I am using a third-party app, 'Shopify Product Review', to implement the rating. What should I do to solve this error? I found many people in the community have this issue too. Many thanks, Pui
Intermediate & Advanced SEO | Insightful_Media
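For anyone hitting the same warning: it means the Product markup needs at least one of offers, review, or aggregateRating nested inside it. This is not specific to the Shopify app - just a minimal illustration of the expected shape, built here as a Python dict with placeholder values and printed as JSON-LD:

```python
import json

# Placeholder values; the point is that aggregateRating (or offers, or review)
# has to sit inside the Product node itself.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "27",
    },
}

print(json.dumps(product_markup, indent=2))  # output belongs inside a <script type="application/ld+json"> tag
```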
-
Huge Search Traffic Drop After Switching to HTTPS - No Recovery After Couple of Months
Hi, in November we switched our website (https://www.insidermonkey.com) from HTTP to HTTPS. Initially we noticed a slight search traffic loss, but later discovered it might be due to the HTTPS switch. A month later we added the HTTPS version in Search Console and then saw an immediate huge drop (about 25-30%). We discovered the problem might be due to poor redirection and noticed our redirects were 302s instead of 301s. To fix the problem, we implemented 301 redirects and submitted the sitemap containing links to the old site at the new Search Console property (HTTPS). We've gone through the points listed at https://support.google.com/webmasters/answer/6073543: we fixed the redirects to 301, double-checked the sitemaps, made sure we had a properly installed SSL certificate (we now get an A+ from https://www.ssllabs.com/ssltest/analyze.html?d=www.insidermonkey.com), and made sure we have no mixed-content errors (we don't have any issues in Search Console). We only avoided implementing HSTS, in case we might want to switch back to HTTP. We had a small improvement in the following month, but our traffic did not fully recover. We wanted to test the possibility of switching back to HTTP by switching only 2 articles in our CMS to HTTP. Our traffic got worse, not only for those but for the whole site. Then we switched those 2 articles back to HTTPS and implemented HSTS. It seems our search traffic is getting worse day by day with no sign of improving. In the link below you can find a screenshot of our weekly search traffic between 1 October and 1 March: we are down from 500K weekly visitors to a mere 167K last week. https://drive.google.com/open?id=1Y1TQbj_YtGG4NhLORbEWbvITUkGKUa0G Any ideas or suggestions? We are willing to get professional help as well. What is the way to find a proper consultant with relevant experience for such a problem?
Intermediate & Advanced SEO | etakgoz
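A minimal sketch (it assumes the third-party requests package; the article path is a placeholder) for spot-checking that old HTTP URLs now answer with a single 301 hop to their HTTPS versions:

```python
import requests  # third-party: pip install requests

def show_redirect_chain(url):
    """Print every redirect hop followed on the way to the final URL."""
    resp = requests.get(url, timeout=15)
    for hop in resp.history:                          # redirect responses, in order
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(resp.status_code, resp.url)                 # final destination

show_redirect_chain("http://www.insidermonkey.com/sample-article/")  # placeholder path
```

Ideally each old URL shows exactly one 301 hop straight to its HTTPS version, with no 302s and no chained redirects.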
-
How to Handle a Soft 404 error to an admin page in WordPress
I'm seeing this error in Google Webmaster Console for the URL http://www.awlwildlife.com/wp-admin/admin-ajax.php (first detected: 11/15/16, last crawled: 11/15/16): "The target URL doesn't exist, but your server is not returning a 404 (file not found) error. Your server returns a code other than 404 or 410 for a non-existent page (or redirects users to another page, such as the homepage, instead of returning a 404). This creates a poor experience for searchers and search engines." Any ideas what I should do about it? Thanks!
Intermediate & Advanced SEO | aj613
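One quick way to see what the server is actually returning is to compare the flagged URL with a path that is guaranteed not to exist (a rough sketch, assuming the third-party requests package; the second path is made up purely for testing):

```python
import requests  # third-party: pip install requests

BASE = "http://www.awlwildlife.com"

for path in ["/wp-admin/admin-ajax.php",
             "/this-page-should-not-exist-12345/"]:   # deliberately non-existent test path
    resp = requests.get(BASE + path, allow_redirects=False, timeout=15)
    print(path, "->", resp.status_code)

# A non-existent URL should answer 404 or 410; a 200 (or a redirect to the
# homepage) is what Search Console reports as a "soft 404".
```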
-
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index; there is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
Intermediate & Advanced SEO | andyheath
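Worth keeping in mind that Disallow only stops crawling; it does not by itself remove URLs that are already indexed. A quick standard-library sanity check (placeholder domain and gallery URL) that the proposed rule at least covers the gallery URLs once it is live:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example-shop.com/robots.txt")   # placeholder domain
rp.read()

gallery_url = "https://www.example-shop.com/catalog/product/gallery/12345/"  # hypothetical gallery URL
print("May Googlebot fetch the gallery URL?", rp.can_fetch("Googlebot", gallery_url))
```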
-
Server responds with 302 but the page doesn't appear to redirect?
I'm working on a site and am running some basic audits, including a campaign within Moz. When I put the domain into any of these tools, including response header checkers, the response is a 302 that says there is a redirect to an error page. However, the page itself doesn't redirect and resolves fine in the browser, yet none of the audit tools can seem to get any information from any of the pages. What is the best way to troubleshoot what is going on here? Thanks.
Intermediate & Advanced SEO | jim_shook
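A response that differs between audit tools and a normal browser often comes down to how the server treats the client, so one way to narrow it down is to compare the raw response for a default client against a browser-like User-Agent (a sketch assuming the third-party requests package; the URL is a placeholder for the audited domain):

```python
import requests  # third-party: pip install requests

URL = "https://www.example.com/"    # placeholder for the audited domain

clients = {
    "default client": {},
    "browser-like": {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
}

for label, headers in clients.items():
    resp = requests.get(URL, headers=headers, allow_redirects=False, timeout=15)
    print(label, "->", resp.status_code, resp.headers.get("Location"))
```

If only the default client gets the 302, the server (or a firewall/CDN rule) is treating non-browser user agents differently, which is where to look next.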
-
Google Webmaster smartphone errors fix
I have certain URLs that I fixed before in Google Webmaster Tools. With the addition of the smartphone crawl error reports, they have started appearing again. How can I fix the Google Webmaster Tools errors for smartphones?
Intermediate & Advanced SEO | csfarnsworth
-
Robots.txt: Can you put a /* wildcard in the middle of a URL?
We have noticed that Google is indexing the language/country directory versions of directories we have disallowed in our robots.txt. For example, Disallow: /images/ is blocked just fine; however, once you add our /en/uk/ directory in front of it, there are dozens of pages indexed. The question is: can I put a wildcard in the middle of the string, e.g. /en/*/images/, or do I need to list out every single country for every language in the robots file? Anyone know of any workarounds?
Intermediate & Advanced SEO | IHSwebsite
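Google's robots.txt processing does support * anywhere in a rule, so a pattern like /en/*/images/ is valid for Googlebot; the extra rule is needed because plain rules only match from the beginning of the path, which is why Disallow: /images/ never touches /en/uk/images/. A rough, simplified approximation of that matching behaviour (not the official implementation):

```python
import re

def rule_matches(rule, path):
    """Approximate robots.txt matching: anchored at the start of the path, with *
    matching any run of characters and a trailing $ acting as an end anchor."""
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

print(rule_matches("/images/", "/en/uk/images/photo.jpg"))       # False: rules only match from the start of the path
print(rule_matches("/en/*/images/", "/en/uk/images/photo.jpg"))  # True: the wildcard covers the country segment
```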
-
Can the template increase the loading time of the site?
Hi, my site was built with WordPress. Very recently I had it redesigned, and the problem is that it now takes a long time to load. I have spoken with a web designer who checked my site and said that after it was rebuilt, the template that was created included a lot of hard coding. Can this be the reason why my site now takes a long time to load - the hard-coding factor? Thank you for your help. Sal P.S.: FYI, the site only has a few plug-ins and the server is a good one.
Intermediate & Advanced SEO | salvyy
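Heavy template code usually shows up in front-end measurements rather than in the raw HTML response, so a crude first split is to time how quickly the server hands back the HTML and how large it is (a sketch assuming the third-party requests package and a placeholder URL); anything beyond that is better measured with a front-end performance audit:

```python
import requests  # third-party: pip install requests

resp = requests.get("https://www.example-wordpress-site.com/", timeout=30)  # placeholder URL

print("Status:", resp.status_code)
print("Server response time (roughly time to first byte):",
      round(resp.elapsed.total_seconds(), 2), "s")
print("HTML size:", round(len(resp.content) / 1024), "KB")
```

If the HTML itself comes back quickly, the slowdown is more likely in the front-end assets the new template loads.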