Huge increase in server errors and robots.txt
-
Hi Moz community!
Wondering if someone can help? One of my clients (an online fashion retailer) has seen a huge increase in server errors (500s and 503s) over the last six weeks, and it has got to the point where people cannot access the site because of them.
The client has recently changed hosting companies to deal with this. They have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again.
These errors also correlate with a huge decrease in the number of pages blocked by the robots.txt file, which makes me think someone has perhaps changed the file and not told anyone...
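For reference, this is the kind of quick check I've been running to see what the live robots.txt currently blocks, so I can compare it against what the file is supposed to contain. A rough sketch in Python; the domain is a placeholder, so swap in the client's site:

```python
from urllib.request import urlopen

def disallowed_paths(robots_txt):
    """Pull the Disallow paths out of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            paths.append(line.split(":", 1)[1].strip())
    return paths

def fetch_robots(domain):
    """Download the live robots.txt for a domain (placeholder below)."""
    with urlopen("https://%s/robots.txt" % domain, timeout=10) as resp:
        return resp.read().decode("utf-8", "replace")

# Usage against the live site:
#   print(disallowed_paths(fetch_robots("www.example.com")))
```

If the list of Disallow rules has shrunk compared with an older saved copy of the file, that would explain the drop in blocked pages.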
Anyone have any ideas here? It would be greatly appreciated! I've been chasing this up with the dev agency and the hosting company for weeks, to no avail.
Massive thanks in advance
-
Thank you EGOL, it all makes perfect sense and I appreciate your reply. I suspect the problems are mostly centered on the hosting issues, with potential secondary robots.txt issues as well.
-
....it has got to the point where people cannot access the site because of server errors.
As soon as I would see this I would go straight to someone who knows a lot more about servers than I do. I would start with the host, and if I get no help from them within a few hours, I would get someone who knows about servers to dig into this and be ready to quickly move the website to a new host. If the host does not know how to solve it, and I don't know how to solve it, then it is time for bigger guns and possibly a new host - right away.
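If you want a dirt-simple way to keep an eye on this yourself while you chase the host, something like the sketch below works. The URL is a placeholder, and the 5xx test is just the standard status-code range - nothing site-specific:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def is_server_error(status):
    """5xx codes mean the fault is on the server, not the visitor."""
    return 500 <= status <= 599

def check(url):
    """Return the HTTP status for a URL (urlopen raises on 4xx/5xx)."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.getcode()
    except HTTPError as err:
        return err.code  # the status code rides along on the exception

# Usage: run this every few minutes and log the result, e.g.
#   status = check("https://www.example.com/")
#   print(status, "SERVER ERROR" if is_server_error(status) else "ok")
```

A log of timestamps and status codes gives you something concrete to wave at the host instead of "the site keeps going down."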
....they have just told us they removed the DNS records once the name servers were changed, and they have now fixed this and are waiting for the name servers to propagate again.
So, the website is now in the hands of a new host. It is likely that the problem will be solved here if the old host was the cause of the problem. Today, DNS propagates quickly. If the change goes in while I am having my morning coffee and I don't see progress by the time I return from lunch, then I am calling a pro.
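If you don't want to keep refreshing a propagation-checker site, a tiny script like this tells you when your own resolver has picked up the change. The domain and IP here are placeholders - use the address the new host gave you:

```python
import socket

def points_at(domain, expected_ip):
    """True once this machine's resolver returns the new host's address."""
    return socket.gethostbyname(domain) == expected_ip

# Usage (placeholder domain and IP):
#   if points_at("www.example.com", "203.0.113.10"):
#       print("propagated, at least for this resolver")
#   else:
#       print("still resolving to the old host - keep waiting")
```

Note this only tells you what your resolver sees; other networks may lag behind for a while.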
I think that it is a good idea for anyone who has clients or an important website to have a person or a company that they can call straightaway for a quick couple of hours of investigation or advice. Two hours of consulting is cheaper than seeing a business throttled for two days.
Also, I have learned to stay away from hosts who offer unlimited bandwidth and similar claims. When you start to become successful you become unprofitable for them so they either have to limit your resources or confess that their claim of unlimited is an absolute lie.
All of my sites are with hosts who charge me for every bit of resource that I use. The more I use, the more money they make and when I have a problem they are motivated to get it fixed immediately - because when my biz is dragging they are making less money. They want me to make money because our interests are in alignment - not the opposite.
Cheap hosts are just as bad as the unlimited guys. If they have a problem with your website, it is cheaper to let you go and lose the few bucks a month that you are paying them than it is to pay their staff to fix things. (But they will not tell you to go to a new host - they will just allow you to have crap service until you decide to move.) I make sure that the hosts I use have a number of high-profile sites under their care who will not tolerate one minute of BS. These hosts are not cheap, but I am not interested in cheap, I want reliable.