Correct way to block search bots momentarily... HTTP 503?
-
Hi,
What is the best way to block Googlebot etc. momentarily? For example, when I am deploying a programming update to our Magento e-commerce platform and am unsure of the results, potential layout/file changes could impact SEO (Googlebot continuously spiders our site).
How can you block the bots for 30 minutes or so?
Thanks
-
You can do that with robots.txt, but it is less specific about what you are actually doing with your server. The 503 plus a Retry-After header lets the spiders know exactly what is going on (no confusion). Thank you for the clever remark below.
-
Disregard mine; Clever was more... clever, and beat me to it as well.
-
Just disallow the root domain in your robots.txt file, and when you're ready to let the bots back in, edit the file back to normal.
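For reference, a blanket disallow in robots.txt looks like this (it applies to all compliant crawlers; note that crawlers may take a while to re-fetch the file, so the change is not instant):

```
User-agent: *
Disallow: /
```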
-
See the response here
http://moz.com/community/q/temporarily-shut-down-a-site
In short, the 503 is correct; you want to include an HTTP Retry-After header so the crawler knows when to come back. It is also key to set this up for your robots.txt file, as Google keys off the status of that file: once it sees that robots.txt returns a 503, it will wait until robots.txt returns a 200 again before it resumes crawling the entire site. Note that you still need to serve the 503 on all pages regardless.
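A minimal sketch of that behavior, as an illustrative Python stand-in for what your real server config (Apache/nginx rules or a Magento maintenance flag) would do: every path, including /robots.txt, answers 503 with a Retry-After header. The 30-minute value is just an example.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 1800  # ask crawlers to come back in 30 minutes

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answers every request, including /robots.txt, with a 503."""

    def do_GET(self):
        body = b"Temporarily down for maintenance.\n"
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet
```

With a server running this handler, fetching any URL (robots.txt included) returns the 503 status and the Retry-After header, which is exactly the signal described above.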
Another option (one we use a lot on our larger sites) is mirrored sites behind a load balancer. We tell the load balancer to send traffic to www1 and www2 while we work on www3 and www4. Once www3 and www4 are updated, we switch the load balancer over to them and work on www1 and www2; when those are done, we put them back into the mix. That makes the update seamless for users and for Google.
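A hypothetical sketch of that rotation in nginx upstream terms (hostnames are placeholders; the actual mechanism depends on your load balancer): marking a backend `down` takes it out of rotation while you update it, then you swap the markers.

```nginx
upstream magento_pool {
    server www1.example.com;
    server www2.example.com;
    server www3.example.com down;   # out of rotation while being updated
    server www4.example.com down;
}

server {
    listen 80;
    location / {
        proxy_pass http://magento_pool;
    }
}
```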
Cheers