Correct way to block search bots momentarily... HTTP 503?
-
Hi,
What is the best way to block Googlebot etc. momentarily? For example, if I am implementing a programming update to our Magento ecommerce platform and am unsure of the results and potential layout/file changes that may impact SEO (Googlebot continuously spiders our site).
How can you block the bots for like 30 mins or so?
Thanks
-
You can do that, but it is less specific about what you are actually doing with your server. The 503 with a Retry-After header lets the spiders know exactly what is happening (no confusion). Thank you for the clever remark below.
-
Disregard mine; Clever was more... clever, and beat me to it as well.
-
Just disallow the root domain in your robots.txt file, and when you're ready to let the bots back in, edit the file back to normal.
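For what it's worth, a minimal robots.txt that blocks all compliant crawlers would look like this (note that Google only re-fetches robots.txt periodically, so the block is not instant, and a lingering Disallow can cause pages to drop from the index):

```
User-agent: *
Disallow: /
```

Restoring the original robots.txt re-opens the site, but unlike a 503 it gives crawlers no signal that the outage was temporary.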
-
See the response here
http://moz.com/community/q/temporarily-shut-down-a-site
In short, the 503 is correct; you want to include an HTTP header with a Retry-After value so the crawler knows when to come back. It is also key to set this up on your robots.txt file, as Google will key off the status of that file: once it sees that robots.txt returns a 503, it will wait until robots.txt returns a 200 again before it starts crawling the entire site. Note that you still need to serve the 503 on all pages, regardless.
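As a minimal sketch of the idea (hypothetical, framework-free WSGI code; in practice a Magento site would usually do this at the web-server or load-balancer level), every request, including /robots.txt, gets a 503 with a Retry-After header:

```python
def maintenance_app(environ, start_response):
    """Answer every request with 503 + Retry-After during the update window."""
    retry_seconds = "1800"  # tell crawlers to come back in ~30 minutes
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/plain"),
            ("Retry-After", retry_seconds),  # may also be an HTTP-date
        ],
    )
    return [b"Down for maintenance. Please retry later."]
```

Once the update is done, you simply swap the normal application back in; Googlebot sees the 200 on robots.txt and resumes crawling.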
Another option (that we use a lot on our larger sites) is mirrored sites behind a load balancer. We tell the load balancer to send traffic to www1 and www2 while we work on www3 and www4. When www3 and www4 are updated, we switch the load balancer over to them and work on www1 and www2, then put www1 and www2 back into the mix. It makes the process seamless for users and for Google.
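As an illustration only (hypothetical hostnames, nginx chosen as an example load balancer), the rotation described above amounts to commenting servers in and out of the upstream pool:

```
upstream app_pool {
    server www1.example.com;
    server www2.example.com;
    # server www3.example.com;  # pulled from rotation while being updated
    # server www4.example.com;  # pulled from rotation while being updated
}
```

After updating www3/www4, you flip the comments and reload the balancer config; visitors and crawlers never see downtime.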
Cheers