Correct way to block search bots momentarily... HTTP 503?
-
Hi,
What is the best way to block Googlebot etc. momentarily? For example, when I am implementing a programming update to our Magento ecommerce platform and am unsure of the results and the potential layout/file changes that may impact SEO (Googlebot continuously spiders our site).
How can you block the bots for like 30 mins or so?
Thanks
-
You can do that, but robots.txt is less specific about what is actually happening on your server. The 503 with a Retry-After header lets the spiders know exactly what you are doing (no confusion). Thanks for the clever remark below.
-
Disregard mine; Clever was more... clever... and beat me to it as well.
-
Just disallow the root domain in your robots.txt file, and when you're ready to let the bots back in, edit the file back to normal.
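For reference, that whole-site block is just two lines in robots.txt (a sketch; remember to revert it once the update is live):

```
# Temporary whole-site block; revert after the maintenance window
User-agent: *
Disallow: /
```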
-
See the response here:
http://moz.com/community/q/temporarily-shut-down-a-site
In short, the 503 is correct: you want to include a Retry-After HTTP header so the crawler knows when to come back. It's also key to set this up for your robots.txt file, as Google keys off the status of that file. Once it sees that robots.txt returns a 503, it will wait until robots.txt returns a 200 again before it resumes crawling the entire site. Note that you still need to return the 503 on all pages, regardless.
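For anyone who wants to see that in practice, here is a minimal sketch for a typical Apache/PHP stack like Magento's. The filename, rewrite rule, and IP below are illustrative placeholders, not Magento's built-in maintenance mechanism:

```php
<?php
// maintenance.php - serve every request with a 503 plus Retry-After.
// Route all traffic here during the update, e.g. with a rewrite rule like:
//   RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$   # let your own IP through
//   RewriteRule ^ /maintenance.php [L]
// Note this covers /robots.txt too, so Google sees the 503 there as well.

header('HTTP/1.1 503 Service Unavailable');
header('Retry-After: 1800'); // seconds; roughly the 30-minute window asked about
header('Content-Type: text/html; charset=utf-8');

echo '<h1>Down for maintenance</h1><p>We should be back within 30 minutes.</p>';
```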
Another option (one we use a lot on our larger sites) is to run mirrored servers behind a load balancer. We tell the load balancer to send traffic to www1/www2 while we work on www3/www4; when www3/www4 are updated, we switch the load balancer over to them and work on www1/www2, and once www1/www2 are done we put them back into the mix. This makes the whole process seamless for users and for Google.
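As a rough sketch of that rotation (assuming nginx as the load balancer; the wwwN hostnames are hypothetical):

```nginx
# Phase 1: traffic goes to www1/www2 while www3/www4 are being updated.
upstream magento_pool {
    server www1.example.com;
    server www2.example.com;
    server www3.example.com down;  # temporarily out of rotation
    server www4.example.com down;
}
# Phase 2: swap which pair is marked "down" and reload the config
# (nginx -s reload) to flip traffic to the freshly updated servers.
```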
Cheers