What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
-
Now that Google considers subdomains part of the TLD, I'm a little leery of testing robots.txt on the staging subdomain with something like:

staging.domain.com/robots.txt:

User-agent: *
Disallow: /

in fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
-
Just make sure that when/if you copy over the staging site to the live domain that you don't copy over the robots.txt, htaccess, or whatever means you use to block that site from being indexed and thus have your shiny new site be blocked.
-
I agree. The name of your subdomain being "staging" didn't register at all with me until Matt brought it up. I was offering a generic response to the subdomain question whereas I believe Matt focused on how to handle a staging site. Interesting viewpoint.
-
Matt/Ryan-
Great discussion, thanks for the input. staging.domain.com is just one of the subdomains we don't want indexed. Some of them still need to be accessible to the public; others, like staging, could be restricted to specific IPs.
I realize after your discussion I probably should have used a different example of a sub-domain. On the other hand, it might not have sparked the discussion, so maybe it was a good example.
-
.htaccess files can be placed at any directory level of a site, so you can do it for just the subdomain or even just a directory of a domain.
-
Staging URLs are typically only used for testing, so rather than deny specific crawler IPs I would recommend a specific allow for only the IP addresses that should have access.
I would imagine you don't want it indexed because you don't want the rest of the world knowing about it.
You can also use .htaccess to require a username/password. It is simple, but you can give the credentials to clients if that is a concern/need.
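A minimal sketch of what that .htaccess could look like, assuming Apache 2.4+ (the `Require` directives replace the older `Order`/`Allow` syntax); the IP range and the .htpasswd path are placeholders:

```apache
# Allow only specific IPs, or a valid username/password, to reach the staging site.
# Assumes Apache 2.4+; the IP range and .htpasswd path below are illustrative.
AuthType Basic
AuthName "Staging"
AuthUserFile /var/www/.htpasswd

<RequireAny>
    Require ip 203.0.113.0/24
    Require valid-user
</RequireAny>
```

With `RequireAny`, visitors from the listed range get straight in, while everyone else (including crawlers) hits the password prompt.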
-
Correct.
-
Toren, I would not recommend that solution. There is nothing to prevent Googlebot from crawling your site via almost any IP. If you found 100 IPs used by the crawler and blocked them all, there is nothing to stop the crawler from using IP #101 next month. Once the subdomain's content is located and indexed, it will be a headache fixing the issue.
The best solution is always going to be a noindex meta tag on the pages you do not wish to be indexed. If that method is too much work or otherwise undesirable, you can use the robots.txt solution. There is no circumstance I can imagine where you would modify your .htaccess file to block Googlebot.
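For reference, the noindex approach doesn't have to mean editing every page: the equivalent X-Robots-Tag response header can be set once for the whole staging host. A sketch, assuming Apache with mod_headers enabled:

```apache
# Equivalent to adding <meta name="robots" content="noindex, nofollow"> to each page:
# send an X-Robots-Tag header on every response from the staging vhost.
# Sketch only; place in the staging site's vhost config or .htaccess.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```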
-
Hi Matt.
Perhaps I misunderstood the question, but I believe Toren only wishes to prevent the subdomain from being indexed. If you restrict subdomain access by IP, it would prevent visitors from accessing the content, which I don't believe is the goal.
-
Interesting, I hadn't thought of using .htaccess to block Googlebot. Thanks for the suggestion.
-
Thanks Ryan. So you don't see any issues with de-indexing the main site if I created a second robots.txt file, e.g. at http://staging.domain.com/robots.txt:

User-agent: *
Disallow: /

That was my initial thought, but when Google announced they consider sub-domains part of the TLD I was afraid it might affect the http://www.domain.com versions of the pages. So you're saying the subdomain is basically treated like a folder you block on the primary domain?
-
Use an .htaccess file to only allow access from certain IP addresses or ranges.
Here is an article describing how: http://www.kirupa.com/html5/htaccess_tricks.htm
-
Place a robots.txt file in the root of the subdomain:

User-agent: *
Disallow: /

This method will block the subdomain while leaving your primary domain unaffected.
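That behavior can be sanity-checked with Python's standard-library robots.txt parser; the rules below mirror the answer above, and the domain names are illustrative:

```python
# Sketch: confirm that a subdomain-level "Disallow: /" blocks every path on the
# subdomain without touching the primary domain, which serves its own robots.txt.
from urllib.robotparser import RobotFileParser

# The staging subdomain's robots.txt, as suggested above.
staging_rules = RobotFileParser()
staging_rules.parse([
    "User-agent: *",
    "Disallow: /",
])
print(staging_rules.can_fetch("Googlebot", "http://staging.domain.com/any/page"))  # False

# The primary domain's robots.txt is a separate file and stays permissive.
main_rules = RobotFileParser()
main_rules.parse([
    "User-agent: *",
    "Disallow:",
])
print(main_rules.can_fetch("Googlebot", "http://www.domain.com/any/page"))  # True
```

Each host serves its own robots.txt, which is why blocking the subdomain's copy cannot de-index the www version.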