What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
-
Now that Google considers subdomains part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block subdomains? I know I could add a robots meta tag to the staging.domain.com pages, but that would require a lot more work.
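For reference, crawlers request robots.txt separately from each hostname, so the two hosts can serve entirely different files. A sketch of one way to keep them separate, assuming an Apache virtual-host setup (document roots and paths are placeholders, not from the thread):

```apache
# Each hostname gets its own document root, and therefore its own robots.txt.
<VirtualHost *:80>
    ServerName staging.domain.com
    DocumentRoot /var/www/staging
    # /var/www/staging/robots.txt contains:
    #   User-agent: *
    #   Disallow: /
</VirtualHost>

<VirtualHost *:80>
    ServerName www.domain.com
    DocumentRoot /var/www/live
    # /var/www/live/robots.txt is the normal, permissive file.
</VirtualHost>
```

With this layout, a blanket Disallow in the staging file cannot leak into the file that www.domain.com serves.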
-
Just make sure that when/if you copy the staging site over to the live domain, you don't copy over the robots.txt, .htaccess, or whatever other means you use to block that site from being indexed, and thus have your shiny new site be blocked.
-
I agree. The name of your subdomain being "staging" didn't register at all with me until Matt brought it up. I was offering a generic response to the subdomain question whereas I believe Matt focused on how to handle a staging site. Interesting viewpoint.
-
Matt/Ryan-
Great discussion, thanks for the input. The staging.domain.com is just one of the domains we don't want indexed. Some of them still need to be accessed by the public, some like staging could be restricted to specific IPs.
I realize after your discussion I probably should have used a different example of a subdomain. On the other hand, it might not have sparked the discussion, so maybe it was a good example.
-
.htaccess files can be placed at any directory level of a site so you can do it for just the subdomain or even just a directory of a domain.
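As a sketch of the directory-level case, assuming an Apache server (the path in the comment is a placeholder):

```apache
# Hypothetical .htaccess dropped in the directory to protect, e.g.
# /var/www/staging/.htaccess - it applies to that directory and below only.
# Apache 2.4 syntax:
Require all denied

# Apache 2.2 equivalent:
#   Order deny,allow
#   Deny from all
```

Note that .htaccess rules only take effect if the server's AllowOverride setting permits them.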
-
Staging URLs are typically used only for testing, so rather than a deny rule I would recommend a specific allow for only the IP addresses that should have access.
I would imagine you don't want it indexed because you don't want the rest of the world knowing about it.
You can also use .htaccess to require a username/password. It is simple, and you can give the credentials to clients if that is a concern/need.
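A minimal sketch of the username/password approach, using standard HTTP Basic Auth in .htaccess (the file paths are assumptions):

```apache
# Create the password file once, outside the web root, with:
#   htpasswd -c /etc/apache2/.htpasswd clientname
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

This has the side benefit that crawlers receive a 401 for every URL, so nothing behind it gets indexed.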
-
Correct.
-
Toren, I would not recommend that solution. There is nothing to prevent Googlebot from crawling your site via almost any IP. If you found 100 IPs used by the crawler and blocked them all, there is nothing to stop the crawler from using IP #101 next month. Once the subdomain's content is located and indexed, it will be a headache fixing the issue.
The best solution is always going to be a noindex meta tag on the pages you do not wish to be indexed. If that method is too much work or otherwise undesirable, you can use the robots.txt solution. There is no circumstance I can imagine where you would modify your htaccess file to block googlebot.
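If adding the meta tag to every page is what makes that method too much work, the same noindex directive can be sent as an HTTP response header for the whole staging host instead. A sketch, assuming Apache with mod_headers enabled, placed in the staging docroot's .htaccess:

```apache
# Hypothetical .htaccess for the staging docroot: every response carries the
# same directive the per-page meta tag would, with no template edits.
# Requires mod_headers.
Header set X-Robots-Tag "noindex, nofollow"
```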
-
Hi Matt.
Perhaps I misunderstood the question but I believe Toren only wishes to prevent the subdomain from being indexed. If you restrict subdomain access by IP it would prevent visitors from accessing the content which I don't believe is the goal.
-
Interesting, I hadn't thought of using .htaccess to block Googlebot. Thanks for the suggestion.
-
Thanks Ryan. So you don't see any issues with de-indexing the main site if I created a second robots.txt file at, e.g., http://staging.domain.com/robots.txt containing:

User-agent: *
Disallow: /

That was my initial thought, but when Google announced they consider subdomains part of the TLD I was afraid it might affect the http://www.domain.com versions of the pages. So you're saying the subdomain is basically treated like a folder you block on the primary domain?
-
Use an .htaccess file to allow access only from certain IP addresses or ranges.
Here is an article describing how: http://www.kirupa.com/html5/htaccess_tricks.htm
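A sketch of such an allow-list in .htaccess (the IP range below is a documentation placeholder; substitute your own addresses):

```apache
# Only the listed range gets in; everyone else, including crawlers, is denied.
# Apache 2.4:
Require ip 203.0.113.0/24

# Apache 2.2 equivalent:
#   Order deny,allow
#   Deny from all
#   Allow from 203.0.113.0/24
```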
-
Place a robots.txt file in the root of the subdomain.
User-agent: *
Disallow: /

This method will block the subdomain while leaving your primary domain unaffected.
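The key point is that robots.txt rules are scoped to the host that serves the file. A small sketch with Python's standard-library parser illustrates this: the staging file's rules are evaluated only against staging URLs, while www.domain.com would be checked against its own, separately fetched file.

```python
from urllib import robotparser

# The blanket-disallow robots.txt served by the staging subdomain.
staging_robots = "User-agent: *\nDisallow: /\n"

rp = robotparser.RobotFileParser()
rp.parse(staging_robots.splitlines())

# Any URL on the staging host is blocked by these rules.
print(rp.can_fetch("*", "http://staging.domain.com/private-page"))  # False
```

Google fetches http://www.domain.com/robots.txt independently, so the blanket Disallow on the staging host never touches the primary domain.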