What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
-
Now that Google considers subdomains to be part of the root domain, I'm a little leery of testing robots.txt on the subdomain with something like:

staging.domain.com/robots.txt
User-agent: *
Disallow: /

for fear it might get the www.domain.com pages blocked as well. Has anyone had any success using robots.txt to block subdomains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
-
Just make sure that when/if you copy the staging site over to the live domain, you don't also copy the robots.txt, .htaccess, or whatever other means you used to block that site from being indexed; otherwise your shiny new site will be blocked as well.
-
I agree. The name of your subdomain being "staging" didn't register at all with me until Matt brought it up. I was offering a generic response to the subdomain question whereas I believe Matt focused on how to handle a staging site. Interesting viewpoint.
-
Matt/Ryan-
Great discussion, and thanks for the input. staging.domain.com is just one of the subdomains we don't want indexed. Some of them still need to be accessible to the public; others, like staging, could be restricted to specific IPs.
After your discussion I realize I probably should have used a different example of a subdomain. On the other hand, it might not have sparked the discussion, so maybe it was a good example.
-
.htaccess files can be placed at any directory level of a site, so you can apply this to just the subdomain or even to a single directory of a domain.
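For example, a one-line .htaccess at the top of the subdomain's document root can tell crawlers not to index anything served from there (a sketch only, assuming Apache with mod_headers enabled):

# Hypothetical .htaccess in the staging subdomain's document root.
# Adds a noindex header to every response served from this directory down.
Header set X-Robots-Tag "noindex, nofollow"

Because the rules only apply from that directory downward, the main www site is unaffected.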
-
Staging URLs are typically only used for testing, so rather than a deny rule I would recommend a specific allow for only the IP addresses that should have access.
I would imagine you don't want it indexed because you don't want the rest of the world knowing about it.
You can also use .htaccess to require a username/password. It's simple, and you can give the credentials to clients if that is a concern or need.
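As a rough sketch of the password option (Apache assumed; the path and realm name below are placeholders):

# Hypothetical .htaccess in the staging document root requiring a login
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user

The credentials file itself can be created with Apache's htpasswd utility (for example, htpasswd -c /path/to/.htpasswd clientname), and the username/password handed to clients as needed.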
-
Correct.
-
Toren, I would not recommend that solution. There is nothing to prevent Googlebot from crawling your site via almost any IP. If you found 100 IPs used by the crawler and blocked them all, there is nothing to stop the crawler from using IP #101 next month. Once the subdomain's content is located and indexed, it will be a headache fixing the issue.
The best solution is always going to be a noindex meta tag on the pages you do not wish to be indexed. If that method is too much work or otherwise undesirable, you can use the robots.txt solution. There is no circumstance I can imagine where you would modify your .htaccess file to block Googlebot.
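For reference, the tag Ryan describes is a single line in the head of each page you want kept out of the index:

<head>
  <meta name="robots" content="noindex">
</head>

Note that crawlers have to be able to fetch the page to see the tag, so don't combine it with a robots.txt Disallow for the same URLs.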
-
Hi Matt.
Perhaps I misunderstood the question, but I believe Toren only wishes to prevent the subdomain from being indexed. If you restrict subdomain access by IP, it would prevent visitors from accessing the content, which I don't believe is the goal.
-
Interesting, I hadn't thought of using .htaccess to block Googlebot. Thanks for the suggestion.
-
Thanks Ryan. So you don't see any issue with the main site getting de-indexed if I created a second robots.txt file, e.g. at

http://staging.domain.com/robots.txt
User-agent: *
Disallow: /

That was my initial thought, but when Google announced they consider subdomains part of the root domain I was afraid it might affect the http://www.domain.com versions of the pages. So you're saying the subdomain is basically treated like a folder that you block on the primary domain?
-
Use an .htaccess file to allow access only from certain IP addresses or ranges.
Here is an article describing how: http://www.kirupa.com/html5/htaccess_tricks.htm
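A minimal sketch of that kind of allow-list (Apache 2.4 syntax; the addresses are placeholders):

# Hypothetical .htaccess allowing only specific IPs and ranges
Require ip 203.0.113.10
Require ip 198.51.100.0/24

# On older Apache 2.2 the equivalent would be:
# Order deny,allow
# Deny from all
# Allow from 203.0.113.10 198.51.100.0/24

Anyone outside the listed addresses, crawlers included, gets a 403.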
-
Place a robots.txt file in the root of the subdomain.
User-agent: *
Disallow: /

This method will block the subdomain while leaving your primary domain unaffected.
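If you want to verify the setup, a quick check is to request robots.txt from each hostname and confirm only the subdomain returns the blocking rule (the hostnames below are the examples used in this thread):

curl http://staging.domain.com/robots.txt
curl http://www.domain.com/robots.txt

The staging response should contain Disallow: /; the www response should not.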