Block a sub-domain from being indexed
-
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from being indexed by all search engines?
One thing I can't use is the meta "nofollow" tag.
Thanks! - Kyle
-
Keep in mind that Google indexes everything it can crawl. Even if you put a block in robots.txt, they will probably still crawl it. You can require a password on that subdomain and keep the big G out. This is easy to do if you have a site with cPanel access: just go to manage permissions and password-protect that directory with an .htaccess password.
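If you don't have cPanel, the same protection can be set up by hand. A minimal sketch, assuming Apache, where the file paths and the "kyle" username are placeholders:

# Run once from a shell to create the password file:
#   htpasswd -c /home/youruser/.htpasswd kyle

# .htaccess in the subdomain's document root:
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/youruser/.htpasswd
Require valid-user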
-
The robots.txt file just tells the bots you would "prefer" they not index, but there is nothing to prevent them from indexing. The only sure way to do this is to restrict access to the subdomain for everyone and require some sort of authentication. If they don't have access, they can't index.
-
In subdomain.example.com/robots.txt add the statements:
User-agent: *
Disallow: /
Warning: Be absolutely certain that the statements above are not included in your example.com/robots.txt file, or you'll kill your site.
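If the subdomain shares a document root with the main domain (so both hosts would otherwise be served the same physical robots.txt), one way to keep the two files separate is a rewrite rule. A sketch, assuming Apache with mod_rewrite, where robots-subdomain.txt is a hypothetical second file sitting alongside the real robots.txt:

# .htaccess in the shared document root:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-subdomain.txt [L]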
-
Each subdomain may have its own robots.txt file. So for that subdomain, you can put:
User-agent: *
Disallow: /
in that subdomain's robots.txt, and that should do it.
Please note that disallowing pages in robots.txt will not necessarily mean they won't appear on search result pages. If people link to pages that are disallowed on that subdomain, those pages can still appear in SERPs. I had this happen with a few pages, which led to funny listings in the SERPs, because Google had to guess what the title and description of the page should be, since it wasn't allowed to read the page. The meta noindex tag is the way to go if you want to be really sure a page doesn't appear in the SERPs. If you use that, don't disallow the page. Here's a recent SEOMoz post about it: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
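Since the original question rules out meta tags, note that the same noindex directive can also be sent as an HTTP response header, which works for non-HTML files too. A sketch, assuming Apache with mod_headers, placed only in the subdomain's .htaccess or vhost config (never the main domain's):

# Tell crawlers not to index anything served by this host:
Header set X-Robots-Tag "noindex"

As with the meta tag, crawlers have to be able to fetch the pages to see the header, so don't combine this with a robots.txt disallow.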
-
That was going to be my assumption, but I wasn't 100% sure how they worked with subdomains. Are you able to supply a little more information on implementation? It is extremely important that it only blocks sub.domain.com and not domain.com.
-
Related Questions
-
Sub domain? Micro site? What's the best solution?
My client currently has two websites to promote their art galleries in different parts of the country. They have bought a new domain (let's call it buyart.com) which they would eventually like to use as an e-commerce platform. They are wondering whether to keep their existing two gallery websites (non e-commerce) separate as they always have been, or somehow combine them into the new domain and have one overarching brand (buyart.com). I've read a bit on subdomains and microsites but am unsure at this stage what the best option would be, and what the pros and cons are. My feeling is to bring it all together under buyart.com so everything is in one place and creates a better user journey for anyone who would like to visit. Thoughts?
-
URL Indexing with Keyword
Hi, my webpage URL is indexed in Google but doesn't show when searching for the main keyword. How can I get it indexed for the keyword? It should show in the SERP when the keyword is searched. Any suggestions?
-
English and French under the same domain
A friend of mine runs a B&B and asked me to check his freshly built website to see if it was SEO compliant.
The B&B is based in France and he's targeting a UK and French audience. To do so, he built content in English and French under the same domain:
https://www.la-besace.fr/ When I run a crawl through Screaming Frog, only the French content-based URLs seem to come up, and I am not sure why. Can anyone enlighten me please? To maximise his business's local visibility, my recommendation would be to build two different websites (1 FR and 1 .co.uk), build content in the respective language on each site, and do all the link building work in the respective country sites. Do you think this is the best approach, or should he stick with his current solution? Many thanks
-
No Index PDFs
Our products have about 4 PDFs apiece, which really inflates our indexed pages. I was wondering if I could add REL=No Index to the PDFs' URLs? All of the files are on a file server, so they are embedded as links on our product pages. I know I could add a nofollow attribute, but I was wondering if anyone knew whether the noindex would work the same, or if that is even possible. Thanks!
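PDFs can't carry a meta robots tag, but the X-Robots-Tag response header does the same job that noindex would on an HTML page. A minimal sketch, assuming the file server runs Apache with mod_headers enabled:

# Send a noindex header with every PDF:
<FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>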
-
Domain vs Sub Domain and Rankings
Hi all, wanting some advice. I have a client which has a number of individual centres that are part of an umbrella organisation. Each individual centre has its own website, and some of these sites have similar (not duplicate) products and services. Currently the individual centres are subdomains of the umbrella organisation: the umbrella organisation is www.organisation.org.au, and the individual centres are subdomains, i.e. www.centre1.organisation.org.au, www.centre2.organisation.org.au, etc. I'm feeling that perhaps this setup might be affecting the rankings of the individual sites because they are subdomains. Would love to hear some thoughts or experience on this, and whether it's worth going through the process of migrating the individual centre domains. Thanks, Ian
-
Why is Google not indexing my site?
I'm a bit confused as to why my site just isn't indexing on Google. Even if I type in my brand name, my social channels rank and there's no evidence of my website. I've followed all of the advice I've read, gone into Webmaster Tools, and installed the WordPress Yoast plugin, but nothing seems to be making a difference! One thing I've noticed: in Google Webmaster Tools it says "Couldn't communicate with the DNS server." in site errors. I've called GoDaddy and they said that everything is fine. A bit frustrating. Trying to work out what my next steps should be but feeling a bit lost to be honest! Any help GREATLY appreciated!
-
Checkout on different domain
Is it a bad SEO move to have your checkout process on a separate domain instead of the main domain for an ecommerce site? There is no real content on the checkout pages, and they are completely new pages that are not indexed in the search engines. Due to the backend architecture, it is impossible for us to have them on the same domain. An example is this page: http://www.printingforless.com/2/Brochure-Printing.html One option we've discussed is to avoid passing PageRank on to the checkout domain by iframing all of the links to the checkout domain. We could also move the checkout process to a subdomain instead of a new domain. Please ignore the concerns with visitor security and conversion rate. Thanks!
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that these come from disabled subdomains like m.jump.co.za. In the past we used to redirect all traffic from subdomains to our primary www domain, but it seems that for some time Google had access to crawl some of our subdomains; in December 2010 we fixed this so that all subdomain traffic redirects (301) to our primary domain. For example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning up robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
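A subdomain-to-www 301 of the kind described can be done with a rewrite rule. A sketch, assuming Apache with mod_rewrite and that both hosts point at the same document root:

# .htaccess: send all m.jump.co.za requests to www with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^m\.jump\.co\.za$ [NC]
RewriteRule ^(.*)$ http://www.jump.co.za/$1 [R=301,L]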