Block a sub-domain from being indexed
-
This is a pretty quick and simple question (I'm hoping): what is the best way to completely block a subdomain from being indexed by all search engines?
One item I cannot use is the meta nofollow tag.
Thanks! - Kyle
-
Keep in mind that Google indexes everything it can crawl. Even if you put a block in robots.txt, the URLs can still end up in the index. You can require a password on that subdomain and keep big G out. This is easy to do if you have a site with cPanel access: just go to manage permissions and password-protect that directory with a .htaccess password.
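To illustrate, a minimal sketch of that .htaccess protection, assuming Apache; the password-file path is hypothetical and would be created with the htpasswd utility:

# .htaccess in the subdomain's document root
AuthType Basic
AuthName "Private area"
# Path below is an assumption -- point it at your own file,
# created with e.g.: htpasswd -c /home/user/.htpasswd someuser
AuthUserFile /home/user/.htpasswd
Require valid-user

Search engine crawlers can't authenticate, so nothing behind the password prompt gets crawled or indexed.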
-
The robots.txt file just tells the bots you would "prefer" they stay out; there is nothing in it that prevents a URL from being indexed. The only sure way to do this is to restrict access to the subdomain for everyone and require some sort of authentication. If they don't have access, they can't index it.
-
In subdomain.example.com/robots.txt add the statements:
User-agent: *
Disallow: /
Warning: Be absolutely certain that the above statements are not included in your example.com/robots.txt file, or you'll kill your site.
-
Each subdomain may have its own robots.txt file. So for that subdomain, you can put:
User-agent: *
Disallow: /
in the robots.txt, and that should do it.
Please note that disallowing pages in robots.txt will not necessarily keep them off search result pages: if people link to pages that are disallowed on that subdomain, those pages can still appear in the SERPs. I had this happen with a few pages, which leads to funny-looking listings, because Google has to guess what the page title and description should be, since it's not allowed to read the page. The meta noindex tag is the way to go if you want to be really sure a page doesn't appear in the SERPs. If you use it, don't also disallow the page in robots.txt, or the crawler will never get to see the noindex tag. Here's a recent SEOmoz post about it: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
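For completeness, a minimal sketch of that tag, placed in the <head> of each page on the subdomain:

<!-- tells compliant crawlers not to list this page in search results -->
<meta name="robots" content="noindex">

If editing every template isn't practical, the same directive can be sent as an HTTP response header (X-Robots-Tag: noindex) from the subdomain's server configuration. Either way, remember not to disallow those pages in robots.txt, or the crawler never sees the directive.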
-
That was going to be my assumption, but I wasn't 100% sure how they worked with subdomains. Are you able to supply a little more information on implementation? It is extremely important that it only blocks sub.domain.com and not domain.com.
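A sketch of how the separation works in practice: search engines fetch robots.txt per host, so http://sub.domain.com/robots.txt is requested independently of http://domain.com/robots.txt. If the subdomain has its own document root, just place the blocking robots.txt there and the main domain is unaffected. If both hosts share one document root, you can serve a different file per host with a rewrite (Apache assumed; robots-sub.txt is a hypothetical file name):

# .htaccess in the shared document root
RewriteEngine On
# Only requests whose Host header is sub.domain.com get the blocking file;
# domain.com/robots.txt is left untouched.
RewriteCond %{HTTP_HOST} ^sub\.domain\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-sub.txt [L]

Here robots-sub.txt would contain the User-agent: * / Disallow: / pair shown above.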
-
Related Questions
-
English and French under the same domain
A friend of mine runs a B&B and asked me to check his freshly built website to see if it was SEO compliant. The B&B is based in France and he's targeting a UK and French audience. To do so, he built content in English and French under the same domain: https://www.la-besace.fr/
When I run a crawl through Screaming Frog, only the French-content URLs seem to come up, and I am not sure why. Can anyone enlighten me, please?
To maximise his business's local visibility, my recommendation would be to build two different websites (1 FR and 1 .co.uk), build content in the respective language on each site, and do all the link-building work on the respective country's site. Do you think this is the best approach, or should he stick with his current solution? Many thanks
Technical SEO | coolhandluc
-
Old domain to new domain
Hi. A website on server A is no longer required. The owner has redirected some URLs of this website (via a plugin) to his new website on server B, but not all URLs. So when I use the site: command on website A, I see a mixture of redirected and non-redirected URLs; two websites are therefore still being indexed in some form, causing duplication. However, weirdly, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different from using the site: command?
Anyway, how do I move to the new website completely, without the old one being indexed anymore? I thought I knew this but have read so many blogs I've confused myself! Should I:
Redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed, so a lot of URLs. What if I miss some? Or:
Point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website's rankings? Thanks for any help
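For what it's worth, the first option is usually done with a single catch-all rule rather than one redirect per URL, so nothing can be missed. A sketch for the old site's .htaccess, with olddomain.com and newdomain.com standing in for the real names:

# .htaccess on server A: send every request to the same path on the new domain
RewriteEngine On
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]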
Technical SEO | AL123al
-
Indexed pages
Just started a site audit and trying to determine the number of pages on a client site, and whether there are more pages being indexed than actually exist. I've used four tools and got four very different answers:
Google Search Console: 237 indexed pages
Google search using the site: command: 468 results
Moz site crawl: 1,013 unique URLs
Screaming Frog: 183 page titles, 187 URIs (note: this is a free licence, but it should only cut off at 500)
Can anyone shed any light on why they differ so much? And where lies the truth?
Technical SEO | muzzmoz
-
Client's domain expired - rankings lost - repurchased domain - what next?
It's only been 10 days, and I have repurchased/renewed the domain name. The WHOIS info, website, and contact information are all still the same. However, we have lost all rankings, and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?
Technical SEO | waqid
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my front page (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should likewise link to "http://domain.com/products/" and not append the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?
Is A) B) the best practice? And C)? Thanks for all replies! 🙂 Holger
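For C), if the clean folder URL is the version being linked everywhere (per A and B), a matching canonical inside the folder's index page might look like this sketch, using the question's hypothetical products folder:

<!-- in the <head> of /products/index.php: points at the folder URL, not the file -->
<link rel="canonical" href="http://domain.com/products/" />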
Technical SEO | inlinear
-
Google is indexing my directories
I'm sure this has been asked before, but I was looking at all of Google's results for my site and found dozens of results for directory listings such as:
Index of /scouting/blog/wp-includes/js/swfupload/plugins
Obviously I don't want those indexed. How do I prevent Google from indexing them? Also, it only seems to be happening with WordPress, not with any of the directories on my main site. (We have a WordPress blog, which is only a portion of the site.)
Technical SEO | UnderRugSwept
-
.ca and .com domains
Hello. Currently the main site I'm working on is a .com, but we also have the .ca version, purchased from register.com. Should I have it set up to redirect to the .com site? Will Google see these as duplicate content? We have the .ca for our Canadian customers, but both sites are identical. Thank you
Technical SEO | TP_Marketing
-
Any way around buying hosting for an old domain to 301 redirect to a new domain?
Howdy. I have just read this Q&A thread, so I think I have my answer, but I'm going to ask anyway! Basically, DomainA.com is being retired and DomainB.com is going to be launched. We're going to have to redirect numerous URLs from DomainA.com to DomainB.com. I think the way to go about this is to continue paying for hosting for DomainA.com, serving a .htaccess from that hosting account, and hosting DomainB.com separately. Anybody know of a way to avoid paying for hosting just to serve a .htaccess file on DomainA.com? Thanks!
Technical SEO | SamTurri