Robots.txt for subdomain
-
Hi there Mozzers!
I have a subdomain with duplicate content and I'd like to remove these pages from the mighty Google index. The problem is: the website is built in Drupal and this subdomain does not have its own robots.txt.
So I want to ask you how to disallow and noindex this subdomain. Is it possible to add this to the root robots.txt:
User-agent: *
Disallow: /subdomain.root.nl/

User-agent: Googlebot
Noindex: /subdomain.root.nl/

Thank you in advance!
Partouter
-
A robots.txt file only applies to the subdomain it is served from.
You need to create a separate robots.txt for each subdomain; Drupal allows this.
It must be located in the root directory of your subdomain (e.g. /public_html/subdomain/) so it can be accessed at http://subdomain.root.nl/robots.txt.
Add the following lines to that robots.txt file:
User-agent: *
Disallow: /
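If you want to sanity-check what those two lines do before deploying them, here is a minimal sketch using Python's built-in robots.txt parser (the subdomain URL is the one from the question):

```python
from urllib import robotparser

# The rules suggested above, as they would be served at
# http://subdomain.root.nl/robots.txt
rules = """User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /", every path on this subdomain is blocked for all crawlers.
print(parser.can_fetch("*", "http://subdomain.root.nl/"))                  # False
print(parser.can_fetch("Googlebot", "http://subdomain.root.nl/any/page"))  # False
```

Because the file sits on the subdomain itself, these rules affect only subdomain.root.nl and leave the main root.nl site untouched.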
As an alternative, you can use the robots <META> tag on each page, or redirect the subdomain to a directory such as root.nl/subdomain and disallow that directory in the main robots.txt. Personally, I don't recommend those approaches.
-
Not sure how your server is configured but mine is set up so that subdomain.mydomain.com is a subdirectory like this:
http://www.mydomain.com/subdomain/
in robots.txt you would simply need to put
User-agent: *
Disallow: /subdomain/

Others may have a better way though.
HTH
Steve
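Steve's directory-style rule behaves differently from the blanket block above: it only excludes paths under /subdomain/ on the main domain. A quick sketch with Python's robots.txt parser (the example domain is the one Steve uses):

```python
from urllib import robotparser

# The rule Steve suggests adding to the main domain's robots.txt
rules = """User-agent: *
Disallow: /subdomain/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Only paths under /subdomain/ are blocked; the rest of the site stays crawlable.
print(parser.can_fetch("*", "http://www.mydomain.com/subdomain/page.html"))  # False
print(parser.can_fetch("*", "http://www.mydomain.com/other-page.html"))      # True
```

Note this only works if the server really maps subdomain.mydomain.com onto the /subdomain/ directory of the main host, as in Steve's setup.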