Subdomain Robots.txt
-
I have a subdomain (a blog) whose tag and category pages are being indexed when they shouldn't be, because they create duplicate content. Can I block them using a robots.txt file? Do I need a separate robots.txt file for my subdomain?
If so, how would I format it? Do I need to specify that it is a subdomain robots file, or will the search engines pick this up automatically?
Thanks!
-
Thanks Wissam. I was thinking this was the way to go, and I appreciate your input.
I do use the Yoast SEO plugin for WordPress on another site, but the blog in question runs on BlogEngine. I will do what you have suggested.
Cheers!
-
If the URL is http://blog.website.com, then the robots.txt should be accessible at http://blog.website.com/robots.txt.
I would suggest these steps:
- Verify your blog in Google Webmaster Tools.
- Generate a robots.txt file with Google Webmaster Tools.
- Upload it to the root of the subdomain.
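To illustrate the format, here is a minimal sketch of the rules, checked with Python's built-in robotparser. The /tag/ and /category/ paths and the blog.website.com host are assumptions for the example; substitute the URL patterns your blog platform actually uses for tag and category archives.

```python
from urllib import robotparser

# Hypothetical robots.txt for the blog subdomain. The /tag/ and
# /category/ paths are assumptions -- check your platform's real URLs.
rules = """\
User-agent: *
Disallow: /tag/
Disallow: /category/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Tag and category archives are blocked; ordinary posts stay crawlable.
print(rp.can_fetch("*", "http://blog.website.com/tag/seo"))        # False
print(rp.can_fetch("*", "http://blog.website.com/category/news"))  # False
print(rp.can_fetch("*", "http://blog.website.com/my-post"))        # True
```

Note there is nothing subdomain-specific in the file itself: crawlers simply request /robots.txt on whatever host they are crawling, so the file uploaded to the subdomain's root applies only to that subdomain.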
There is another way if you are using WordPress. Both the All in One SEO plugin and WordPress SEO by Yoast let you specify, through their settings, that NOINDEX should be applied to all category, tag, author, and other archive pages. It's faster and less error-prone.
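For reference, the mechanism those plugins use is a robots meta tag in the `<head>` of each archive page. This is a generic sketch of that tag, not the exact output of any particular plugin:

```html
<!-- Tells search engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow" />
```

Unlike a robots.txt Disallow, a noindex tag lets crawlers fetch the page but keeps it out of the index, which is usually the better fit for duplicate-content archives.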