Blocking Subdomain from Google Crawl and Index
-
Hey everybody, how is it going?
I have a simple question that I need answered.
I have a main domain; let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more.
What is the best way to block these subdomains from being indexed by Google, and to keep them from counting as part of domain.com? Robots.txt, noindex, nofollow, etc.?
Hope to hear from you,
Best Regards,
-
Hello George, thank you for the fast answer! I read that article, but there is an issue with it; if you could look at it, I'd really appreciate it. The problem is that if I do it directly from Tumblr, it will also hide the blog from Tumblr users. Here is the note right below the option "Allow this blog to appear in search results":
"This applies to searches on Tumblr as well as external search engines, like Google or Yahoo." Also, if I do it from GWT, I'm very hesitant to remove URLs with my subdomain because I'm afraid it will remove my whole domain. For example, my domain is abc.com and the Tumblr blog is set up on tumblr.abc.com. So I'm afraid that if I remove tumblr.abc.com from the index, it will also remove abc.com. Please let me know what you think.
Thank you!
-
Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google. In that case, these steps should help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
-
Hi guys, I read your conversation. I have a similar issue, but my situation is slightly different, and I'd really appreciate your help. I also have a subdomain that I don't want indexed by Google. However, that subdomain is not under my control: I created the subdomain on my hosting, but it points to my Tumblr blog, so I don't have access to its robots.txt. Can anybody advise what I can do in this situation to noindex that subdomain?
Thanks
-
Personally I wouldn't rely on robots.txt alone, as one accidental public link to any of the pages (easier than you may think!) will result in Google indexing that subdomain page (it just won't be crawled). This means the page can get "stuck" in Google's index, and to resolve it you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to lift the robots.txt restriction so Google can crawl them, and put noindex/nofollow tags on the pages so Google drops them from its index.
To cut a long story short, I would do both Steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
George
-
It would also be smart to add the subdomains in Webmaster Tools in case one does get indexed and you need to remove it.
-
Robots.txt is the easiest and quickest way. As a backup, you can use the noindex meta tag on the pages in the subdomain.
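For anyone scripting that backup check, a minimal standard-library sketch can confirm a page actually carries the robots meta tag before you count on it (the page source below is a hypothetical example, not a real fetch):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Records the content of any <meta name="robots"> tag it sees."""
    def __init__(self):
        super().__init__()
        self.directives = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives = a.get("content", "")

# Hypothetical source of a subdomain page you want dropped from the index:
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'

finder = RobotsMetaFinder()
finder.feed(page)
print(finder.directives)  # noindex, nofollow
```

In practice you would feed it the HTML fetched from each subdomain page and treat a `None` result as a page still missing the tag.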
-
2 ways to do it with different effects:
-
Robots.txt in each subdomain. This will entirely block search engines from even accessing those pages, so they won't know what's inside.
User-agent: *
Disallow: /
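As a quick sanity check, Python's standard-library robots.txt parser confirms that this blanket rule blocks crawling (the rule is inlined here, and try.domain.com/promo is a hypothetical URL from the question):

```python
from urllib.robotparser import RobotFileParser

# The blanket rule from the snippet above, inlined for a local check:
robots_txt = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# No crawler that obeys robots.txt may fetch anything on the subdomain:
print(rp.can_fetch("Googlebot", "http://try.domain.com/promo"))  # False
```

The same parser can be pointed at a live file with `set_url()` + `read()` if you want to verify the rule as actually deployed.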
-
noindex tags in those pages. This method allows crawlers to read the page and, if you set "follow", index the pages you link to; use "nofollow" if you don't want the linked pages indexed either.
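For reference, the tag described here goes in each page's `<head>`; these two variants are a minimal sketch matching the "follow"/"nofollow" choice above:

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Keep the page out of the index and don't follow its links either -->
<meta name="robots" content="noindex, nofollow">
```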
Hope that helps!