Blocking Subdomain from Google Crawl and Index
-
Hey everybody, how is it going?
I have a simple question that I need answered.
I have a main domain; let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more.
What is the best way to block such subdomains from being indexed by Google, and from counting as subdomains of domain.com? Robots.txt, a noindex/nofollow tag, something else?
Hope to hear from you,
Best Regards,
-
Hello George, thank you for the fast answer! I read that article, but there is an issue with it; if you can look at it, I'd really appreciate it. The problem is that if I do it directly from Tumblr, it will also hide the blog from Tumblr users. Here is the note right below the option "Allow this blog to appear in search results":
"This applies to searches on Tumblr as well as external search engines, like Google or Yahoo." Also, if I do it from GWT, I'm very hesitant to remove URLs on my subdomain, because I'm afraid it will remove my whole domain. For example, my domain is abc.com and the Tumblr blog is set up on tumblr.abc.com. I'm afraid that if I remove tumblr.abc.com from the index, it will also remove abc.com. Please let me know what you think.
Thank you!
-
Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google. In that case, these steps will help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
-
Hi guys, I read your conversation. I have a similar issue, but my situation is slightly different; I'd really appreciate your help with it. I also have a subdomain that I don't want indexed by Google. However, that subdomain is not under my control: I created the subdomain on my hosting, but it points to my Tumblr blog, so I don't have access to its robots.txt. Can anybody advise what I can do in this situation to noindex that subdomain?
Thanks
-
Personally I wouldn't rely on robots.txt alone, as one accidental public link to any of the pages (easier than you may think!) will result in Google indexing that subdomain page (it just won't be crawled). This means the page can get "stuck" in Google's index, and to resolve that you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to remove the robots.txt restriction so Google can crawl them, and put noindex/nofollow tags on the pages so Google drops them from its index.
To cut a long story short, I would do both Steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
George
-
It would also be smart to add the subdomains in Webmaster Tools in case one does get indexed and you need to remove it.
-
Robots.txt is the easiest and quickest way. As a backup, you can use the noindex meta tag on the pages in the subdomain.
-
There are two ways to do it, with different effects:
-
Robots.txt in each subdomain. This will entirely block search engines from even accessing those pages, so they won't know what's inside them.
User-agent: *
Disallow: /
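A quick way to sanity-check a rule like this before deploying it is Python's standard-library robots.txt parser (a sketch; try.domain.com stands in for whatever subdomain you use):

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt above, parsed directly (no network fetch needed).
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path on the subdomain is now off-limits to all well-behaved crawlers.
print(rp.can_fetch("Googlebot", "https://try.domain.com/"))       # False
print(rp.can_fetch("*", "https://try.domain.com/any/page.html"))  # False
```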
-
Noindex tags on those pages. This method allows crawlers to read each page and, if you set "follow", to index the pages you link to; use "nofollow" if you don't want the linked pages indexed either.
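For reference, the tag goes in the `<head>` of each subdomain page; which variant you pick depends on whether you want the links followed (a sketch, not tied to any particular CMS):

```html
<!-- Keep this page out of the index, but let its links be followed -->
<meta name="robots" content="noindex, follow">

<!-- Or: keep the page out AND don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```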
Hope that helps!