Blocking Subdomain from Google Crawl and Index
-
Hey everybody, how is it going?
I have a simple question that I need answered.
I have a main domain, let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial purpose, nothing more.
What is the best way to block such subdomains from being indexed by Google, and to keep them from counting as part of domain.com? Robots.txt, nofollow, etc.?
Hope to hear from you,
Best Regards,
-
Hello George, thank you for the fast answer! I read that article, but there is an issue with that approach. If you could look at it, I'd really appreciate it. The problem is that if I do it directly from Tumblr, it will also hide the blog from Tumblr users. Here is the note right below the "Allow this blog to appear in search results" option:
"This applies to searches on Tumblr as well as external search engines, like Google or Yahoo."
Also, if I do it from GWT, I'm very hesitant to remove URLs on my subdomain because I'm afraid it will remove my whole domain. For example, my domain is abc.com and the Tumblr blog is set up on tumblr.abc.com. I'm afraid that if I remove tumblr.abc.com from the index, it will also remove abc.com. Please let me know what you think.
Thank you!
-
Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google. In that case, these steps will help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
-
Hi guys, I read your conversation. I have a similar issue, but my situation is slightly different. I'd really appreciate it if you could help with this. I also have a subdomain that I don't want indexed by Google. However, that subdomain is not under my control: I created the subdomain on my hosting, but it points to my Tumblr blog, so I don't have access to its robots.txt. Can anybody advise what I can do in this situation to noindex that subdomain?
Thanks
-
Personally, I wouldn't rely on robots.txt alone: one accidental public link to any of the pages (easier than you may think!) can result in Google indexing that subdomain URL, even though the page itself won't be crawled. The page can then get "stuck" in Google's index, and to resolve that you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to lift the robots.txt restriction so Google can crawl them, and put noindex/nofollow tags on the pages so Google drops them from its index.
To cut a long story short, I would do both steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
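If you do control the web server behind the subdomain, another option (not mentioned in this thread, so treat it as a general sketch rather than advice from the posters) is to send the noindex directive as an X-Robots-Tag HTTP header, which covers every URL on the subdomain, including PDFs and images, without editing individual pages. A minimal sketch, assuming the subdomain runs on Apache with mod_headers enabled:
# .htaccess (or vhost config) at the subdomain's document root
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
Other servers have equivalents, e.g. an add_header directive on nginx. As with the meta tag, Google has to be able to crawl the URL to see the header, so don't combine this with a robots.txt Disallow if the goal is de-indexing.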
George
-
It would also be smart to add the subdomains to Webmaster Tools in case one does get indexed and you need to remove it.
-
Robots.txt is the easiest and quickest way. As a backup, you can use the noindex meta tag on the pages in the subdomain.
-
There are two ways to do it, with different effects:
-
A robots.txt in each subdomain. This will entirely block search engines from even accessing those pages, so they won't know what the pages contain.
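# This file must be served from the root of each subdomain itself,
# e.g. http://try.domain.com/robots.txt (the robots.txt on domain.com does not cover its subdomains)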
User-agent: *
Disallow: /
-
Noindex tags on those pages. This method lets crawlers read the page and, if you set "follow", follow your links so the linked pages can still be indexed, or "nofollow" if you don't want the linked pages indexed either.
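For reference, a minimal sketch of that tag as it would sit in the <head> of every page on the subdomain (choose the content values to match the behaviour you want):
<!-- keep this page out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
Use "noindex, nofollow" instead if you don't want the links on the page followed at all.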
Hope that helps!
-
Related Questions
-
Are Wildcard Subdomains Hurting My SEO?
I have some sites with a lot of categories (category, sub-category, sub-subcategory) and locations (country, state/territory, city). To avoid listing pages really deep in my hierarchy I used wildcard subdomains for the locations, but lately I have been told that might be hurting my overall SEO efforts. I have a lot of URLs like https://city-state-country.example.com on one side of the domain and example.com/category/subcategory/subsubcategory on the other. In the middle you see stuff like city-state-country.example.com/category/subcategory/subsubcategory and everything in between. Would I be better off moving the locations to the right side of the domain name? Then you might find stuff like example.com/country/state/city/category/subcategory/subsubcategory and everything in between. I think I could do the new rewrite rules fairly easily since every country slug is just two characters long.
On-Page Optimization | PostAlmostAnything
-
Does Google avoid indexing pages that include registered trademark signs?
I am suspecting that Google often hesitates to index pages that have registered trademarks on them that are marked with a ®. For example EGOL® used in the title tag or in the tag at the top of the page. Registered trademarks are everywhere and most retail product pages contain at least one of them. However, most people use the registered trademark names as text in their writing without adding the registered trademark sign of ®. Have you experienced a problem getting such pages indexed or have you read any articles about how Google treats registered trademarks?
On-Page Optimization | EGOL
-
How does Google handle read more tags in Wordpress
Hi everyone, I am wondering how Google handles the read more tag in WordPress. I pasted the link to a blog post into Google and found nothing (domain.com/post#readmore). Then I pasted the version without #readmore (domain.com/post) and found that Google indexed the page, but with the option to click "read more" to read it. The full blog post is not in their index, just the version asking you to read more. Is this because Google hasn't gotten to it, or is Google ignoring it? I am not sure, but ideally I would rather have the full blog post indexed, not the read-more version. I am also curious whether this will cause duplicate content issues. What is your experience with this, and is it advisable to use an alternate method for read more, maybe with a WordPress plugin? Thanks in advance.
On-Page Optimization | gaben
-
NOINDEX, FOLLOW on product page - how about images indexing?
Hi, Since we have a lot of similar products with duplicate descriptions, I decided to NOINDEX, FOLLOW most of these different variants which have duplicate content. However, I guess it would be useful in marketing terms if Google image search still listed the images of the products in image search. How does the image search of Google actually work - does it read the NOINDEX on the product page and therefore skip the image also or is the image search completely dependent on the ALT tag of any image found on our site? Thanks!
On-Page Optimization | speedbird1229
-
Why does Google pick a low priority page on my site?
Hi guys. One of my pages ranks quite well for "mid year diaries 14-15" on Google. The problem is that it's a really specific product page (A4, hardback, day-to-a-page diary, I think). It would be much better for the user to land on our mid-year diaries category rather than really deep into the site. Why is Google prioritizing this product page over our general 'mid year diaries' category, especially when the category would relate to the search more accurately? I work for TOAD Diaries and I think our page ranks 10th for this search. Eagerly awaiting some insight 🙂 Thanks in advance everyone! Isaac.
On-Page Optimization | isaac663
-
How do i block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd type it into the robots.txt file, or where to place the file. Any help would be appreciated, thanks.
On-Page Optimization | bricerhodes
-
Does Google index dynamically generated content/headers, etc.?
To avoid dupe content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>, often with dupe content because the product overlaps from city to city, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state. We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move to the page itself (headers, etc.) to cut down on dupe content on those 300 unique pages. My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content? The question behind this one is, how do we continue to rank for searches for that product in the city/state being searched without having that info in the URL? Any best practices we should know about?
On-Page Optimization | editabletext
-
Is giving away something for a Google Review bad?
I have a friend whose client is giving away something for free if you leave a Google review for his site. I recall that not being well liked by Google and potentially ending in a penalty. The site is ranking really poorly in Google but well in Yahoo/Bing, so I am wondering if that is what happened. What are your opinions?
On-Page Optimization | netviper