Blocking Subdomain from Google Crawl and Index
-
Hey everybody, how is it going?
I have a simple question that I need answered.
I have a main domain; let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more.
What is the best way to block such subdomains from being indexed by Google, and from counting as part of domain.com? Robots.txt, noindex, etc.?
Hope to hear from you,
Best Regards,
-
Hello George, thank you for the fast answer! I read that article, but there is an issue with it; if you can look at it, I'd really appreciate it. The problem is that if I do it directly from Tumblr, it will also block the blog for Tumblr users. Here is the note right below the "Allow this blog to appear in search results" option:
"This applies to searches on Tumblr as well as external search engines, like Google or Yahoo."
Also, if I do it from GWT, I'm very wary of removing URLs on my subdomain because I'm afraid it will remove my whole domain. For example, my domain is abc.com and the Tumblr blog is set up on tumblr.abc.com. I'm afraid that if I remove tumblr.abc.com from the index, it will also remove abc.com. Please let me know what you think.
Thank you!
-
Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog indexed by Google. In that case, these steps should help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
-
Hi guys, I read your conversation. I have a similar issue, but my situation is slightly different; I'd really appreciate your help with this. I also have a subdomain that I don't want indexed by Google. However, that subdomain is not under my control: I created the subdomain on my hosting, but it points to my Tumblr blog, so I don't have access to its robots.txt. Can anybody advise what I can do in this situation to noindex that subdomain?
Thanks
-
Personally I wouldn't rely on robots.txt alone, as one accidental public link to any of the pages (easier than you may think!) will result in Google indexing that subdomain page; it just won't be crawled, so Google can never see a noindex tag on it. This means the page can get "stuck" in Google's index, and to resolve it you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to lift the robots.txt restriction so Google can crawl them, and put noindex/nofollow tags on the pages so Google drops them from its index.
To cut a long story short, I would do both Steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
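To illustrate why a blanket robots.txt block can leave a page "stuck" in the index, here is a quick sketch using Python's standard urllib.robotparser (the subdomain URL is made up for the example):

```python
from urllib.robotparser import RobotFileParser

# A blanket-disallow robots.txt like the one discussed in this thread.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may not fetch any page on the subdomain, so it can never see a
# noindex tag there; a URL it learned about from an external link can
# therefore remain in the index even though it is never crawled.
blocked = not parser.can_fetch("Googlebot", "https://try.domain.com/promo")
print(blocked)  # prints True
```

That is exactly the trap: the block prevents crawling, not indexing, which is why a noindex tag (which requires the page to be crawled) is the more reliable removal signal.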
George
-
It would also be smart to add the subdomains in Webmaster Tools in case one does get indexed and you need to remove it.
-
Robots.txt is the easiest and quickest way. As a backup, you can use the noindex meta tag on the pages in the subdomain.
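For reference, the meta tag mentioned above goes in the `<head>` of each page on the subdomain and looks like this:

```html
<!-- In the <head> of every page you want kept out of the index -->
<meta name="robots" content="noindex, nofollow">
```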
-
Two ways to do it, with different effects:
1. Robots.txt in each subdomain. This entirely blocks any search engine from even accessing those pages, so they won't know what they have inside:
User-agent: *
Disallow: /
2. Noindex meta tags on those pages. This method lets crawlers read the page and, if you also set "follow", index the pages you link to; use "nofollow" if you don't want the links on the page followed either.
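If editing every page template is impractical, the same noindex/nofollow signal can also be sent as an HTTP response header instead of a meta tag. A sketch for an Apache host (assuming mod_headers is available and you control the subdomain's server config; the setup is hypothetical):

```apache
# Hypothetical .htaccess in the subdomain's document root
# (assumes an Apache host with mod_headers enabled)
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note this won't help if the subdomain is hosted by a third party like Tumblr, since you can't change the response headers there.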
Hope that helps!
-