Blocking Subdomain from Google Crawl and Index
-
Hey everybody, how is it going?
I have a simple question that I need answered.
I have a main domain, let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more.
What is the best way to block these subdomains from being indexed by Google, and to keep them from counting as part of domain.com? Robots.txt, nofollow, etc.?
Hope to hear from you,
Best Regards,
-
Hello George, thank you for the fast answer! I read that article, but there is an issue with that approach; if you could take a look at it, I'd really appreciate it. The problem is that if I do it directly from Tumblr, it will also hide the blog from Tumblr users. Here is the note right below the "Allow this blog to appear in search results" option:
"This applies to searches on Tumblr as well as external search engines, like Google or Yahoo." Also, if I do it from GWT, I'm very hesitant to remove URLs on my subdomain because I'm afraid it will remove my whole domain. For example, my domain is abc.com and the Tumblr blog is set up on tumblr.abc.com. So I'm afraid that if I remove tumblr.abc.com from the index, it will also remove abc.com. Please let me know what you think.
Thank you!
-
Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google, in which case these steps will help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
-
Hi guys, I read your conversation. I have a similar issue, but my situation is slightly different, and I'd really appreciate your help with it. I also have a subdomain that I don't want indexed by Google; however, that subdomain is not in my control. I created the subdomain on my hosting, but it points to my Tumblr blog, so I don't have access to its robots.txt. Can anybody advise what I can do in this situation to noindex that subdomain?
Thanks
-
Personally I wouldn't rely on robots.txt alone, as one accidental public link to any of the pages (easier than you may think!) can result in Google indexing that subdomain page (it just won't be crawled). This means the page can get "stuck" in Google's index, and to resolve that you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to remove the robots.txt restriction so Google can crawl them, and put noindex/nofollow tags on the pages so Google drops them from its index.
To cut a long story short, I would do both of the steps outlined by Federico if you want to sleep easy at night :).
George
-
It would also be smart to add the subdomains to Webmaster Tools in case one does get indexed and you need to remove it.
-
Robots.txt is the easiest and quickest way. As a backup, you can use the noindex meta tag on the pages in the subdomain.
-
There are two ways to do it, with different effects:
-
A robots.txt in each subdomain (the file needs to sit at the root of each subdomain, e.g. try.domain.com/robots.txt). This will entirely block search engines from even accessing those pages, so they won't know what the pages contain.
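# Block all crawlers from every URL on this subdomain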
User-agent: *
Disallow: /
-
Noindex tags on those pages. This method lets crawlers read the page, and if you set "follow" they can still crawl and index the pages you link to; use "nofollow" if you don't want the linked pages indexed either.
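For example, something like this in the <head> of each page on the subdomain should keep the page itself out of the index while still letting its links be followed (swap "follow" for "nofollow" to block that as well):
<meta name="robots" content="noindex, follow">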
Hope that helps!