Should I exclude my knowledge center subdomain from indexing?
-
We have a very large Knowledge center that is indexed. Is there any reason I should not exclude this subdomain from indexing?
Thank you
-
Well, the advantage of having a subdomain is that you can target a specific audience with that specific subdomain, since Google will treat it as its own unique site.
The biggest disadvantage is that if you don't set it up correctly, you may not get the results you expect, and you could even draw traffic away from your main domain.
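If you do decide to pull the knowledge center out of Google's index, note that a robots.txt Disallow only stops crawling and can leave already-indexed URLs in the results; a noindex directive is what actually removes pages. A minimal sketch, assuming the knowledge center lives on a hypothetical subdomain such as kb.example.com, would be to add this to the head of each page on that subdomain (or send the equivalent X-Robots-Tag HTTP header):

    <meta name="robots" content="noindex,follow" />

Leave the subdomain crawlable until the pages have dropped out; otherwise Googlebot never sees the directive.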
Related Questions
-
Does a subdomain hurt SEO on the main site?
This client sells event management software and puts all their clients on different subdomains of their main domain. Looking in SEO tools like OSE, when I run a backlink analysis, it pulls up all the backlinks to the subdomains as well as those for the main domain. In Webmaster Tools, when I look at queries, impressions and clicks, they get at least 30 times more traffic and impressions on keywords found in their subdomains and very few on the main domain's own. In other words, all these tools are providing a collective analysis of the main domain and all subdomains. All the backlinks and keywords recorded for those subdomains are not at all relevant to the keywords they want to rank for. For example, their software supports Boy Scouts, so keywords they rank for according to Webmaster Tools include merit badge, scout camp, etc., but of course, that's on the subdomain. As a result, if you were to take a snapshot of their online presence as these tools do, you would think they were a Boy Scout website and not a software developer if you include the subdomain, along with its PR, backlinks, keywords, etc. So the question I have is: does Google connect all these subdomains with the main domain and then water down the main site with irrelevant keywords, content and backlinks? Or does Google see all those subdomains as completely separate, so we don't need to worry or move their clients off their subdomains? I'm worried about Google assigning a "boy scout" relevancy to them. Am I wrong? What would you do?
Intermediate & Advanced SEO | | katandmouse0 -
Block subdomain directory in robots.txt
Instead of blocking an entire subdomain (fr.sitegeek.com) with robots.txt, we would like to block just one directory (fr.sitegeek.com/blog).
'fr.sitegeek.com/blog' and 'www.sitegeek.com/blog' contain the same articles in one language; only the labels are changed for the 'fr' version, and we assume the duplicate content causes a problem for SEO. We would like 'www.sitegeek.com/blog' articles to be crawled and indexed, not 'fr.sitegeek.com/blog'. So, please suggest how to block a single subdomain directory (fr.sitegeek.com/blog) with robots.txt. This applies only to the blog directory of the 'fr' version; all other directories and pages of the 'fr' version should still be crawled and indexed. Thanks,
Rajiv
Intermediate & Advanced SEO | gamesecure
-
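For what it's worth, robots.txt is read per host, so the block can live only on the French subdomain. A sketch of the file that would be served at fr.sitegeek.com/robots.txt (the /blog/ path is taken from the question; adjust it if the blog URLs differ):

    # robots.txt on fr.sitegeek.com only - www.sitegeek.com keeps its own file
    User-agent: *
    Disallow: /blog/

Every other directory on the fr. subdomain stays crawlable. Keep in mind this stops crawling rather than indexing; hreflang annotations or a noindex on the fr. blog pages are alternatives if the duplicate URLs are already in the index.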
Recovering from index problem (Take two)
Hi all. This is my second pass at the problem. Thank you for your responses before; I think I'm narrowing it down! Below is my original message. Afterwards, I've added some update info. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13). This shows the issue pretty clearly: https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8 I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there.
UPDATE: OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks. Any ideas would be much appreciated!
Intermediate & Advanced SEO | Blink-SEO
-
Subdomains for US Regions
The company I work for is expanding their business into new territories. I've got a lot of stabilization to do in the region/state where we're one of the most well-known companies of our kind. Currently, we have 3 distinct product lines, which are distinguished by 3 separate URLs. This is affecting the user flow of our site, so we'd like to clean it up before launching our products into the various regions. The business has decided to grow into 5 new states (one state consisting of one county only) — none of which will feature all 3 products. Our home-base state is the only one that will have all 3 products this year. My initial thought was to use subdomains to separate out the regions, so that we could use a canonical tag to stabilize the root domain (which would feature home-state content, and support content for all regions) and remove us from potential duplicate content penalization. Our product content will be nearly identical across the regions for the first year. I second-guessed myself by thinking that it was perhaps better to use a "[product].root/region" URL instead. And I'm currently stuck wondering whether it would be better to build out subdomains for both products and regions, using one modifier or the other as a funnel/branding page into the other. For instance, a user lands on "region.root.com" and sees exactly what products we offer in that region. Basically, a tailored landing page. Meanwhile, the bulk of the product content would actually live under "product.root.com/region/page". My head is spinning. And while searching for similar questions I also bumped into a reference to another tag meant to be used in some cases similar to mine. I feel like there's a lot of risk involved in this subdomain strategy, but I also can't help but see the benefits in the user flow.
Intermediate & Advanced SEO | taylor.craig
-
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com) Now my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to: <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' /> The robots tag is set to: <meta name="robots" content="index,follow,noodp" /> I found out that there are scraper sites that are linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
Intermediate & Advanced SEO | sbrault74
-
Google is mixing subdomains. What can we do?
Hi! I'm experiencing something that's kind of strange for me. I have my main domain, let's say: www.domain.com. Then I have my mobile version on a subdomain: mobile.domain.com, and I also have a German version of the website, de.domain.com. When I Google my domain, the main result links to www.domain.com, but then Google mixes all the domains in the sitelinks. For example, a Sign in link may point to mobile.domain.com, a How it works link may point to de.domain.com, etc. What's the solution? I think this is hurting my rankings a lot, because Google sees them all as the same domain when clearly they are not. Thanks!!
Intermediate & Advanced SEO | fabrizzio
-
Is this link being indexed?
link text Deadline: Monday, Sep 30, 2013 link text I appreciate the help guys!
Intermediate & Advanced SEO | | jameswalkerson0 -
Pros and cons of separate sites vs. subdomains
First timer and new to SEO. We are designing a website for a customer in South America that has 3 distinct divisions. We want to develop the site in the most SEO-effective way possible. Each division will have its own keyword focus, its own associations and its own links. They will all link to each other from the main page, company.com. We were thinking of creating 4 separate domains such as...
www.company.com - basic high-level company information with links to the other external sites below.
www.company-constructionsoftware.com
www.company-itservices.com
www.company-graphicdesign.com
So my questions are:
1- Is it better in the long run to have domains that include the search terms in the URL, as specified above? We can optimize the main site as well as each individual site separately.
2- Would the result be the same using subdomains? For example, itservices.company.com
3- What about hosting the 3 different sites in different locations?
We want to make sure that we are building on the best possible architecture for future optimization and internet marketing. What are the pros and cons? Thanks!!!!
Intermediate & Advanced SEO | brantwadz