Subdomain cannibalization
-
Hi,
I am doing SEO for a webshop that has many linking and related websites on the same root domain. The structure looks like this, for example:
Root domain: example.com
Shop: shop.example.com
Linking websites to shop: courses.example.com, software.example.com, ...

Do I have to check which keywords these linking websites already rank for and choose different keywords for my category and product pages on the webshop? The problem is that the main keywords for the webshop's category pages are largely the same as those for the other subdomains.
The intention is for some people to land directly on the webshop instead of going first to the linking websites and then on to the webshop.
Thanks.
-
Hello Mat,
I don't think I'm seeing the same SERPs as you. Is there any way you could give me an example of one of these subdomains?
And yes, you're absolutely right that the same problem of keyword cannibalization would apply to subdirectories as well.
If it's the woltersk....lu domain I am getting non-secure warnings from Firefox when I try to access it.
How many different subdomains are there / will there be? Is it just shop.domain.lu and www.domain.lu or are there others? I didn't see any for "courses." or "software." in the SERP example you provided with the link.

If it's just one, I think that's manageable. For example, maybe www. could focus on informational queries (e.g. JavaScript course) and shop. could focus on transactional ones (e.g. Buy Acme JavaScript course). Maybe one could focus on reviews and comparisons, or long-tail queries, while the other focuses on short-tail queries. Without knowing more about the domains and your business, it is difficult for me to say.

If you have three or four subdomains all going after the same keywords, that's definitely a problem and I don't think you can avoid cannibalization. At that point, it would be best to choose the strongest domain/subdomain and focus your efforts on ranking one of them instead of watering down your efforts over several.
-
Thanks for your answer Everett.
The structure was indeed created some years ago, when ranking with different subdomains wasn't really a problem. There is naturally overlap between the webshop subdomain and the other subdomains: each subdomain dives deeper into a specific part of the business (tax, legal, formations, ...), while the webshop sells the products from all of these subdomains.
However, for some search terms, some of the subdomains all rank on the first page. For example: https://www.google.com/search?q=successierekenaar&oq=successierekenaar&aqs=chrome.0.69i59j0.3257j0j7&sourceid=chrome&ie=UTF-8
As you can see, the root domain, two subdomains, and a link to an app take the first four positions in the SERP.

The key question is: if there is a viable search term to rank for, but one of the subdomains already ranks for it, can it still be used? Otherwise, it won't be easy to find a unique search term with high enough search volume for each product, since this is a market with very specific products.
On the other hand, if subdirectories were used, it would come down to the same rule: never try to rank two pages for the same search term.
-
Also, don't forget to use Google Search Console's "Property Set" feature. However, I think they're about to start auto-creating property sets by aggregating subdomains soon anyway: https://www.seroundtable.com/google-search-console-domain-property-26645.html
-
The short answer to your question is: Yes, you should know what keywords each of your subdomains rank for and should adjust strategy accordingly.
The long answer is that I want to see this website, because it doesn't sound like something I'd recommend doing in the first place. It used to be that subdomains were treated completely differently from the parent domain, and you could, theoretically, take up the entire first page of results with your subdomains. Content mills like About.com took this to the extreme and Google responded, so you don't tend to see that happen much anymore. As I understand it, Google also attempts to determine whether this is one "site" or multiple, unrelated sites (such as site.blogspot.com subdomains) and treats them accordingly.
These days, the general consensus is that you should be using subdirectories/folders instead of subdomains for a variety of reasons, unless the subdomain is for a different site, or something you don't really need to have indexed, like a closed app.
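To make that audit concrete, here is a minimal sketch of how you could flag cannibalized keywords from a rank-tracker export. The data and domain names below are made up for illustration; the assumption is that you can get keyword/ranking-URL pairs out of whatever tool you use.

```python
import csv
from collections import defaultdict
from urllib.parse import urlparse

def find_cannibalized_keywords(rows, root_domain):
    """Group ranking hosts by keyword and flag keywords where two or
    more hosts on the same root domain appear in the results."""
    hosts_by_keyword = defaultdict(set)
    for keyword, url in rows:
        host = urlparse(url).netloc.lower()
        # Keep only the root domain itself and its subdomains
        if host == root_domain or host.endswith("." + root_domain):
            hosts_by_keyword[keyword].add(host)
    # A keyword is cannibalized if more than one host ranks for it
    return {kw: sorted(hosts) for kw, hosts in hosts_by_keyword.items()
            if len(hosts) > 1}

# Hypothetical data mirroring the situation described in the thread
rows = [
    ("successierekenaar", "https://www.example.com/tools/successierekenaar"),
    ("successierekenaar", "https://shop.example.com/successierekenaar"),
    ("javascript course", "https://courses.example.com/javascript"),
]
print(find_cannibalized_keywords(rows, "example.com"))
# {'successierekenaar': ['shop.example.com', 'www.example.com']}
```

Running this over a full export gives you the list of keywords where subdomains compete with each other, which is exactly the set you'd review when deciding which single property should target each term.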