Subdomain cannibalization
-
Hi,
I am doing SEO for a webshop that has a lot of linking and related websites on the same root domain. The structure is, for example:
Root domain: example.com
Shop: shop.example.com
Linking websites to shop: courses.example.com, software.example.com, ...
Do I have to check which keywords these linking websites are already ranking for and choose different keywords for the category and product pages on the webshop? The problem with that could be that the main keywords for the webshop's category pages are largely the same as those for the other subdomains.
The intention is that some people immediately come to the webshop instead of going first to the linking websites and then to the webshop.
Thanks.
-
Hello Mat,
I don't think I'm seeing the same SERPs as you. Is there any way you could give me an example of one of these subdomains?
And yes, you're absolutely right that the same problem of keyword cannibalization would apply to subdirectories as well.
If it's the woltersk....lu domain I am getting non-secure warnings from Firefox when I try to access it.
How many different subdomains are there / will there be? Is it just shop.domain.lu and www.domain.lu, or are there others? I didn't see any for "courses." or "software." in the SERP example you provided with the link.
If it's just one, I think that's manageable. For example, www. could focus on informational queries (e.g. "JavaScript course") and shop. could focus on transactional ones (e.g. "Buy Acme JavaScript course"). Or one could focus on reviews, comparisons, or long-tail queries while the other targets short-tail queries. Without knowing more about the domains and your business, it's difficult for me to say.
If you have three or four subdomains all going after the same keywords, that's definitely a problem, and I don't think you can avoid cannibalization. At that point, it would be best to choose the strongest domain/subdomain and focus your efforts on ranking that one instead of watering down your efforts across several.
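One way to act on that informational/transactional split is to bucket your keyword list by intent modifiers. A minimal sketch of the idea, where the modifier list and hostnames are illustrative assumptions, not an exhaustive taxonomy:

```python
# Rough sketch of the informational vs. transactional split suggested
# above. The modifier list and hostnames are illustrative assumptions.
TRANSACTIONAL_MODIFIERS = {"buy", "price", "order", "shop", "discount"}

def target_subdomain(query: str) -> str:
    """Route a query to the subdomain that should target it."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL_MODIFIERS:
        return "shop.example.com"   # transactional intent -> webshop
    return "www.example.com"        # informational intent -> main site

print(target_subdomain("javascript course"))           # www.example.com
print(target_subdomain("buy acme javascript course"))  # shop.example.com
```

In practice you'd feed this your full keyword research list and let each subdomain own its bucket, rather than both chasing the same head terms.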
-
Thanks for your answer Everett.
The structure was indeed created some years ago, when ranking with different subdomains wasn't really a problem. It is quite normal that there is an overlap between the webshop subdomain and other subdomains. The subdomains dive deeper into a specific part of the business (tax, legal, formations,...) but on the webshop all of these different products from the subdomains are sold.
However, for some search terms, some of the subdomains all rank on the first page. For example: https://www.google.com/search?q=successierekenaar&oq=successierekenaar&aqs=chrome.0.69i59j0.3257j0j7&sourceid=chrome&ie=UTF-8
As you can see, the root domain, two subdomains, and a link to an app take the first four positions in the SERP.
The key question is: if there is a possible search term to rank for, but one of the subdomains already ranks for it, can it still be used? Otherwise, it won't be easy to find a unique search term with high enough search volume for each product, since this is a market with very specific products.
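That kind of overlap can also be spotted at scale rather than query by query. A minimal sketch, assuming you have (query, ranking URL) pairs from a rank tracker or a Search Console export — the sample data and hostnames below are invented:

```python
# Flag queries where pages from more than one subdomain of the same
# root rank. The (query, url) pairs are made-up sample data; in
# practice they would come from a rank tracker or GSC export.
from collections import defaultdict
from urllib.parse import urlparse

rankings = [
    ("successierekenaar", "https://www.example.com/tools/successie"),
    ("successierekenaar", "https://shop.example.com/successie-software"),
    ("javascript course", "https://courses.example.com/javascript"),
]

hosts_per_query = defaultdict(set)
for query, url in rankings:
    hosts_per_query[query].add(urlparse(url).hostname)

cannibalized = {q: sorted(hosts)
                for q, hosts in hosts_per_query.items()
                if len(hosts) > 1}
print(cannibalized)
# {'successierekenaar': ['shop.example.com', 'www.example.com']}
```

Queries that come back with two or more hostnames are the ones where you'd pick a single subdomain to target and de-optimize the others.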
On the other hand, if subdirectories were used, it would basically come down to the same thing: never try to rank two pages for the same search term.
-
Also, don't forget to use Google Search Console's "Property Sets" feature. That said, it sounds like Google is about to start auto-creating property sets by aggregating subdomains soon anyway: https://www.seroundtable.com/google-search-console-domain-property-26645.html
-
The short answer to your question is: yes, you should know which keywords each of your subdomains ranks for and adjust your strategy accordingly.
The long answer is that I'd want to see this website, because it doesn't sound like something I'd recommend doing in the first place. It used to be that subdomains were treated completely differently from the parent domain, and you could, theoretically, take up the entire first page of results with your subdomains. Content mills like About.com took this to the extreme, Google responded, and you don't tend to see that happen much anymore. As I understand it, Google also attempts to determine whether this is one "site" or multiple, unrelated sites (such as site.blogspot.com subdomains) and treats them accordingly.
These days, the general consensus is that you should be using subdirectories/folders instead of subdomains for a variety of reasons, unless the subdomain is for a different site, or something you don't really need to have indexed, like a closed app.
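If consolidation into subfolders is ever chosen, each subdomain URL should be 301-redirected to its new subfolder equivalent so existing equity carries over. A minimal Apache sketch, with hypothetical hostnames and paths:

```apache
# Hypothetical .htaccess on shop.example.com: 301 every URL to the
# matching path under www.example.com/shop/.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^shop\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/shop/$1 [R=301,L]
```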