Ecommerce store on subdomain - danger of keyword cannibalization?
-
Hi all,
Scenario: Ecommerce website selling a food product has their store on a subdomain (store.website.com). A GOOD chunk of the URLs - primarily parameters - are blocked in Robots.txt. When I search for the products, the main domain ranks almost exclusively, while the store only ranks on deeper SERPs (several pages deep).
In the end, only one variation of the product is listed on the main domain (ex: Original Flavor 1oz 24 count), while the store itself obviously has all of them (most of which are blocked by Robots.txt).
Can anyone shed a little bit of insight into best practices here? The platform for the store is Shopify if that helps. My suggestion at this point is to recommend they allow crawling in the subdomain's robots.txt and canonicalize the parameter pages.
As for keywords, my main concern is cannibalization, or rather forcing visitors to take extra steps to get to the store on the subdomain because hardly any of the subdomain pages rank. In a perfect world, they'd have everything on their main domain and no silly subdomain.
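As a side note, you can verify from the outside exactly which store URLs that robots.txt blocks before recommending changes. A minimal sketch using only Python's standard library (the rules and URLs below are stand-ins, not the store's real file; note that `urllib.robotparser` matches plain path prefixes and won't honor wildcard rules):

```python
from urllib.robotparser import RobotFileParser

# Stand-in rules roughly mimicking a store's robots.txt. To check the
# live file instead, use rp.set_url("https://store.website.com/robots.txt")
# followed by rp.read().
# Caveat: urllib.robotparser matches plain path prefixes only, so
# wildcard rules (e.g. "Disallow: /collections/*?*") are not supported.
rules = """\
User-agent: *
Disallow: /search
Disallow: /cart
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

for url in [
    "https://store.website.com/products/original-flavor-1oz-24ct",
    "https://store.website.com/search?q=original+flavor",
    "https://store.website.com/cart",
]:
    verdict = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:9s} {url}")
```

Running the same check against the live robots.txt gives a quick report of which product and parameter URLs Googlebot is actually allowed to fetch.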
Thanks!
-
I posted a bit of a Reddit rant here under my personal SEO alias of "studiumcirclus":
Mainly these things vex me about the platform:
"In basic terms, Shopify is limited by its vision. They want to make sites easy to design for the average Joe, which means they have to spend most of their platform dev time on the back end of the system and not on the front end of the sites it produces.

- If they're always bogged down making extra tick-boxes to change things in the back end, how can they keep up with cutting-edge SEO? With WordPress you have a much larger dev community making add-ons, many of them completely free and still very effective. Because everyone is on WP, when new Google features, directives, or initiatives come out they are quickly embraced (putting all sites on WP one step ahead).
- With smaller dev communities, platforms like Shopify or Magento lag behind. Why do people always expect that 'average' will rank 'well'? Ahead of the curve ranks well; average ranks averagely.
- Also, Shopify has some nasty page-speed issues which they won't acknowledge; they just argue instead of fixing things. It's just not good for SEO."
Other "Shopify is bad" evidence:
https://moz.com/community/q/main-menu-duplication#reply_391855 - just contains some of my thoughts on why Shopify isn't that good
https://moz.com/community/q/site-crawl-status-code-430 - a relatively recent problem someone had with their Shopify site, scroll down to see my reply
https://moz.com/community/q/duplicate-content-in-shopify-subsequent-pages-in-collections - someone else having tech issues with their Shopify site. While my answer was probably right, they probably couldn't implement the fixes
-
This was incredibly helpful. Right now their funnel starts on the store (adding product to cart), but there's definitely a benefit to it starting on the main domain to better track how the channels perform and overall user behavior.
-
In summary: firstly, I echo effectdigital on Shopify. It is an interesting platform, sold very well by Shopify zealots, but we have had to bend too many times to the platform's limitations to believe it is the right answer for most. It is awesome if you're a bikini start-up with no CRM or ERP; however, the moment it comes to a decent integration, it often gets ugly quickly.
On to your query: the short answer is that no one knows. Why? Because the algorithm treats subdomains differently for different sites. There is a good discussion of subdomains vs. subfolders in this Whiteboard Friday: https://moz.com/blog/interview-searchlove
The click-through to the subdomain should be a normal step, i.e. on the subdomain you should be landing on the relevant contextual page within the funnel to transact. That is normal for some back ends. You are correct that, ideally, in my view, it would all be on the root domain.
Overall, if the subdomain pages are critical and you want them to rank, then you need to treat the subdomain as a separate site for SEO purposes. However, if the subdomain is just the end of the sales funnel, it may not need to rank at all.
Hope that is helpful.
Regards
-
One reason we got out of Shopify: it gets complicated quickly. There was a brilliant Whiteboard Friday on subdomains about two months ago by the British guy from Distilled who pops up from time to time. I'll try to find it if I get time, but I would check that out as a starting point.
-
Yeah, I'm trying to figure out the best way to present to them all the pertinent information regarding how terrible Shopify is. The way they use Collections and then block any sort of parameters in their unalterable robots.txt file is insane.
-
That sounds like a hell of a mess. Instead of tying your name to one proposed implementation and saying "yes, this IS the way", I'd get the complexity of the issue across to the client / boss.
I'd then present your idea and say "I want to test this, but if results suffer we will need to revert the changes". I think that with such a complex architectural nightmare (on a HORRIBLE platform like Shopify, which is just awful for SEO), it would be extremely foolish to charge off into the night without making the risks clear.
The best practice is really not to have built such a terrible site to begin with. In making things better, there may be growing pains. There may be NO option which results in 100% growth and 0% losses.
My recommendation would be to continue blocking Google's access to the original, default product variations (those are already happily ranking on the main site; don't fix what ain't broken). I might allow Google to crawl the sub-variations which are inaccessible from the main site, and I might alter the main site's UX to include links to those sub-variants on the 'store.' subdomain.
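To illustrate the canonicalization half of the original suggestion: duplicate parameter URLs can be collapsed to one canonical URL per product by stripping the parameters that only create variant or tracking noise. A rough Python sketch; the parameter names here are assumptions for illustration, not Shopify's actual set:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed to create duplicate variant/tracking URLs;
# adjust to whatever the store actually appends.
STRIP_PARAMS = {"variant", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Return the URL with duplicate-creating parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in STRIP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url(
    "https://store.website.com/products/original-flavor?variant=1234&size=1oz"
))
# -> https://store.website.com/products/original-flavor?size=1oz
```

The resulting URL would go in each variant page's rel=canonical tag, so the parameter variations consolidate their signals onto a single URL.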
In the end though, it's a very tangled web they have spun