Ecommerce store on subdomain - danger of keyword cannibalization?
-
Hi all,
Scenario: An ecommerce website selling a food product has its store on a subdomain (store.website.com). A GOOD chunk of the URLs - primarily parameterized URLs - are blocked in robots.txt. When I search for the products, the main domain ranks almost exclusively, while the store only ranks on deeper SERPs (several pages deep).
In the end, only one variation of the product is listed on the main domain (e.g. Original Flavor 1oz 24 count), while the store itself obviously has all of them (most of which are blocked by robots.txt).
Can anyone shed a little insight into best practices here? The platform for the store is Shopify, if that helps. My suggestion at this point is to recommend they allow crawling in the subdomain's robots.txt and canonicalize the parameter pages (rough sketch below).
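A minimal sketch of what I mean, assuming the platform even lets you edit the subdomain's robots.txt (the paths and product handle here are hypothetical):

    # store.website.com/robots.txt - open the store up to crawling
    User-agent: *
    Allow: /

    <!-- on each parameter/variant URL, e.g. /products/original-flavor?variant=123 -->
    <link rel="canonical" href="https://store.website.com/products/original-flavor">

The reasoning: Google never fetches a robots.txt-blocked page, so it never sees a canonical tag on it. Letting the parameter pages be crawled and pointing them at the clean product URL is the usual alternative to blocking them outright.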
As for keywords, my main concern is cannibalization - or rather, forcing visitors to take extra steps to get to the store on the subdomain, because hardly any of the subdomain pages rank. In a perfect world, they'd have everything on their main domain and no silly subdomain.
Thanks!
-
I posted a bit of a Reddit rant here under my personal SEO alias of "studiumcirclus":
(click "View Entire Discussion")
Mainly these things vex me about the platform:
"In basic terms, Shopify is limited by its vision. They want to make sites easy to design for the average-joe, which means they have to spend most of their platform dev time on the back-end of the system and not the front-end of the sites which it produces
_ If they're always bogged down making extra tick-boxes to change things in the back-end, how can they be keeping up with cutting edge SEO? With WordPress you have a much larger dev community making add-ons, many of them completely free and still very effective. Because everyone is on WP, when new Google features, directives or initiatives come out they are quickly embraced (putting all sites on WP one step ahead)_
_ With smaller dev communities, platforms like Shopify or Magento lag behind. Why do people always expect that 'average' will rank 'well'? Ahead of the curve ranks well, average ranks averagely_
_ Also Shopify has some nasty Page-Speed issues which they won't acknowledge and they just argue about instead of fixing things. It's just not good for SEO_"
Other "Shopify is bad" evidence:
https://moz.com/community/q/main-menu-duplication#reply_391855 - just contains some of my thoughts on why Shopify isn't that good
https://moz.com/community/q/site-crawl-status-code-430 - a relatively recent problem someone had with their Shopify site, scroll down to see my reply
https://moz.com/community/q/duplicate-content-in-shopify-subsequent-pages-in-collections - someone else having tech issues with their Shopify site. While my answer was probably right, they likely couldn't implement the fixes.
-
This was incredibly helpful. Right now their funnel starts on the store (adding a product to the cart), but there's definitely a benefit to it starting on the main domain, to better track channel performance and overall user behavior.
-
In summary - firstly, I echo effectdigital on Shopify. It is an interesting platform sold very well by Shopify zealots, but we have had to bend too many times to Shopify platform limitations to believe it is the right answer for most. It is awesome if you're a bikini start-up with no CRM or ERP; however, the moment it comes to a decent integration, it often gets ugly quickly.
On to your query - the short answer is that no one knows. Why? Because the algorithm treats subdomains differently for different sites. There is a good piece on subdomains vs. subfolders in this WBF: https://moz.com/blog/interview-searchlove - in short, a good discussion on subdomains.
The click-through to the subdomain should be a normal step, i.e. on the subdomain the visitor lands on the relevant contextual page within the funnel to transact. That is normal for some back ends. You are correct that, ideally (in my view), it would all be on the root domain.
Overall, if the subdomain pages are critical and you want them to rank, then you need to treat the subdomain as a separate site for SEO purposes. However, if the subdomain is just the end of the sales funnel... then it may not need to rank.
Hope that is helpful.
Regards
-
One reason we got out of Shopify - it gets complicated quickly. There was a brilliant WBF on subdomains about 2 months ago, by the British dude from Distilled who pops up from time to time. I'll try to find it if I get time, but I would check that out as a starting point.
-
Yeah, I'm trying to figure out the best way to present to them all the pertinent information regarding how terrible Shopify is. The way they use Collections and then block any sort of parameters in their unalterable robots.txt file is insane (see the illustrative rules below).
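For anyone who hasn't run into it, the pattern being described looks roughly like this - these directives are illustrative of Shopify's stock robots.txt rather than a verbatim copy, and since the file can't be edited per store, there's no way to override them:

    User-agent: *
    Disallow: /collections/*sort_by*
    Disallow: /collections/*+*

So any sorted or tag-filtered Collection URL gets walled off from crawling, whether you want it to be or not.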
-
That sounds like a hell of a mess. Instead of tying your name to one proposed implementation and saying "yes, this IS the way", I'd get the complexity of the issue across to the client / boss.
I'd then present your idea and say "I want to test this, but if results suffer we will need to revert the changes". I think that with such a complex architectural nightmare (on a HORRIBLE platform like Shopify, which is just awful for SEO), it would be extremely foolish to charge off into the night without making the risks clear.
The best practice is really not to have built such a terrible site to begin with. In making things better, there may be growing pains. There may be NO options which would result in 100% growth and 0% losses.
My recommendation would be to continue blocking Google's access to the original, default product variations (as those are already happily ranking on the main site - don't fix what ain't broken). I might allow Google to crawl the sub-variations which are inaccessible from the main site, and I might alter the main site's UX to include links to those sub-variants on the 'shop.' subdomain (a rough sketch of the canonical side of that is below).
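If the sub-variant URLs do get opened up to crawling, you'd still want them consolidating to the clean product URL. This is a sketch rather than a definitive implementation - most Shopify themes already output something like this in theme.liquid via the built-in canonical_url object, so it's worth checking what the theme actually renders:

    <!-- theme.liquid (sketch) -->
    <link rel="canonical" href="{{ canonical_url }}">
    <!-- so /products/original-flavor?variant=12345 should point back at /products/original-flavor -->

That way the variant URLs can be crawled and linked to without competing against the main product page.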
In the end though, it's a very tangled web they have spun.