E-Commerce Multilanguage - Better on Subdomains?
-
Hi,
We have an e-commerce store in English and Spanish - same products. URLs differ like this:
ENGLISH: www.mydomain.com/en/manufacturer-sku-productnameinenglish.html
SPANISH: www.mydomain.com/es/manufacturer-sku-productnameinspanish.html
All content on the pages is translated; e.g., the H1s, titles, keywords, descriptions, and the site copy itself are in the displayed language.
Is there a risk of similar or near dupe content here in the eyes of the big G?
Would it be worth implementing different languages on subdomains or completely different domains?
thank you
B
-
That's exactly right. In fact, a couple of years ago many SEOs believed that subfolders would carry more SEO value than subdomains, since they would ensure that all of the SEO "strength", as it were, passed to the root domain.
We've moved away from this view now that people have tested subdomains and found that the strength passes equally well to them. Whether you go with subdomains or subfolders, the SEO value will pass either way.
-
Hi Tom.
Thank you for your answer. Sounds good - so, as far as your answer goes, sub-folders are similar in SEO value to subdomains?
Thanks again
-
Hey B
That setup will be fine - using sub-folders or sub-domains is a perfectly good way of serving different language content.
If everything is translated as you say, there won't be any risk of duplicate content arising from the translations.
The only potential risk of duplicate content would be if the product pages are "thin" on content - that is, if there is not enough content on each page, Google might think all of the pages look the same.
Run the website through SEO-Browser and look at how each page appears to Googlebot. Once the bot has seen the site navigation and the footer, is there enough left on the product pages to differentiate them from one another? If not, add more content, such as expanded product descriptions or customer reviews.
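As a rough sketch of that "thin content" check (the page text and navigation string below are hypothetical), the snippet compares the page-specific text of two product pages once shared boilerplate is stripped; a ratio near 1.0 suggests the pages may look near-identical to a crawler:

```python
from difflib import SequenceMatcher

def page_similarity(text_a: str, text_b: str, boilerplate: str = "") -> float:
    """Return a 0..1 similarity ratio for the page-specific text of two pages.

    `boilerplate` is text shared by every page (navigation, footer); it is
    stripped first so only the unique content is compared.
    """
    unique_a = text_a.replace(boilerplate, "").strip()
    unique_b = text_b.replace(boilerplate, "").strip()
    return SequenceMatcher(None, unique_a, unique_b).ratio()

# Hypothetical product pages sharing the same navigation and footer.
nav = "Home | Products | Contact ... (c) MyStore"
page1 = nav + " Widget A: a durable steel widget, 5mm bore."
page2 = nav + " Widget B: a durable steel widget, 6mm bore."

score = page_similarity(page1, page2, boilerplate=nav)
print(f"unique-content similarity: {score:.2f}")
```

Anything close to 1.0 across many product pages is a hint that the descriptions need fleshing out.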
Hope this helps.
Related Questions
-
E-Commerce Site Collection Pages Not Being Indexed
Intermediate & Advanced SEO | Ben-R
Hello everyone. This is not really my strong suit, but I'm going to do my best to explain the full scope of the issue and really hope someone has some insight. We have an e-commerce client (can't really share the domain) that uses Shopify; they have a large number of products categorized by Collections. The issue is that when we do a site: search of our Collection pages (site:Domain.com/Collections/) they don't seem to be indexed. Also, not sure if it's relevant, but we recently did an overhaul of our design. Because we haven't been able to identify the issue, here's everything we know/have done so far:
- Ran a Moz Crawl Check, and the Collection pages came up.
- Checked organic landing page analytics (source/medium: Google), and the pages are getting traffic.
- Submitted the pages to Google Search Console.
- The URLs are listed in the sitemap.xml, but when we tried to submit the Collections sitemap.xml to Google Search Console, 99 URLs were submitted and none came back as indexed (unlike our other pages and products).
- We tested the URLs in GSC's robots.txt tester and they came up as "allowed", but just in case, below is the language used in our robots.txt:
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /9545580/checkouts
Disallow: /carts
Disallow: /account
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Disallow: /design_theme_id
Disallow: /preview_theme_id
Disallow: /preview_script_id
Disallow: /apple-app-site-association
Sitemap: https://domain.com/sitemap.xml
A Google cache: search currently shows a collections/all page we have up that lists all of our products. Please let us know if there are any other details we could provide that might help. Any insight or suggestions would be very much appreciated. Looking forward to hearing all of your thoughts! Thank you in advance. Best,
-
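The "allowed" result from GSC's robots.txt tester for the question above can be reproduced locally with Python's standard-library parser. The sketch below uses a subset of the rules quoted in that question (paths as given, host hypothetical); note that a plain /collections/... URL only matches the "/collections/+" rules if it literally has a "+" after the prefix:

```python
from urllib.robotparser import RobotFileParser

# A subset of the robots.txt rules quoted above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin
Disallow: /checkout
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Plain collection URLs do not start with "/collections/+",
# so they remain crawlable; /admin is blocked as expected.
for path in ("/collections/all", "/collections/+tagged", "/admin"):
    verdict = "allowed" if parser.can_fetch("*", path) else "blocked"
    print(path, "->", verdict)
```

If the pages come back allowed here and are present in the sitemap, robots.txt is unlikely to be the reason they are missing from the index.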
301 Redirecting from domain to subdomain
We're taking on a redesign of our corporate site on our main domain. We also have a number of well-established, product-based subdomains. A number of content pages that currently live on the corporate site rank well and bring in a great deal of traffic, and we are considering putting 301 redirects in place to point that traffic to the appropriate pages on the subdomains. If redirected correctly, can we expect the SEO value of the content pages currently living on the corporate site to transfer to the subdomains, or will we negatively impact our SEO by moving this content from one domain to multiple subdomains?
Intermediate & Advanced SEO | Chris81980
-
.com version of my site is ranking better than .co.uk for branded search on my UK website - 301 redirect mess
Dear Mozzers, I have an issue with my UK website (short URL: http://goo.gl/dJ7IgD). When I type my company name into google.co.uk, the .com version appears in search instead of the .co.uk, and from looking at Open Site Explorer the page rank of the .com is higher than the .co.uk's. In fact I can't even see the .co.uk homepage in the results, only other pages from my site. The .com version is also 301'd to the .co.uk. From looking at Open Site Explorer, I have noticed that we have more links pointing to the .com than to the .co.uk. A lot of these are from our own separate microsites, which we closed down last year; the IT company who closed them down for some reason 301'd them to the .com version of our site instead of the .co.uk. But if I check one of these microsites in http://httpstatus.io/ (an HTTP status checker tool), it shows 301 - 302 - 200 status codes, which to me looks wrong. I am wondering what it should read - e.g. should it just be a 301 to a 200 status code? My website's short URL is http://goo.gl/dJ7IgD, and examples of the 10 microsites we closed down last year which seem to redirect to the .com are http://goo.gl/BkcIjy and http://goo.gl/kogJ02. As these were redirected almost a year ago, is it OK if I now get them redirected to the .co.uk version of my site, or what should I do? They currently redirect to the home page, but given that each of the microsites is based on an individual category of my main site, would it be better to 301 them to the relevant category? My only concern is that this may cause too much internal linking, and therefore I won't have enough links on my homepage. How would you suggest I go about building up my .co.uk authority so it ranks better than the .com? I am guessing this is affecting my rankings and I am losing link juice with all this. Any advice greatly appreciated. Thanks, Pete
Intermediate & Advanced SEO | PeteC120
-
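One way to make sense of a 301 - 302 - 200 sequence like the one above is to trace each hop explicitly. This is a minimal sketch, not a real HTTP client: `fake_fetch` and its URLs are hypothetical stand-ins for actual HEAD requests, mimicking the microsite -> .com -> .co.uk chain described above:

```python
def trace_redirects(url, fetch, max_hops=10):
    """Follow Location headers, returning the list of (url, status) hops."""
    hops = []
    for _ in range(max_hops):
        status, location = fetch(url)
        hops.append((url, status))
        if status not in (301, 302) or location is None:
            return hops
        url = location
    raise RuntimeError("too many redirects")

# Hypothetical stand-in for a real HTTP request, mimicking the
# microsite -> .com -> .co.uk chain described above.
def fake_fetch(url):
    chain = {
        "http://microsite.example/": (301, "http://example.com/"),
        "http://example.com/":       (302, "http://example.co.uk/"),
        "http://example.co.uk/":     (200, None),
    }
    return chain[url]

for url, status in trace_redirects("http://microsite.example/", fake_fetch):
    print(status, url)
```

Ideally each retired microsite would answer with a single 301 pointing straight at the final .co.uk URL, so any trace longer than two hops (301, then 200) is a candidate for cleanup.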
Which is better: /section/ or /section/index.php?
I have noticed that Google has started to simply link to /section/ as opposed to /section/index.php, and I haven't changed any canonical tags etc. I have looked at the Moz authority of the two pages:
/section/ = 28/100
/section/index.php = 42/100
How would I go about transferring the authority from /section/index.php to /section/, to hopefully help my organic SERP positions etc.? Any insight would be great 🐵
Intermediate & Advanced SEO | TimHolmes
-
Duplicate content on subdomains.
Hi Mozzers, I have a site, www.xyz.com, and also geo-targeted subdomains: www.uk.xyz.com, www.india.xyz.com, and so on. All the subdomains have the same content as the main domain, www.xyz.com. So I want to know how I can avoid content duplication. Many thanks!
Intermediate & Advanced SEO | HiteshBharucha0
-
Reverse Proxy better than 301 redirect?
Are reverse proxies that much better than 301 redirects? Should I invest the time in doing this? I found out about reverse proxies here: http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo
Intermediate & Advanced SEO | brianmcc0
-
Block an entire subdomain with robots.txt?
Is it possible to block an entire subdomain with robots.txt? I write for a blog that has its root domain as well as a subdomain pointing to the exact same IP. Getting rid of the subdomain is not an option, so I'd like to explore other ways to avoid duplicate content. Any ideas?
Intermediate & Advanced SEO | kylesuss12
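It is possible, because robots.txt is fetched and evaluated per host name: the subdomain can serve a disallow-all robots.txt while the root domain serves a normal one. A minimal sketch of the idea (host names hypothetical; a real site would implement this in the web server or application framework):

```python
# Choose robots.txt content based on the Host header of the request.
ROBOTS_BY_HOST = {
    "www.example.com": "User-agent: *\nAllow: /\n",
    # Blocks every crawler path on the duplicate subdomain.
    "blog.example.com": "User-agent: *\nDisallow: /\n",
}

def robots_txt_for(host: str) -> str:
    # Unknown hosts fall back to blocking, the safer default here.
    return ROBOTS_BY_HOST.get(host, "User-agent: *\nDisallow: /\n")

print(robots_txt_for("blog.example.com"))
```

Both hosts resolve to the same IP, but crawlers request /robots.txt with the host they are crawling, so each one sees its own rules.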
Paging: is it better to use noindex, follow?
Is it better to use the robots meta noindex, follow tag for paging (page 2, page 3) of category pages which list the items within each category, or just to let Google index these pages? Before Panda, I was not using noindex because I figured that if page 2 is in Google's index, then the items on page 2 are more likely to be in Google's index too, and each item then has an internal link. After I got hit by Panda, I'm thinking: page 2 has no unique content, only a list of links with a short excerpt from each item, all of which can be found on each item's own page - maybe that contributed to the Panda penalty. So I placed the meta tag noindex, follow on every page 2 and 3 of each category page. Page 1 of each category page has a short introduction, so I hope that is enough to make it "thick" content (is that a word? :-)). My visitors don't want long introductions; they hurt bounce rate and time on site. Now I'm wondering whether this is common practice, and whether items on page 2 are less likely to be indexed since they have no internal links from an indexed page. Thanks!
Intermediate & Advanced SEO | donthe0
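The scheme described above (index page 1, noindex the deeper pages) can be written as a small template helper; a sketch with hypothetical markup:

```python
def robots_meta_for_page(page_number: int) -> str:
    """Return the robots meta tag for a paginated category page.

    Page 1 carries the category introduction and should be indexed;
    deeper pages are noindex, follow so link equity still flows to items.
    """
    if page_number <= 1:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

for page in (1, 2, 3):
    print(page, robots_meta_for_page(page))
```

The follow directive is what preserves the internal links to the items, which is the questioner's main worry about deep pages dropping out of the index.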