Best way to handle a Multiple Language Issue
-
Howdy everyone!
I am having trouble deciding how to solve a multiple-language issue. I went to several consultants here in Mexico, but I am not totally convinced by their solution, so I am hoping to hear excellent advice (as always) here on the SEOmoz forum.
I have a website (www.aceromart.com), an ecommerce site that sells materials for the construction industry. My main market is Mexico, so the website is in Spanish. However, export sales are increasing and some of the suppliers are from the USA.
I need to include an English version on my website, but I don't know the best way to do this.
- The option they propose is to create an entirely separate website on another domain, entirely in English, which links to my existing website.
I did not like this at all, mainly because I would not like to change the brand name or anything. I would like to include the English version within my existing website. Is it okay to treat it as a folder within my site, i.e. www.aceromart.com/english/content?
What is the best way to do this, in your experience?
Best regards, and thanks for your help.
-
Excellent,
Thanks for your advice, I love the idea of the IP tracking.
Regards,
-
What Moosa says is right: you shouldn't run more than one domain for the same ecommerce store; you should keep a single domain with a multi-language CMS.
You can see an example in our own work.
It is always the same domain, but you can switch the navigation language with one click.
If you need, I can send you a little documentation about this CMS in .NET.
Maurizio
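One way to make that single-domain, multi-language setup explicit to search engines is to have each page declare its language alternate with hreflang tags. Here is a minimal sketch in Python; the example paths and the es-mx/en split are assumptions for illustration, not something prescribed in this thread.

```python
# Minimal sketch: emit the hreflang <link> tags a Spanish/English page pair
# should share when both versions live on one domain. The /english/ path
# mirrors the folder idea from the question; the exact paths are illustrative.

BASE = "https://www.aceromart.com"

def hreflang_tags(spanish_path: str, english_path: str) -> str:
    """Build the <link rel="alternate"> tags for both language versions."""
    alternates = {
        "es-mx": f"{BASE}{spanish_path}",      # Spanish version, aimed at Mexico
        "en": f"{BASE}{english_path}",         # English version for other visitors
        "x-default": f"{BASE}{spanish_path}",  # fallback when no language matches
    }
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

if __name__ == "__main__":
    # The identical block of tags goes into the <head> of BOTH versions.
    print(hreflang_tags("/productos/varilla", "/english/products/rebar"))
```

Any multi-language CMS can emit the same tags; the point is simply that each URL declares its counterpart in the other language.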
-
I don't see any need for another website targeting another location... absolutely no need for it! You can go with the same website and handle the English version in either a subdirectory or a subdomain.
or
You can also set up IP-based detection, so that visitors coming to your website from parts of the world other than Mexico see the English version, and visitors coming from Mexico see the Spanish version...
Hope this helps!
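If you do go the IP-detection route, a common pattern is to use the visitor's country only to pick a default language while keeping both versions directly reachable, so users and crawlers can always load either one. A rough sketch, assuming a Flask app and the MaxMind geoip2 library with a local GeoLite2-Country database (both are illustrative choices, not something prescribed in this thread):

```python
# Rough sketch of IP-based default-language selection. The redirect only sets
# a default: /english/ and the Spanish root both stay directly reachable.

from flask import Flask, redirect, request
import geoip2.database
import geoip2.errors

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # database path is an assumption

def visitor_country(ip: str):
    """Return the visitor's ISO country code, or None if the IP can't be resolved."""
    try:
        return reader.country(ip).country.iso_code
    except (geoip2.errors.AddressNotFoundError, ValueError):
        return None

@app.route("/")
def spanish_home():
    country = visitor_country(request.remote_addr or "")
    if country and country != "MX":
        # Visitors outside Mexico get the English home page by default.
        return redirect("/english/", code=302)
    return "Página de inicio en español"  # placeholder for the Spanish home page

@app.route("/english/")
def english_home():
    return "English home page"  # placeholder for the English home page
```

Checking the Accept-Language header before falling back to GeoIP, and linking each version to the other in the navigation, are easy refinements on the same idea.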