Best SEO way to implement a multi-language store
-
Hi,
I have a Magento 1.7 multi-language store with the following structure:
- www.example.com/nl and www.example.com (Dutch)
- www.example.com/uk (English)
- www.example.com/de (German)
As you can see, the Dutch language basically has two URLs, and this causes problems according to Roger. Both URLs show the same page and are therefore duplicate content.
Should I 301 www.example.com to www.example.com/nl?
And would this not cause problems with the indexing, because www.example.com is what is shown when searching for my keywords? I need all three languages to be indexed well and used only for the correct countries.
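If the root does get consolidated onto /nl, the redirect itself is simple. A minimal sketch of what it could look like, assuming the store is served by Apache with mod_rewrite enabled (as in a standard Magento .htaccess) — matching only the bare root so /uk, /de and deeper URLs are untouched:

```apache
# Hypothetical sketch: 301 the bare root to the Dutch store view.
# Assumes mod_rewrite is enabled (it is in a standard Magento .htaccess).
RewriteEngine On

# Match only an empty request path, so /uk, /de and all deeper URLs pass through.
RewriteRule ^$ /nl/ [R=301,L]
```

With that in place, only www.example.com itself would 301 to www.example.com/nl/, which removes the duplicate of the Dutch homepage.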
-
I'm using a script (MaxMind GeoIP) to send users to the correct storefront based on their IP address.
However, I would prefer to stop using this because it can be buggy and slow. I think (correct me if I'm wrong) that if I set the right geotargeting in Google Webmaster Tools AND set up brand-related searches per country in a Google AdWords campaign pointing to the correct storefront, this would be enough and I could stop using the script.
-
Hi Michael - that's right - just create each as a new site. As well as the geotargeting, it'll help if you can get a few links from each country to the relevant version of the site, as well as replacing the Dutch address in the footer with a UK and German address, if you have them.
As for .com or .com/nl, I prefer .com because it looks cleaner. Which would be most user-friendly? Personally I think if someone typed in www.headoverheels-fashion.com and it forwarded to www.headoverheels-fashion.com/nl, and they weren't from the Netherlands, they might be a bit confused. I'm not basing that on any science, it's just an inkling. They may think they can get to an English version of the site by removing the /nl?
-
Hi Alex,
As for the Google Webmaster Tools part:
I don't think I can set up geotargeting for each subfolder, right? So should I just add the three different sites?
www.headoverheels-fashion.com/nl
www.headoverheels-fashion.com/uk
www.headoverheels-fashion.com/de
-
Hi Alex,
I would prefer to 301 the .com to .com/nl,
so I would have .com/nl, .com/uk and .com/de indexed. Would this also be a good solution?
-
I agree with Lesley that a different domain targeting each language is best, but if that option is not available, then use a 301 from domain.com/nl to domain.com in order to reduce the duplicate content.
Hope this helps!
-
I'd forward http://www.headoverheels-fashion.com/nl/ to http://www.headoverheels-fashion.com - your title and meta description are in Dutch so it's obvious it's a Dutch website in the search results.
I prefer using subfolders rather than separate sites, as it's easier to manage and the link equity all goes to the same domain instead of being spread around. It's still possible to geotarget the subfolders in Google Webmaster Tools. I think Google's advice is pretty good on this: https://support.google.com/webmasters/answer/182192?hl=en
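On top of the geotargeting, the Google guide linked above recommends rel="alternate" hreflang annotations so each language version points at the others, which also addresses the duplicate-content worry. A hedged sketch of what each page's head could contain (the URLs are the ones from this thread; the exact hreflang codes are an assumption — nl-NL, en-GB and de-DE would target those countries):

```html
<!-- Sketch only: every language version lists itself and its siblings. -->
<link rel="alternate" hreflang="nl-NL" href="http://www.headoverheels-fashion.com/nl/" />
<link rel="alternate" hreflang="en-GB" href="http://www.headoverheels-fashion.com/uk/" />
<link rel="alternate" hreflang="de-DE" href="http://www.headoverheels-fashion.com/de/" />
<!-- Optional fallback for users outside the targeted countries. -->
<link rel="alternate" hreflang="x-default" href="http://www.headoverheels-fashion.com/" />
```

The same set of tags goes on every version of the page, so Google knows the three storefronts are alternates of each other rather than duplicates.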
-
From experience, the absolute best is to use different domains that are targeted to the country, like site.uk, site.com.au, etc. I have found that, besides being expensive, it works really well; just have different content on each domain.