One site, one location, multiple languages - best approach?
-
Hey folks,
Has anyone created a multilingual site targeted at a single location? I need to create a site targeting users in Spain, and there will need to be English and Spanish versions of the text.
My thoughts would be to handle it this way:
1. Geotarget the entire site to Spain (1)
2. Have the English content in an /en/ folder
3. Have the Spanish content in an /es/ folder
As far as I am aware, the same content in another language is not considered duplicate content (2), and Google should handle people searching in Spanish or English by showing them the correct landing page.
Sounds easy enough in principle, but I also have these other options that seem to solidify the approach:
4. Add rel="alternate" hreflang="x" annotations (3)
5. Add language information to a sitemap (4)
Again, none of that seems terribly difficult (I've sketched what I mean below), but I would welcome any feedback, particularly from anyone with experience of multilingual sites targeting a single location.
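For anyone following along, my understanding of the markup in (3) is that each page carries link elements in its head pointing at both language versions. Something like this, where example.es and the paths are just placeholders for illustration:

    <!-- in the <head> of both http://www.example.es/en/ and http://www.example.es/es/ -->
    <link rel="alternate" hreflang="en" href="http://www.example.es/en/" />
    <link rel="alternate" hreflang="es" href="http://www.example.es/es/" />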
Thanks all
Marcus
References and info
1. Multi-regional sites: http://googlewebmastercentral.blogspot.co.uk/2010/03/working-with-multi-regional-websites.html
2. Multilingual sites: http://googlewebmastercentral.blogspot.co.uk/2008/08/how-to-start-multilingual-site.html
3. rel="alternate" hreflang="x": http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
4. Language information in sitemaps: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
-
You're welcome, Marcus.
-
Hey Gianluca
The site is going to be on an .es domain, and it is for a small local business that serves both Spanish and English speakers (expats).
This is really useful though:
"Hence, if you are only interested in Spain for whatever reason, than the hreflang should be set es "ES-es" and "ES-en" (Spain-Spanish and Spain-English)."
Thanks again for the input. I actually read one of your posts on here when I was gearing up to figure out the best way to do this. For anyone else interested, this is a great read:
http://www.seomoz.org/blog/international-seo-dropping-the-information-dust
Thanks for the input, all.
Marcus
-
Hey, thanks.
-
Hola Marcus,
Everything in your approach is correct.
The only thing you didn't mention is whether you are going to use an .es domain or a generic one.
I ask because a .es domain would automatically target Spain as the main country (so there would be no need to geotarget the site yourself in GWT).
Yes, when content is in two languages you don't fall (quite obviously) into the duplicate content issue. That said, it is better to implement hreflang so that users are presented with the correct es or en URLs depending on the language set in their browsers.
Note that by doing so you will target all searches done in Spanish and in English regardless of location; for instance, you would also be "optimized" for searches done in Spanish from Argentina.
Hence, if you are only interested in Spain for whatever reason, then the hreflang values should be set as "es-ES" and "en-ES" (Spanish for Spain and English for Spain).
Finally, you can implement hreflang either with the on-page markup solution or with the sitemaps solution; using both is redundant.
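To sketch the sitemaps solution with the Spain-specific values above (the example.es URLs are placeholders), each url entry in the XML sitemap lists both language versions via xhtml:link annotations, roughly like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>http://www.example.es/en/</loc>
        <!-- each URL lists every alternate, including itself -->
        <xhtml:link rel="alternate" hreflang="en-ES" href="http://www.example.es/en/" />
        <xhtml:link rel="alternate" hreflang="es-ES" href="http://www.example.es/es/" />
      </url>
      <url>
        <loc>http://www.example.es/es/</loc>
        <xhtml:link rel="alternate" hreflang="en-ES" href="http://www.example.es/en/" />
        <xhtml:link rel="alternate" hreflang="es-ES" href="http://www.example.es/es/" />
      </url>
    </urlset>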
-
Could you explain the "parameter" thing in more detail?
That said, I would prefer the subfolder option, which is the cleaner one.
-
Hi Marcus, your approach is very solid; the hreflang annotations and the language information in sitemaps reinforce the /en/ and /es/ folders. Google is smart enough to recognize two such different languages, but you'll do a better job if you add those tags too.
You could also consider adding a parameter instead of a folder and then specifying in GWT what that parameter does.
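If I'm reading the parameter idea right (the URLs below are only illustrative), it would mean serving both languages from the same path and distinguishing them with a query parameter, which you would then declare under URL Parameters in GWT so Google knows it selects content language:

    http://www.example.es/page?lang=en
    http://www.example.es/page?lang=es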