Multi-language websites!
-
Hello,
What is the best way for a website to support multiple languages?
For example:
www.site.com is in English, but the content will be translated into 4 other languages: .es, .it, .fr, .de. Should I buy 4 different domains (www.site1.es, www.site2.it, etc.), create folders like site.com/es/ and site.com/it/, or create subdomains like es.site.com?
A few months ago I was 100% sure I would buy different domains and create different content in those languages, but now I am not so sure.
Thank you very much for your help!
-
Thank you for your advice! What do you think: subdomains or subfolders? Todd Foster just answered too, and he said:
I mean... what's the best solution?
Thank you so much for your help!
-
If you offer car rental services only in Romania, the use of country-level domain names is misleading from a user's point of view.
In fact, if I click on a .fr domain name, I expect to see something related to France (a service, a product...).
If I were you, I would change the domain name of your site to a .com (or .net) and create language or country versions in subfolders, which I would geotarget to their corresponding countries via Google Webmaster Tools.
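A subfolder setup like this is usually paired with hreflang annotations so search engines can match each language version of a page to the right audience. Here is a minimal sketch of generating those tags; the domain, language list, and English-at-root layout are illustrative assumptions based on this thread, not a prescribed implementation:

```python
# Sketch: generate hreflang <link> tags for language subfolders on a
# single .com domain. Domain and language set are placeholders.
LANGUAGES = ["en", "es", "it", "fr", "de"]
BASE = "https://www.site.com"

def hreflang_tags(path: str) -> list[str]:
    """Build one alternate <link> tag per language version of a page."""
    tags = []
    for lang in LANGUAGES:
        # Assumption: English lives at the site root, other languages
        # in subfolders like /es/ and /it/.
        prefix = "" if lang == "en" else f"/{lang}"
        tags.append(
            f'<link rel="alternate" hreflang="{lang}" '
            f'href="{BASE}{prefix}{path}" />'
        )
    # x-default tells search engines which version to show users
    # whose language has no dedicated version.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{BASE}{path}" />'
    )
    return tags

for tag in hreflang_tags("/prices.html"):
    print(tag)
```

Each page would emit the full set of tags (including one pointing at itself), so every language version cross-references all the others.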
-
Hi Maxime, and thank you for your reply. The site is www.auto-rent.ro. Because our clients are not only from Romania, we have made this site in English, even though it's a .ro domain, and a Romanian version of the site (www.inchirieri-masini-bucuresti.ro).
Now we want to create another 4 websites using 4 different domains (.de, .fr, .es, .it), each one with unique content for our customers who don't speak English or Romanian. Our company offers car rental services ONLY in Romania, so we only want to provide good content to our customers in their language.
This is what we are trying to do...
Thank you, and sorry for my English...
-
Again, it depends on your resources and your strategy...
Can you tell us more about your company and your international strategy?
Would each site be managed by you, or independently by webmasters in each country?
Are you going to use the same brand name in each country?
What resources (time, staff, and budget) do you have to work on the SEO of each site?
And I forgot to mention this article, which is worth reading:
International and Multilingual sites: The Criteria to Establish an SEO Friendly Structure
-
I read everything you said, guys... Thank you for your advice, but it is still very hard to decide. I read this, and it looks very interesting:
" Having this top-level ccTLD, country code TLD, essentially gives you that extra boost in the search engines when you are targeting those international countries. That is why, generally speaking, if you are a big brand, big site, and you've got a big budget, I would recommend getting these specific ccTLDs, building up those country presences around those language groups. "
Subdomains and subfolders can be an option, but they are not as valuable as the different ccTLD domains.
-
One website is easier to maintain. All the locations you mention are different languages, so the content will be unique by default and there are no duplicate content issues. Make the English version the main site root and the other languages subfolders.
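If you do consolidate into one site with language subfolders, a common complement is to suggest (not force) a language version based on the visitor's browser `Accept-Language` header. Here is a rough sketch of that negotiation; the supported language set and the English default are assumptions taken from this thread:

```python
# Sketch: pick a language subfolder from the browser's Accept-Language
# header, e.g. "fr-FR,fr;q=0.9,en;q=0.8". Supported set is illustrative.
SUPPORTED = {"en", "es", "it", "fr", "de"}

def pick_language(accept_language: str, default: str = "en") -> str:
    """Return the highest-weighted supported language, or the default."""
    choices = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, q = piece.partition(";q=")
        try:
            weight = float(q) if q else 1.0  # no q-value means q=1.0
        except ValueError:
            weight = 0.0
        # Compare on the primary subtag only ("fr-FR" -> "fr").
        choices.append((weight, lang.split("-")[0].lower()))
    for _, lang in sorted(choices, reverse=True):
        if lang in SUPPORTED:
            return lang
    return default

print(pick_language("fr-FR,fr;q=0.9,en;q=0.8"))  # fr
```

A server would then redirect first-time visitors to `/fr/`, `/es/`, etc., while still letting them switch languages manually so the suggestion never traps anyone in the wrong version.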
-