Best practice for redirects based on visitors' detected language
-
One of our websites has two languages, English and Italian.
The English pages are available at the root level:
www.site.com/ (English homepage)
www.site.com/page1
www.site.com/page2

The Italian pages are available under the /it/ level:

www.site.com/it (Italian homepage)
www.site.com/it/pagina1
www.site.com/it/pagina2

When an Italian visitor first visits www.site.com we'd like to redirect them to www.site.com/it, but we don't know whether that would impact search engine spiders (e.g. Googlebot) in any way.

Would it be better to do a JavaScript redirect, or an HTTP 3xx redirect? If the latter, which of the 3xx redirects should we use?
Thank you
-
We've adopted the following solution:
We show the English homepage, but we determine the user's preferred language from the Accept-Language header sent by the browser. If our site supports that language, we show a temporary balloon highlighting the link to the localized homepage.
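Roughly, the detection looks something like this — a simplified sketch in classic ASP (like the snippets elsewhere in this thread); the variable names and the CSS class are illustrative, not our actual code:

<%
' Read the browser's preferred language and decide whether to show
' the "Italian version available" balloon. No redirect is issued.
Dim acceptLang, primaryLang, showItalianBalloon
acceptLang  = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")  ' e.g. "it-IT,it;q=0.9,en;q=0.8"
primaryLang = LCase(Left(acceptLang, 2))                       ' first two letters of the preferred language
showItalianBalloon = (primaryLang = "it")
%>
<% If showItalianBalloon Then %>
  <div class="lang-balloon">
    Questa pagina è disponibile anche in <a href="/it/">italiano</a>.
  </div>
<% End If %>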
Thank you all for your hints and notes.
-
I would stay away from JavaScript redirects, as they can be considered cloaking. The best thing to do is detect new visitors (those without your cookie) and send them to a page that lets them choose which language they want. You can then set a cookie so that when they return they are automatically directed to the right version.

By not doing any sneaky JavaScript or IP-based redirects, you let Google crawl all the pages of your site, which improves indexing, trust, etc. Also, I would go into Google Webmaster Tools and specify the country your /it pages are targeted at. This will help with international search and with trust from Google.
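If it helps, the cookie logic could look something like this — a rough sketch in classic ASP (to match the snippets elsewhere in this thread); the cookie name and the chooser page are placeholders, not anything you have to use:

<%
' Sketch of a cookie-based language choice, assumed to run on the English homepage.
Dim siteLang
siteLang = Request.Cookies("siteLang")

If siteLang = "" Then
    ' First visit, no cookie yet: let the visitor pick a language.
    ' The chooser page sets Response.Cookies("siteLang") (and an Expires date).
    Response.Redirect "/choose-language.asp"
ElseIf siteLang = "it" Then
    ' Returning visitor who chose Italian: send them to the Italian homepage.
    Response.Redirect "/it/"
End If
' Otherwise fall through and serve the English homepage as usual.
%>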
-
I've done a test with a simple ASP page containing a Response.Redirect:

<% Response.Redirect "test.htm" %>

This is what Fiddler captured:

HTTP/1.1 302 Object moved
Server: Microsoft-IIS/5.1
Date: Thu, 05 May 2011 06:44:10 GMT
X-Powered-By: ASP.NET
Location: test.htm
Content-Length: 121
Content-Type: text/html
Cache-control: private

<title>Object moved</title>
Object Moved
This object may be found <a href="">here</a>.

I don't think a 302 is the best solution. As specified in the HTTP spec (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html), wouldn't we prefer a 307 Temporary Redirect?
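If we went that route, I assume we'd have to set the status line ourselves, since Response.Redirect always sends a 302. Something like this untested sketch (the target URL is just an example):

<%
' Send an explicit 307 instead of the default 302 produced by Response.Redirect.
Response.Status = "307 Temporary Redirect"
Response.AddHeader "Location", "http://www.site.com/it/"
Response.End
%>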
Thank you
-
You also asked about which 30x redirect to use. I'm also looking for this answer. We currently use an ASP redirect based on the Accept-Language header. I don't think this is best, but I'm not sure a 301 redirect can be used. I'd like to hear from others too.
This is what we have now:
lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")
real_lang = Left(lang,2)
'Response.Write real_lang
Select Case real_lang
    Case "en"
        Response.Redirect "/en"
    Case "fr"
        Response.Redirect "/fr"
    Case "de"
        Response.Redirect "/ge"
    Case Else
        Response.Redirect "/en"
End Select
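For what it's worth, a slightly more defensive version of the same idea would lower-case the language code and fall back to /en when the header is missing (just a sketch, not what we actually run; the /ge folder name is kept from above). Note that Response.Redirect always sends a 302, which is probably the right status here, since the target varies by browser language and a 301 could be cached:

<%
' Sketch: case-insensitive language detection with an explicit fallback.
Dim lang, real_lang
lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")   ' e.g. "fr-FR,fr;q=0.9"

If lang = "" Then
    real_lang = "en"                  ' no header sent: default to English
Else
    real_lang = LCase(Left(lang, 2))  ' first two letters, lower-cased
End If

Select Case real_lang
    Case "fr"
        Response.Redirect "/fr"
    Case "de"
        Response.Redirect "/ge"
    Case Else
        Response.Redirect "/en"       ' "en" and anything unrecognised
End Select
%>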
-
They automatically redirect people in the UK who type in www.google.com to www.google.co.uk.

But that's different from changing the language on a visitor. I'm not sure what Google would do if I were in Italy and used my American laptop to visit google.com. I don't think they'd switch me to www.google.it, but maybe someone else knows.
Using the browser language settings has worked well for us.
-
You might want to look at what Google does itself.

They automatically redirect people in the UK who type in www.google.com to www.google.co.uk.

If it's good enough for Google, it's good enough for us. Just make sure you don't look like you're cloaking.

You do need to give users the ability to change language once they're on the website, though. As Vince mentioned, just because a user is visiting the website from Italy does not mean they are Italian.
-
Hi Damiano,

I do a redirect based on browser language. I'd stay away from IP/location-based redirects: you can have English visitors in Italian locations who would end up lost on your Italian pages.
hth,
Vince
-
Hi Damiano,
Matt explains this very well in this video, and he basically answers all your questions.

If you have any additional questions, please let me know.