Multilang site: Auto redirect 301 or 302?
-
We need to establish whether a 301 or 302 response code should be used for our auto redirects based on the Accept-Language header.
https://domain.com
30x > https://domain.com/en
30x > https://domain.com/ru
30x > https://domain.com/de

The site architecture is set up with proper inline hreflang.
We have read different opinions about this, Ahrefs says 302 is the correct one:
https://ahrefs.com/blog/301-vs-302-redirects/
302 redirect:
"You want to redirect users to the right version of the site for them (based on location/language)."

You could argue that the root redirect is never permanent, as it varies based on the user's language settings (302).
On the other hand, the language-specific redirects are permanent per language:

IF Accept-Language header = en
https://domain.com > 301 > https://domain.com/en
IF Accept-Language header = ru
https://domain.com > 301 > https://domain.com/ru

So each of these is 'permanent'.
So which is correct?
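For what it's worth, the root redirect logic described above can be sketched as a small handler. This is a hedged illustration only: the supported-locale set, the `en` fallback, and the function names are assumptions, not details from the original setup.

```python
# Sketch of the root redirect: pick a locale from the Accept-Language
# header and issue a 302 to /<locale>. The supported locales and the
# 'en' fallback are illustrative assumptions.

SUPPORTED = {"en", "ru", "de"}
DEFAULT = "en"

def pick_locale(accept_language: str) -> str:
    """Return the first supported language from an Accept-Language header."""
    for part in accept_language.split(","):
        # Each part looks like "en-GB;q=0.8"; keep only the primary subtag.
        lang = part.split(";")[0].strip().split("-")[0].lower()
        if lang in SUPPORTED:
            return lang
    return DEFAULT

def root_redirect(accept_language: str) -> tuple[int, dict]:
    """302 rather than 301: the target varies per user, so it is not 'permanent'."""
    locale = pick_locale(accept_language)
    return 302, {
        "Location": f"https://domain.com/{locale}",
        # Tell caches that the response depends on this request header.
        "Vary": "Accept-Language",
    }
```

Whichever status code you settle on, note the `Vary: Accept-Language` response header: a cached redirect served to a user with a different language preference would be worse than either the 301 or the 302.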
-
Hi guys, I'm new to SEO. My site has the 301/302 issue shown below. Can anyone help me fix it? It would be highly appreciated.
Internal URL: Yes
Full URL: https://myoffroadled.com/led-rocker-switch
HTTP status code: 301 Moved Permanently
First found at: Seed URL
Content type: text/html; charset=utf-8
Depth: 0
Redirect URL: https://www.myoffroadled.com/led-rocker-switch/
Not crawled
Is redirect loop: No
Page rating: 100
-
@fj66doneoiddpj In some cases users may have a reason for using the site in something other than their browser language.
For example, perhaps a shared device, or simply wishing to check a different localisation.
If your hreflang is working correctly, they should never see the "wrong" version anyhow.
But, there are pros and cons of course.
-
Thank you for the reply. 302 is also what we decided on after many hours of reading.
The main reason for the auto redirect is that we feel it's a better user experience, as the user will arrive at content targeted to their browser language.
-
I would use a 302 here, just as an added insurance in case you ever accidentally serve Google the redirect.
However, I'm not a huge fan of auto redirects in general - they tend to cause issues for crawlers and users alike. Can you explain a bit more about when/why the user gets redirected? Perhaps instead you could serve a country-selection interstitial if a user seems to be on the "wrong" variant?
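The interstitial approach can be as simple as comparing the locale in the URL path against the browser's preferred language, and only then offering a suggestion instead of redirecting. A minimal sketch, with hypothetical function names and an assumed locale set:

```python
# Sketch of the "suggest, don't redirect" approach: detect a mismatch
# between the locale segment in the URL path and the browser's preferred
# language, and let the page decide whether to show an interstitial.
# The names and locale set are illustrative assumptions.
from typing import Optional

SUPPORTED = {"en", "ru", "de"}

def preferred_language(accept_language: str) -> Optional[str]:
    """First supported primary subtag from Accept-Language, else None."""
    for part in accept_language.split(","):
        lang = part.split(";")[0].strip().split("-")[0].lower()
        if lang in SUPPORTED:
            return lang
    return None

def should_show_interstitial(path: str, accept_language: str) -> bool:
    """True when the visitor lands on a variant other than their preference."""
    url_locale = path.strip("/").split("/")[0]
    preferred = preferred_language(accept_language)
    return (
        preferred is not None
        and url_locale in SUPPORTED
        and url_locale != preferred
    )
```

The upside over an auto redirect is that crawlers (which send no meaningful Accept-Language preference, or one you don't support) always see the page they requested, while mismatched users still get a way to switch.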