Can hreflang tags still work when the alternate URL 301-redirects to a translated URL in Japanese characters?
-
My organization has several international sites, four of which have translated URLs in Japanese, Traditional Chinese, German, or Canadian French.
The hreflang tags we have set up on our United States site look something like this:
But when you actually go to http://www.domain.co.jp/it-security/ you are 301 redirected to the translated URL version:
www.domain.co.jp/it-セキュリティ/
My question is: will Google still understand that the translated URL is the alternate URL, or will this cause errors? The hreflang tags are automated for each of our pages, so it would be technically difficult to populate them with the translated URL versions. However, we could potentially customize the hreflang on a page-by-page basis.
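For illustration, the kind of annotation described above might look something like the following (the en-us URL is a placeholder, since only the .co.jp URLs are given in the question):
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/it-security/" />
<link rel="alternate" hreflang="ja-jp" href="http://www.domain.co.jp/it-security/" />
<!-- The ja-jp URL above is the one that currently 301-redirects to the
     translated version, http://www.domain.co.jp/it-セキュリティ/ -->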
-
Hi there,
You should use the final URL that shows the content in the hreflang annotation, not one that just 301-redirects to another URL. Are you seeing errors in the Google Search Console "International Targeting" report? If you see "no return tags" errors there for Japan, that means Google is not able to identify them.
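For instance, the corrected annotation might point straight at the final translated URL, something like this (the en-us URL is again just a placeholder):
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/it-security/" />
<link rel="alternate" hreflang="ja-jp" href="http://www.domain.co.jp/it-セキュリティ/" />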
Thanks!
-
Redirects should be avoided. If the Japanese translation of your page is located at www.domain.co.jp/it-セキュリティ/ then please don't use www.domain.co.jp/it-security/ in your hreflang tag.
If it's hard to specify the correct URL in your page HTML, try providing the hreflang info in sitemaps instead.
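For example, a sitemap-based annotation for the US page might look something like the snippet below (the domain.com URL is a placeholder for the US page, which wasn't given in the question). The Japanese page's own <url> entry should list the same alternates so the annotations reference each other in both directions:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.com/it-security/</loc>
    <!-- Self-referencing alternate plus the final (post-redirect) Japanese URL -->
    <xhtml:link rel="alternate" hreflang="en-us" href="http://www.domain.com/it-security/" />
    <xhtml:link rel="alternate" hreflang="ja-jp" href="http://www.domain.co.jp/it-セキュリティ/" />
  </url>
</urlset>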
Related Questions
-
Page content is not very similar but the topic is the same: Will Google consider the rel canonical tags?
Hi Moz community, We have multiple pages on different sub-domains of our own covering the same topics. These pages even rank in the SERPs for related keywords. Now we are planning to show only one of the pages in the SERPs. Unfortunately we cannot redirect, so we are planning to use rel canonical tags. But the page content is not the same: only about 20% is similar and 80% is different, although the context is the same. If we use rel canonicals, will Google accept this? If not, what should I do? Would making the header tags similar help? How does Google respond if the content doesn't match: does it just ignore the canonical, or is there any negative impact? Thanks
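For illustration, a cross-domain canonical of this kind is just a link element in the <head> of each duplicate sub-domain page, pointing at the one version you want shown (the URL below is a placeholder):
<link rel="canonical" href="https://www.example.com/topic-page/" />
Bear in mind that rel canonical is treated as a hint rather than a directive, so Google may ignore it if it judges the pages too different.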
Algorithm Updates | vtmoz
-
Homepage title tag: "Keywords for robots" vs "Phrases for users"
Hi all, We keep hearing and reading articles saying that "Google is all about the user", with people suggesting we think only about users and not about search engine bots. I have gone through the title tags of all our competitors' websites: almost all of them directly target their primary and secondary keywords, and sometimes even more. We have written a good, user-focused phrase as our title tag, beginning with a keyword, but we are not ranking well compared to less optimised or less linked-to websites. Two things to mention: our title tag is almost 2 years old, and it begins with the secondary keyword followed by the primary keyword (for example, "seo google" is the secondary keyword and "seo" is the primary keyword). Do I need to focus entirely on the primary keyword to rank for it? Thanks
Algorithm Updates | vtmoz
-
URLs contain a language other than English
I need your advice regarding the URLs of my new sites. I have one site for the Gulf region that is in both English and Arabic. The issue is that we are getting URLs in both languages: some are Arabic. Do you think this will affect the ranking results? An example URL is: www.mydomain.com/بيع-بي-سيارة
Algorithm Updates | Mustansar
-
Increased 404 and Blocked URL Notifications in Webmaster Tools
In the last 45 days, I have been receiving an increasing number of 404 alerts in Google Webmaster Tools. When I audit the notifications, they are not "new" broken links; these are all links that have been pointing to non-existent pages for years, which for some reason Google is only now notifying me about. This has also coincided with about a 30% drop in organic traffic from late April to early May. The site is www.petersons.com; it has been around for a while and attracts a fair amount of natural links, so in the 2 years I've managed the campaign I've done very little link building. I'm in the process of setting up redirects for these URLs, but why is Google now notifying me of years-old broken links, and could that be one of the reasons for my drop in traffic? My second issue is that I am being notified that I am blocking over 8,000 URLs in my robots file when I am not. Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
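For what it's worth, a sketch of the kind of redirect rules being set up for those old URLs, assuming an Apache server and .htaccess (the paths are hypothetical):
# Hypothetical examples: send long-dead URLs to the closest current page
Redirect 301 /old-section/removed-page.html http://www.petersons.com/
Redirect 301 /old-section/another-removed-page.html http://www.petersons.com/current-section/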
Algorithm Updates | CUnet
-
So, useless link exchange pages still work?!
After 3 years out of SEO I thought things might have moved on, but apparently not. A bit of backlink research shows that all the top sites in my niche have tons of reciprocal links to barely relevant sites. Do I really have to do this? I thought this approach was so out of date that it's not much better than keyword stuffing. So, should I just forget my lofty principle of asking myself "is this of any value to my users?" and take the medicine?
Algorithm Updates | Cornwall
-
Dropped off a cliff for a particular keyword & can't find out why
At the beginning of December we ranked consistently in the top 3 for the keyword "Suffolk" with the site www.suffolktouristguide.com (page rank 4, thousands of quality inbound links, site age 5+ years). Since then we've been falling off a cliff, and today we aren't even in the top 50 for this search term, although most of our other search terms are unaffected. Our SEOmoz grade remains A for "Suffolk", and we haven't changed anything in that time that could have had such a material effect (knowingly, at least). A similar issue happened to my other site, www.suffolkhotelsguide.com, back in April, and it hasn't recovered despite grade A's on the homepage and key pages. We've checked internal broken links, page download times, external links (used the disavow tool and a reconsideration request and got back "We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google"), etc. Any thoughts on what I can try next? All suggestions appreciated, as I am completely stuck (& have spent a fortune on "SEO experts" to no effect).
Algorithm Updates | SarahinSuffolk
-
Sudden drop after 301 redirection
Hi Experts, We did a 301 redirect from an old site to a new site to get rid of any bad link juice. We recently found a big drop in rankings and traffic after Google last indexed the new web pages. We implemented the 301s using ASP at the page level. The website had approximately 4,000 pages and we did the 301s section by section. This is how we did it, following one of the blog posts on SEOmoz:
1. Create a sitemap for your old domain.
2. Create content (contact information, description of your company, indication of future plans) and something link-worthy for the new domain. (You should start trying to build links early.)
3. Set up the new domain and make it live.
4. Register and verify your old domain and new domain with Google Webmaster Tools.
5. Create a custom 404 page for the old domain which suggests visiting the new domain.
6. Check and fix errors on the old domain.
7. In a development environment, test the redirects from the old domain to the new domain. Ideally, this will be a 1:1 redirect (www.example-old-site.com/category/sexy-mustaches.html to www.example-new-site.com/category/sexy-mustaches.html).
8. 301 redirect your old domain to your new domain (see the example rule below).
9. Submit your old sitemap to Google and Bing. The submission pages are within Google Webmaster Tools and Bing Webmaster Center. (This step will make the engines crawl your old URLs, see that they are 301 redirects, and change their index accordingly.)
10. Fill out the Change of Address form in Google Webmaster Tools.
11. Create a new sitemap and submit it to the engines. (This will tell them about any new URLs that were not present on the old domain.)
12. Wait until Google Webmaster Tools updates and fix any errors it indicates in the Diagnostics section.
13. Monitor search engine results to make sure the new domain is being properly indexed.
We also did a press release with PRWeb to announce the new launch, and we followed the steps recommended above. I am not sure what to do next. Can anyone suggest whether it's normal to see a drop and we should wait for some time, or whether we did something wrong? We are losing business every single day. Please help!
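A minimal sketch of the domain-level 1:1 redirect in step 8, assuming an Apache .htaccess file with mod_rewrite enabled and the example domains from the steps above (the actual site used ASP for its redirects, so this is only illustrative):
# Redirect every request on the old domain to the same path on the new domain (1:1)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example-old-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.example-new-site.com/$1 [R=301,L]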
Algorithm Updates | ITRIX
-
Long term plan for a large htaccess file with 301 redirects
We set up a pretty large htaccess file in February for a site, involving over 2,000 lines of 301 redirects from old product URLs to new ones. The old URLs still get a lot of traffic from product review sites and other pretty good sites, which we can't change. We are now trying to reduce page load times, and we're ticking all of the boxes apart from the size of the htaccess file, which seems to be causing a considerable hang on load times. The file is currently 410kb! My question is: what should I do in terms of a long-term strategy, and has anyone come across a similar problem? At the moment I am inclined to remove the 2,000 lines of individual redirects and put in a "catch all" whereby anything from the old site goes to the new site's homepage (see the sketch below). Example code:
RedirectMatch 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
RedirectMatch 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html
There is no consistency between the old URLs and the new ones, apart from the fact that they all sit in the subfolder /acatalog/.
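A sketch of the kind of catch-all described above, assuming Apache's mod_alias (the same module the existing RedirectMatch lines use):
# Send any remaining /acatalog/ URL to the new site's homepage
RedirectMatch 301 ^/acatalog/ /
Specific RedirectMatch rules for the handful of old URLs that still receive the most traffic can be kept above the catch-all so they are matched first.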
Algorithm Updates | gavinhoman