What are the SEO considerations for multiple languages on a single page?
-
I am working on a language teaching site for Chinese speakers learning English. I consider myself above average when it comes to basic SEO issues, but all I know here is that Google doesn't like multiple languages on a single page.
Without getting into too many details, both Chinese and English text will appear on the same page with links, tags, phonetic spellings, etc.
I'm hoping someone here knows the science behind using the lang="zh" and xml:lang="zh" attributes within text, and their effect on the ranking of the text inside those declarations. It would also be great to have clarification on the link juice passed via the hreflang attribute for both internal and external links. And, of course, any info on using both English and Chinese characters in the URL would be most helpful. A heads up on any other language-specific SEO issues would also be much appreciated.
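For reference, the kind of markup I'm asking about would look something like this (a rough sketch only; the URLs and text are placeholders, and I have no idea yet what ranking effect, if any, these attributes have):

```html
<!-- Hypothetical sketch: a page whose primary language is Chinese,
     with English passages scoped via the lang attribute.
     All content here is illustrative, not from a real page. -->
<html lang="zh" xml:lang="zh">
<head>
  <title>英语课程 — English Lessons</title>
</head>
<body>
  <p>下面的句子是英语:</p>
  <!-- An English phrase inside the otherwise-Chinese document -->
  <p lang="en" xml:lang="en">The quick brown fox jumps over the lazy dog.</p>
  <!-- A pinyin / phonetic gloss could likewise be scoped -->
  <p lang="zh-Latn">yīngyǔ kèchéng</p>
</body>
</html>
```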
My goal is to get the most out of both languages per page in terms of ranking.
-
I thought about a subdomain, but I think it would complicate things from an SEO standpoint. From what I've learned, a subdomain here would be treated as a separate domain and send link juice to an English page that wouldn't be accessible to visitors directly (only from within a Chinese page as a lightbox). I'm sure there's a cleaner way to do this.
What I'm looking for is the ranking effect of using the lang attribute (or an ATTLIST declaration with the lang attribute) to identify the various languages on the same page. Or, if there's any other way to let search engines know I'm using multiple languages on a single page, it'd be a super time saver.
Thanks for all your input so far!
-
Right. If both languages are to be that dominant, it could justify the use of a subdomain for the English-weighted portions, but it would take some clever coding to get it right.
-
Thanks for the input, Ryan. My challenge here is that both languages will be used on nearly all pages site-wide. I'll look deeper into the JS lightbox method, but my initial thought is that the same HTML would appear in multiple places on the site, and I'm not sure how search engines would treat an English document inside a Chinese document. Of course my priority is usability, but as I go through the design process, I was just hoping to find a way to get the search engines to count content in both languages toward SEO ranking.
-
Since it's native Chinese speakers, I'd weight everything toward that priority, i.e. title tags begin in Chinese and then have their English translation. Obviously you're going to run into length problems there, but in other on-page areas you should be fine. If it's only one page, I'd also lean toward choosing zh as your language setting. One strategy you could pursue, however, would be to code two separate but duplicate pages, one in English and one in Chinese, on separate subdomains; then, as someone goes through a page, they could study flash-card style, with translations pulled from the other subdomain via a lightbox or something similar. It would be more difficult and more work, but you'd also have more ability to really strengthen results, one set for English and one for Chinese. Bilingual pages aren't my specialty, though. I think a French-Canadian SEO could add some valuable input here, since they have English and French as dual official languages. Pinging someone from there could be useful, especially in a place like Montreal. Hopefully my suggestion above helps somewhat. Sorry I can't add more input about the science.
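If you did go the two-subdomain route, roughly speaking the head of each page could declare its twin on the other subdomain, something like this (domain names and paths are placeholders I made up, and I can't vouch for the exact ranking effect):

```html
<!-- Hypothetical head section for the Chinese page at zh.example.com,
     pointing at its English twin on en.example.com and vice versa.
     Each page lists both alternates, including itself. -->
<head>
  <link rel="alternate" hreflang="zh" href="https://zh.example.com/lesson-1" />
  <link rel="alternate" hreflang="en" href="https://en.example.com/lesson-1" />
</head>
```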
Related Questions
-
How do I prevent duplicate page title errors from being generated by my multiple shop pages?
Our e-commerce shop has numerous pages within the main shop page. Users navigate through the shop via typical pagination. So while there may be 6 pages of products it's all still under the main shop page. Moz keeps flagging my shop pages as having duplicate titles (ie shop page 2). But they're all the same page. Users aren't loading unique pages each time they go to the next page of products and they aren't pages I can edit. I'm not sure how to prevent this issue from popping up on my reports.
Technical SEO | NiteSkirm0
-
Meta tags in Single Page Apps
Since the deprecation of the AJAX crawling scheme back last October, I am curious as to when Googlebot actually reads meta tag information from a page. We have a website at whichledlight.com that is implemented using emberjs. Part of the site is our results pages (i.e. gu10-led-bulbs). These pages update the meta and link tags in the head of the document for things like canonicalisation and robots, but can only do so after the page finishes loading and the JavaScript has run. When the AJAX crawling scheme was still in place, we were able to prerender these pages (including the modified meta and link tags) and serve those to Googlebot. Now Googlebot no longer uses these prerendered snapshots and is instead sophisticated enough to load and run our site. So the question I have is: does Googlebot read the meta and link tags from the original response, or does it wait until the page finishes rendering before reading them (including any modifications that have been performed on them)?
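To make the situation concrete, the pattern in question looks roughly like this (a simplified sketch with placeholder URLs, not our actual code):

```html
<!-- As initially served: a generic canonical (hypothetical URL). -->
<head>
  <link id="canon" rel="canonical" href="https://example.com/" />
  <script>
    // After the client-side app has rendered, rewrite the canonical
    // to the real results-page URL. Whether Googlebot reads the value
    // before or after this runs is exactly the open question above.
    window.addEventListener('load', function () {
      document.getElementById('canon').setAttribute(
        'href', 'https://example.com/gu10-led-bulbs');
    });
  </script>
</head>
```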
Technical SEO | TrueluxGroup1
-
Site splitting value of our pages with multiple variations. How can I fix this with the least impact?
Just started at a company recently, and there is a preexisting problem that I could use some help with. Somebody please tell me there is a low impact fix for this: My company's website is structured so all of the main links used on the nav are listed as .asp pages. All the canonical stuff. However, for "SEO Purposes," we have a number of similar (not exact) pages in .html on the same topic on our site. So, for example, let's say we're a bakery. The main URL, as linked in the nav, for our Chocolate Cakes, would be http://www.oursite.com/chocolate-cakes.asp. This differentiates the page from our other cake varieties, such as http://www.oursite.com/pound-cakes.asp and http://www.oursite.com/carrot-cakes.asp. Alas, fully indexed in Google with links existing only in our sitemap, we also have: http://www.oursite.com/chocolate-cakes.html http://www.oursite.com/chocolatecakes.html http://www.oursite.com/cakes-chocolate.html This seems CRAZY to me, because wouldn't this split our search results 4 ways? Am I right in assuming this is destroying the rankings of our canonical pages? I want to change this, but problem is, none of the content is the same on any of the variants, and some of these pages rank really well - albeit mostly for long tail keywords instead of the good, solid keywords we're after. So, what I'm asking you guys is: How do I burn these .html pages to the ground without completely destroying our rankings for the other keywords? I want to 301 those pages to our canonical nav URLs but, because of the wildly different content, I'm afraid that we could see a heavy drop in search traffic. Am I just being overly cautious? Thanks in advance!
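One lower-impact option I've been weighing, before resorting to 301s, is pointing each .html variant's rel=canonical at the matching .asp page, something like this (a sketch with the bakery example's made-up URLs; note that canonical is only a hint, and with wildly different content on the variants, Google may well ignore it):

```html
<!-- Hypothetical: in the head of chocolatecakes.html and the other
     variants, nominate the nav-linked .asp page as canonical. -->
<head>
  <link rel="canonical" href="http://www.oursite.com/chocolate-cakes.asp" />
</head>
```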
Technical SEO | jdsnyc20
-
Why are only a few of our pages being indexed
Recently rebuilt a site for an auctioneers; however, it has a problem in that none of the lots and auctions on the new site are being indexed by Google, only pages like About, FAQ, Home, and Contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so no crawling issue stands out. I've set the "URL Parameters" to no effect, too. I also built a sitemap with all the lots in and pushed it to Google, which then crawled them all (massive spike in crawl rate for a couple of days), yet it's still indexing just a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | Blue-shark0
-
If I want to clean up my URLs and take "www.site.com/page.html" and make it "www.site.com/page", do I need a redirect?
If I want to clean up my URLs and take "www.site.com/page.html" and make it "www.site.com/page", do I need a redirect? If this scenario requires a 301 redirect no matter what, I might as well update the URL to be a little more keyword-rich for the page while I'm at it. However, since these pages are ranking well, I'd rather not lose any authority in the process, and would prefer to keep the URL just stripped of the ".html" (if that's possible). Thanks for your help! [edited for formatting]
Technical SEO | Booj0
-
After I 301 redirect duplicate pages to my rel=canonical page, do I need to add any tags or code to the non canonical pages?
I have many duplicate pages. Some pages have 2-3 duplicates. Most of which have Uppercase and Lowercase paths (generated by Microsoft IIS). Does this implementation of 301 and rel=canonical suffice? Or is there more I could do to optimize the passing of duplicate page link juice to the canonical. THANK YOU!
Technical SEO | PFTools0
-
Significance of Page speed to SEO?
I am in the middle of optimizing sites for SEO, and am wondering how big a factor it is to get page load speed under 1.5 seconds. I am prioritizing tasks and want to know how much this could affect traffic. Thanks
Technical SEO | Zachary_Russell1
-
SEO Tomfoolery
Oh Hai, I recently changed the permalink structure on my Wordpress based site, southwestbreaks.co.uk from the standard ?p=123 to a more SEO chummy /%postname%/. As a result, my site has completely dropped off the board for all my previously well ranked search phrases. Having since gotten into SEOmoz a bit more, I can see there are WP plugins available that apparently would've done this a lot more smoothly. I'd be most grateful if someone could explain if this drop off is just temporary, or have I somehow entered Google's shun book? The site has been like this for about 48 hours. Thanks, Tim
Technical SEO | Southwesttim0