How well do Google's crawlers understand foreign websites?
-
I speak 5 languages and therefore have the opportunity to do on-page SEO and content writing for 5 different cultures. To me this question has a lot to do with the way Google Translate works. It doesn't, trust me! Which makes me wonder how the web crawlers, which are designed with English in mind, can fairly and equally attribute the same ranking points to a foreign website.
Since Google seems to use semantic search technology, I'm wondering whether foreign sites have it easier or not.
Any ideas?
-
Actually, what Russ said is correct. European languages, although stemming from three different roots (Germanic, Latin and Slavic), are quite similar in how they work semantically.
In fact, the problems Google may have are with non-European languages, like Chinese, Japanese or Arabic, which are conceived completely differently (just think of Chinese or Japanese, which are ideogram-based). That's also one of the reasons why it took Google a long time to launch versions in those languages.
Another question is whether the quality of the regional Googles is high or not, but that could start a long thread...
-
A lot of "semantic" search technology is multilingual, as it looks at factors like word proximity and collocation, which still matter regardless of language. Moreover, I am fairly certain Google has teams of developers dedicated to search in multiple languages. While you can expect that they won't be quite as sophisticated in a non-English search engine, whatever gains you might find because that Google isn't as "smart" simply mean that more people can effectively compete against you.
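To make the point about proximity concrete, here is a toy sketch (my own illustration, not Google's actual algorithm) of a proximity score that behaves identically across languages, since it only looks at token positions, not on the words themselves:

```python
# Toy sketch (not Google's algorithm): a language-agnostic proximity score.
# It only measures how close two query terms appear in a token list, so it
# behaves the same for English, Dutch, or any tokenizable language.
def proximity_score(tokens, term_a, term_b):
    positions_a = [i for i, t in enumerate(tokens) if t == term_a]
    positions_b = [i for i, t in enumerate(tokens) if t == term_b]
    if not positions_a or not positions_b:
        return 0.0  # one of the terms is missing entirely
    closest = min(abs(a - b) for a in positions_a for b in positions_b)
    if closest == 0:
        return 1.0  # identical terms sharing a position
    return 1.0 / closest

english = "cheap flights to cheap hotels in rome".split()
dutch = "goedkope vluchten naar goedkope hotels in rome".split()
print(proximity_score(english, "cheap", "hotels"))    # -> 1.0 (adjacent)
print(proximity_score(dutch, "goedkope", "hotels"))   # -> 1.0 (same structure)
```

The identical scores illustrate the answer's point: a signal like proximity needs no understanding of the language at all.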
Related Questions
-
Can a Global Website Rely on Browser Settings for Translation?
Our website serves a global market, and over a year ago we launched 8 language variations of the site and implemented hreflang tags. These language-variation pages are proving difficult to maintain, and in Search Console they're triggering thousands of errors. I have double-checked our implementation and it's not perfect, so I understand the errors. Here's the question, though... the 8 language variations of the site are receiving less than 1% of our web traffic, despite 40% of our web traffic coming from countries outside of North America every month. I want to know if we can eliminate the headache of these 8 language variations altogether, remove our attempt at hreflang, and simply rely on the browser settings of the user to dictate what language the website appears in for them. If not, is there a simpler solution than hreflang and attempting to maintain a very large website in 8 languages? Thanks for your input! Niki
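As a point of reference for the maintenance burden described in this question (a minimal sketch with hypothetical example.com URLs, not the site's actual markup): hreflang can be declared as link elements in each page's head, with x-default designating the fallback shown when no language matches the user:

```html
<!-- hypothetical URLs; one line per language variation, plus a fallback -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that every language variation has to carry the full reciprocal set pointing back at all the others, which is exactly why large multilingual sites find hreflang hard to keep error-free.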
International SEO | NikelleClark
-
Problems with the Google cache version of different domains.
We have problems with the Google cache version of different domains. For the ".nl" domain we have a ".be" cache. Enter "cache:www.dmlights.nl" in your browser to see this result. The following points have already been addressed:

- The sitemap contains hreflang tags
- The sitemap was moved to www.dmlights.nl/sitemap.xml
- We checked the DNS configuration
- Changed the Content-Language response header to: Content-Language: nl-NL
- Removed the cache with Webmaster Tools
- Resolved server request errors

Can anyone provide a solution to fix this problem? Thanks, Pieter
International SEO | Humix
-
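As a concrete reference for the Content-Language step mentioned in the question above (a hypothetical nginx sketch, not the poster's actual configuration):

```nginx
# hypothetical sketch: emit the Content-Language header for the .nl site
server {
    server_name www.dmlights.nl;
    add_header Content-Language "nl-NL";
}
```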
Multilingual website - URL problem (sitemap)
At this moment our website both uses the language in the URL, like "en", and localizes the URL itself ("books" in English and "boeken" in Dutch). Because of the history of making our website multilingual, we have a system that takes the browser language for the localization if the URL doesn't contain a language like "en". This means:

www.test.com/books = browser language
www.test.com/en/books = English language
www.test.com/boeken = browser language
www.test.com/nl/boeken = Dutch language

Now for the sitemap this makes it a little troublesome for me, because which hreflang is used for which URL?

1) The first thing I thought of was using x-default for all URLs that get the language of the browser. <code><url><loc>http://www.test.com/books</loc></url></code> But as you can see, we now have x-default twice.

2) Another solution I thought of was just using the localization of the URL to determine the language, like: <code><url><loc>http://www.test.com/books</loc></url></code> But now we have two of each language for the same page.

3) The last solution I thought of was removing links without a language in the URL (except for the homepage, which will still have an x-default), like: <code><url><loc>http://www.test.com/en/books</loc></url></code> But for this solution I need to put 301s on pages that are "deleted" and also need to change the system to 301 to the right page. Although the last point isn't really a problem, I'm kind of worried that I will lose some of the "SEO points" with a 301. (When we changed our domain in the past, we had a bad experience with the 301s of our old domain.)

What do you think would be the best solution for SEO? Or do you have any other suggestions or solutions I haven't thought of?
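For reference, a sitemap hreflang entry normally lists every alternate under one url element via xhtml:link annotations (a minimal sketch using the hypothetical test.com URLs from the question, following option 3's structure):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.test.com/en/books</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.test.com/en/books"/>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.test.com/nl/boeken"/>
    <xhtml:link rel="alternate" hreflang="x-default" href="http://www.test.com/books"/>
  </url>
</urlset>
```

Because each language-specific URL gets its own hreflang value and the browser-language URL is the single x-default, the duplicate-annotation problem in options 1 and 2 does not arise.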
International SEO | Anycoin
-
Google Analytics & Webmaster Tools Filtering
I've just set up a client who uses internationalization on Google Webmaster Tools and Google Analytics. For easier management they opted to use subfolders rather than subdomains or ccTLDs. So I set up a Google Analytics property with one unfiltered profile and another 3 profiles filtered per language; for the main language, English, the filter excludes anything starting with /fr/ or /de/, since English resides on the root. The filters seem to work fine; however, after linking this to the Google Webmaster account to be able to access Search Engine Optimization reports, I do not seem to get any language-filtered data. I was wondering if someone had any idea or possible solution to this problem, as I would expect to at least have the landing pages, if not exactly the keywords, filtered by the same criteria as the rest of the data. I know there's also an option to create a separate Webmaster Tools account; however, that way I still cannot filter just the English, and I cannot link it to all the separate profiles.
International SEO | jonmifsud
Blocking domestic Googles in robots.txt
Hey, I want to block Google.co.uk from crawling a site but want Google.de to crawl it. I know how to configure robots.txt to block Google and other engines - is there a fix to block certain domestic crawlers? Any ideas? Thanks, B
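For context (a minimal robots.txt sketch, not a per-country solution): robots.txt rules match crawler user-agents, and Google uses the single Googlebot crawler for every country version of its search engine, so a block applies to google.co.uk and google.de alike:

```txt
# robots.txt - this blocks Googlebot everywhere; Google does not run
# separate crawlers per country domain, so there is no user-agent that
# targets google.co.uk without also affecting google.de
User-agent: Googlebot
Disallow: /
```

Country-level targeting normally has to happen elsewhere (e.g. geo-targeting settings or hreflang), not in robots.txt.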
International SEO | Bush_JSM
How do I get a UK website to rank in Dubai?
We are trying to get a UK-based children's furniture website to rank in Dubai. We have had a couple of orders from wealthy expats in Dubai and it seems to be the correct target market. Does anyone have any specific knowledge of this area? We are promoting the same website as for the UK market. Also does anyone know any user behaviour stats on expatriates using search engines? Do they carry on using the version of Google they are used to, or do most change to the local version of Google? Thanks in advance
International SEO | Wagada
IP Redirection vs. cloaking: no clear directives from Google
Hi there, Here is our situation: we need to force an IP redirection for our US users to www.domain.com, and at the same time we have different country-specific subfolders with their own language, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which has a US IP) from crawling our country-specific subfolders. I didn't find any clear directives from Google representatives on that matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand, in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems a contradiction to me... Thank you for your help!! Matteo
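One cloaking-free alternative sometimes used instead of forced IP redirects (a hypothetical sketch, not advice given in the thread): suggest a locale from the Accept-Language request header, so every URL stays crawlable and Googlebot sees the same pages as users:

```python
# Hypothetical sketch: choose a suggested locale from the Accept-Language
# header rather than the client IP. The supported tuple and default are
# illustrative assumptions, not the site's real configuration.
def suggested_locale(accept_language, supported=("en", "fr", "de"), default="en"):
    # Accept-Language looks like "fr-FR,fr;q=0.9,en;q=0.8" - entries are
    # comma-separated and already ordered by user preference.
    for part in (accept_language or "").split(","):
        code = part.split(";")[0].strip().lower()[:2]  # "fr-FR" -> "fr"
        if code in supported:
            return code
    return default

print(suggested_locale("fr-FR,fr;q=0.9,en;q=0.8"))  # -> fr
print(suggested_locale(""))                          # -> en (fallback)
```

A site can use the result to show a "view this page in French?" banner rather than redirecting outright, which sidesteps the Googlebot-has-a-US-IP problem entirely.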
International SEO | H-FARM
Google Webmaster Tools - International SEO Geo-Targeting site with Worldwide rankings
I have a client who already has rankings in the US & internationally. The site is broken down like this:

- url.com (main site with USA & international rankings)
- url.com/de
- url.com/de-english
- url.com/ng
- url.com/au
- url.com/ch
- url.com/ch-french
- url.com/etc

Each folder has its own sitemap & content relevant to its respective country. I am reading in Google Webmaster Tools > Site config > Settings, the option under 'Learn More': "If you don't want your site associated with any location, select Unlisted." If I want to keep my client's international rankings the way they currently are on url.com, I should NOT geo-target the United States, so I select Unlisted, right? Would I use geo-targeting on url.com/de, url.com/de-english, url.com/ng, url.com/au and so on?
International SEO | Francisco_Meza