Why is GoogleBot crawling our German site and rendering it in English?
-
We have a German website at http://de.pa.com and we can't get the search engines to index the site in German. For some reason GoogleBot, BingBot, etc. are crawling de.pa.com and displaying English text on the SERP. I've tested via web-sniffer.net and Google Webmaster Tools, and both show de.pa.com being crawled in English.
We know the page titles and meta descriptions are in English, and we are updating them to German, but I'm curious as to why search engines are indexing our German site and displaying English text on the SERP when the entire content of the site is in German.
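For context, here is a rough sketch of the kind of language markup I'd expect a page on de.pa.com to carry once the titles and descriptions are translated. The en.pa.com URL is just a placeholder I've made up for wherever an English version might live, and the German strings are illustrative only:

```html
<!-- Rough sketch only: en.pa.com is a made-up placeholder, not a real alternate URL -->
<html lang="de">
<head>
  <!-- German title and description, matching the on-page language -->
  <title>Beispielseite - Deutscher Titel</title>
  <meta name="description" content="Deutsche Meta-Beschreibung der Seite.">

  <!-- hreflang annotations: each language version references itself and the other -->
  <link rel="alternate" hreflang="de" href="http://de.pa.com/" />
  <link rel="alternate" hreflang="en" href="http://en.pa.com/" />
</head>
<body>
  <!-- German body content -->
</body>
</html>
```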
Thank you,
Brian
-
Looking at the site, it was cached on the 1st of February and is in English. Have you recently changed it over to German?
It seems possible that Google hasn't cached the updated version of your site recently; you may just need to wait for it to be re-crawled.
-
Hi Simon, thanks for the help and for noticing that! We actually found there were 302 redirects causing issues, which we removed on February 5th, so you're right, Google probably hasn't cached the updated version of our site yet. Hopefully!
Related Questions
-
Targeting/Optimising for US English in addition to British English (hreflang tags)
International SEO | IronBeetle
Hi, I wonder if anyone can help? We have an e-commerce website based in the UK and we sell to customers worldwide; after the UK, the US is our second biggest market. We are English language only (written in British English) and we do not have any geo-targeted language versions of our website, yet we sell successfully around the world on a regular basis. We have developers working on a new site due to launch in Winter 2021, which will include a properly managed site migration from our .net to a .com domain and the associated redirects etc.

Management are keen to increase sales and conversions in the US before the new site launches. They have requested that we create a US-optimised version of the site, maintaining broadly the same content but dynamically replacing keywords. For example (clothing is not really what we sell):

- Replacing references to “trainers” with “sneakers”
- Replacing references to “jumpers” with “sweaters”
- Replacing the UK phone number with a US phone number

It seems the wrong time to implement a major overhaul of URL structure, considering the planned migration from .net to .com in the not too distant future. For example, I'm not keen to move British English content onto https://www.example.com/en-gb. Would this be a viable solution?

1. hreflang: non-US visitors directed to the existing URL structure (including en-gb customers): https://www.example.com/
2. hreflang: US language version of the site: https://www.example.com/en-us/

As the UK is our biggest market, it is really important that we don't negatively affect sales; we have extremely good visibility in SERPs for a wide range of high-value, well-converting keywords. In terms of hreflang tags, would something like the sketch below work? Do we need to make reference to en-gb being on https://www.example.com/? This seems a bit of a ‘half-way-house’. I recognise that there are also issues around the URL structure, which is optimised for British English/international English keywords rather than US English, e.g. https://www.example.com/clothing/trainers vs. https://example.com/clothing/sneakers. Any advice / insight / guidance would be welcome. Thanks.
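Roughly this is what I have in mind (a sketch only; the x-default line is just my assumption about a catch-all for everyone else, not something decided yet):

```html
<!-- Sketch for the homepage pair; every page would carry the equivalent set
     pointing at its own alternates -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```
-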
Google Search Console "International Targeting" is reporting errors that are not present on my site
International SEO | Starcom_Search
We are currently handling search for a global brand, www.example.com/, which has a presence in many countries worldwide. To help Google understand that there is an alternate version of the website available in another language, we have used hreflang tags. These hreflang tags are implemented only via the XML sitemap across all geo-locations.

Under the “Search Analytics -> International Targeting” section in Google Search Console, for the Malaysian website (www.example.com/my/), a number of “no return tags (sitemaps)” errors are arising. For example, for India as a geo-location, there is one ‘en-IN’ no return tags (sitemaps) error listed:

Originating URL: www.example.com/my/xyz/
Alternate URL: www.example.com/in/xyz/

When the XML sitemap for www.example.com/in/ was checked for hreflang tags, we found that the implementation for www.example.com/in/xyz/ was perfectly fine and it was providing a return tag to www.example.com/my/xyz/. After code-level verification, we confirmed that the hreflang implementation via the XML sitemap is correct, yet the error still persists in Google Search Console. Kindly suggest a solution to this situation, and also advise what effect these errors have on search engine performance.
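For reference, the reciprocal sitemap entries being described would look roughly like this (a sketch only, not the actual sitemap; the ‘en-MY’ code for the Malaysian pages is my assumption, and the xhtml namespace must be declared on the urlset element):

```xml
<!-- Sketch of the entry in the /my/ sitemap; the /in/ sitemap carries the
     mirror-image entry. Requires xmlns:xhtml="http://www.w3.org/1999/xhtml"
     on the <urlset> element. -->
<url>
  <loc>https://www.example.com/my/xyz/</loc>
  <xhtml:link rel="alternate" hreflang="en-MY" href="https://www.example.com/my/xyz/" />
  <xhtml:link rel="alternate" hreflang="en-IN" href="https://www.example.com/in/xyz/" />
</url>
```
-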
My indexed site URL was removed from Google search without any message or Manual Action?
International SEO | newwaves
On August 2 or 3 (I'm not sure about the exact date), the main URL of my website, https://new-waves.net/, was completely removed from Google search results without any message or Manual Action in Search Console. I can still find some of my site's subpages in search results and on Google local maps results. When I checked on Google:

info:new-waves.net >> no results
site:new-waves.net >> the main URL only appears now because I have submitted it again and again to Google, but it might be dropped again in a day or two, as has happened over the last few days
100% of all ranked keywords >> my site URL new-waves.net has been completely removed from all results, although I can still see it on the map in some results

My site ranked number 1 on Google for "digital marketing qatar" and some other keywords, but the main URL has been removed from 100% of search results; you can still see it on the map only. I have tried to submit it again to Google and to index it through the Search Console tool, but still get no results. Can anyone help me understand the reason, and how can I solve this issue without losing my previously ranked keywords? Can I submit a direct message to Google support or customer service to find out the reason or get help with this issue? Thanks & Regards
-
International Sites and Duplicate Content
International SEO | guidoampollini
Hello, I am working on a project where I have some doubts regarding the structure of international, multi-language sites. The website is in the fashion industry, and I think this is a common problem for the industry: the site is translated into 5 languages and sells in 21 countries. As you can imagine, this creates a huge number of URLs, so many that with Screaming Frog I can't even complete the crawl. For example, the UK site is visible in all of these versions:

http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/

Obviously, for SEO only the first version is important. As another example, the French site is available in 5 languages again:

http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/

And so on. This is creating 3 issues mainly:

1. Endless crawling, with crawlers not focusing on the most important pages
2. Duplication of content
3. Wrong geo URLs ranking in Google

I have already implemented hreflang (a sketch of the general pattern is below) but haven't noticed any improvements. Therefore my question is: should I exclude the inappropriate targeting with robots.txt and "noindex"? Perhaps for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve by doing this is to have the crawlers more focused on the important SEO pages, avoid content duplication, and avoid the wrong URLs ranking in local Google. Please comment.
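For context, the general hreflang pattern for one of the GB pages would look roughly like this (a sketch only, not my actual implementation; the homepage URLs stand in for whichever page is being annotated):

```html
<!-- Sketch: hreflang cluster for one GB page, each language version listing all five -->
<link rel="alternate" hreflang="en-GB" href="http://www.MyDomain.com/en/GB/" />
<link rel="alternate" hreflang="it-GB" href="http://www.MyDomain.com/it/GB/" />
<link rel="alternate" hreflang="fr-GB" href="http://www.MyDomain.com/fr/GB/" />
<link rel="alternate" hreflang="de-GB" href="http://www.MyDomain.com/de/GB/" />
<link rel="alternate" hreflang="es-GB" href="http://www.MyDomain.com/es/GB/" />
```
-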
How to set up international SEO for English-speaking countries
International SEO | CoGri
Hi, my company has offices around the world, but they provide different services and products depending on the region. For example, our offices in the USA, UK and Australia all provide different services from one another. My question is: how do I set up my WordPress website to cater for these different countries and services? I think the simple answer would be to build a separate website for each, but this would be too costly and we don't have the resources to maintain all three. Many thanks for your time, Tom
-
Ranking in Different Countries - Ecommerce site
International SEO | ramarketing
My client has a .com ecommerce site with UK-based servers, and he wants to target two other countries (both English speaking). By the looks of it, he wouldn't want to create separate local TLDs targeting each country, so I wanted to suggest adding subdomains or subfolders geo-targeted to each country they want to target. However, I'm worried that this will cause duplicate content issues... What do you think would be the best solution? Any advice would be greatly appreciated! Thank you!
-
SEO Audit "Hybrid Site"
International SEO | sjcbayona-41218
Hi everyone! I'm trying to analyze a website which is regional in scope. The site for every market has been built out like this:

http://subdomain.rootdomain.com/market, e.g. http://asiapacific.thisismybrandname.com/ph
or
http://subdomain.rootdomain.com/language, e.g. http://asiapacific.thisismybrandname.com/en

Since this is the first time I'm working on these kinds of sites, I'd like to ask for any guidance or tips on how to go about an SEO site and technical audit. FYI, the owner of the sites is not giving me access to their webmaster account data nor their analytics tracking tool. Thanks everyone! Steve
-
Site structure for multi-lingual hotel website (subfolder names)
International SEO | underground
Hi there superMozers! I've read quite a few questions about multi-lingual sites but none answered my doubt/idea, so here it is: I'm re-designing an old website for a hotel in 4 different languages, all hosted on the same .com domain, as follows:

example.com/english/ for English
example.com/espanol/ for Spanish
example.com/francais/ for French
example.com/portugues/ for Portuguese

While doing keyword research, I have noticed that many travel agencies separate geographical areas by folders. An agency promoting beach hotels in South America will have a structure such as travelagency.com/argentina-beach-hotels/ and travelagency.com/peru-beach-hotels/, listing hotels in each folder and therefore benefiting from those keywords to rank ahead of many independent hotel sites from those areas.

What I would like to do, rather than just naming those folders with the traditional /en/ for English or /fr/ for French etc., is take advantage of this extra language subfolder to include important keywords in the names of the subfolders, in the following way (supposing we have a beach hotel in Argentina):

example.com/argentina-beach-hotel/ for English
example.com/hotel-playa-argentina/ for Spanish
example.com/hotel-plage-argentine/ for French
example.com/hotel-praia-argentina/ for Portuguese

Note that the same keywords are used in the name of the folder, but translated into the language of each subfolder. To make things clear for the search engines, I would specify the language in the HTML for each page (roughly as in the sketch below). My doubt is whether Google or other search engines may consider this ‘stuffing’, although most travel agencies do it in their site structure. Do any Mozers have experience with this, any idea how search engines may react, or whether they could penalise the site? Thanks in advance!
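For illustration, the per-page language declarations I have in mind would look roughly like this (a sketch only, shown for the English version of the Argentina page; the hreflang links are my addition showing how the four versions could reference each other, and the other versions would mirror this with their own lang values and URLs):

```html
<!-- Sketch: English version of the Argentina page; the Spanish, French and
     Portuguese versions carry the same hreflang set with their own lang attribute -->
<html lang="en">
<head>
  <link rel="alternate" hreflang="en" href="https://example.com/argentina-beach-hotel/" />
  <link rel="alternate" hreflang="es" href="https://example.com/hotel-playa-argentina/" />
  <link rel="alternate" hreflang="fr" href="https://example.com/hotel-plage-argentine/" />
  <link rel="alternate" hreflang="pt" href="https://example.com/hotel-praia-argentina/" />
</head>
</html>
```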