Tough SEO problem, Google not caching page correctly
-
My web site is
http://www.mercimamanboutique.com/
The cached version of the French site,
cache:www.mercimamanboutique.com/fr-fr/
is showing incorrectly.
The German version,
cache:www.mercimamanboutique.com/de-de/
is showing correctly.
I have resubmitted sitelinks and asked Google to re-index the web site many times. The German version always gets cached properly, but the French version never does. This is frustrating me; any idea why? Thanks.
-
It seems you have a system which redirects users to a default page.
When I visit http://www.mercimamanboutique.com/ it has the canonical http://www.mercimamanboutique.com/fr-fr/; when I switch to German and go back to the same URL, it has the canonical http://www.mercimamanboutique.com/de-de/.
The site also exists in both https and http; it may be better to redirect everything to https (although this is probably not related to the issue you encounter).
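As an illustrative sketch only (not the site's actual configuration): on an Apache server with mod_rewrite available, a blanket http-to-https redirect could be added to the .htaccess file like this. The rule names and flags are standard mod_rewrite; whether this site runs Apache at all is an assumption.

```apache
# Redirect all http traffic to https, preserving host, path and query string
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

On nginx or another server the equivalent would be a server-level 301 rather than a rewrite rule.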
Dirk
Update: I also noticed that the main rel alternate doesn't exist - http://www.mercimamanboutique.com/ is redirected to http://www.mercimamanboutique.com - it's better not to use URLs that are redirected, but to point to the final destination URL.
-
Hi Dirk,
By IP, are you referring to our server IP address?
It is a strange issue, since the French version has all of the correct-language sitelinks; only the German version displays mixed languages. I will make the modifications you suggested and check again after a few days; hopefully this fixes the problem.
-
If you check the Q&A, you're not the only one to encounter this problem. In some of these cases it was caused by IP-detection systems which automatically selected the country website based on IP (which caused problems because Googlebot mainly crawls from a US IP address) - I assume this is not the case with your site?
It's quite possible that it will solve the issue. The easiest way to check is to make the modifications and see after a few days whether the problem is solved. If not, there is probably another issue.
Dirk
-
Thank you Dirk. By fixing the hreflang tags, would this also fix the problem with the sitelinks appearing in the correct language? Currently, for the German version of the web site, we are seeing both English and German sitelinks, and we are now unable to demote the English sitelinks because Google removed that feature from Webmaster Tools. See the screenshot below.
http://i.imgur.com/OBz4qqW.jpg
Thank you again!
-
Sorry for the late reply - you also have to add the self-referencing version; check the example on https://support.google.com/webmasters/answer/189077?hl=en (so each version of the page will have the same block of hreflang URLs).
Apart from that, reciprocal means that if page A has an hreflang to page B, B should have an hreflang back to A. I don't really see why the tool indicates this error, as the pages are cross-referencing each other - it could be caused by the missing self-referencing version.
There is a nice tool to generate the hreflang tags - http://www.aleydasolis.com/en/international-seo-tools/hreflang-tags-generator/ - you could try it and compare the generated version with what you actually have on your page.
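To make the reciprocity rule concrete, here is a small Python sketch that extracts `<link rel="alternate" hreflang="…">` tags from two pages' HTML and checks that both pages carry the same block (which covers both the reciprocal and the self-referencing requirement). The HTML snippets and example.com URLs are hypothetical stand-ins, not the site's real markup:

```python
# Sketch: parse hreflang tags from two pages and verify they carry the
# identical block (reciprocal + self-referencing). Uses only the stdlib.
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    """Collects hreflang -> href pairs from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a["href"]

def hreflang_map(html):
    parser = HreflangParser()
    parser.feed(html)
    return parser.alternates

# Hypothetical markup for a French and a German version of the same page.
fr_html = '''<link rel="alternate" hreflang="fr" href="http://example.com/fr-fr/">
<link rel="alternate" hreflang="de" href="http://example.com/de-de/">'''
de_html = '''<link rel="alternate" hreflang="fr" href="http://example.com/fr-fr/">
<link rel="alternate" hreflang="de" href="http://example.com/de-de/">'''

fr_tags, de_tags = hreflang_map(fr_html), hreflang_map(de_html)
print(fr_tags == de_tags)                   # identical blocks on both pages
print("fr" in fr_tags and "de" in fr_tags)  # each page also references itself
```

If the two dictionaries differ, or a page's own language code is missing from its own block, the implementation fails the rules Google describes.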
Dirk
-
You have an issue with your hreflang tags, which seems to confuse Google. You can use these tools to check your implementation: https://flang.dejanseo.com.au/ or https://technicalseo.com/seo-tools/hreflang/ - both indicate issues with it. The main issue is that the self-referencing hreflang tag is missing (check https://support.google.com/webmasters/answer/189077?hl=en). You could also add an x-default URL - for the languages/countries that are not specified - pointing to one of the versions.
Personally, I would remove the strict country limitations on the de/fr versions - why would you send Austrian visitors to the .com version and not to the /de-de version? I would rather use hreflang="de" in this case, and the same for fr (unless you don't ship to those countries).
On each of your pages you should have all the tags. Example for your home:
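A sketch of what that block might look like for the home page, using the broader language codes suggested above; treat the exact URLs and the choice of x-default target as assumptions based on this thread, not a verified configuration:

```html
<!-- The same block appears on every language version, including itself -->
<link rel="alternate" hreflang="fr" href="http://www.mercimamanboutique.com/fr-fr/" />
<link rel="alternate" hreflang="de" href="http://www.mercimamanboutique.com/de-de/" />
<link rel="alternate" hreflang="x-default" href="http://www.mercimamanboutique.com/" />
```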
Hope this helps,
Dirk
-
Could you describe the differences you see? When I look at the cached pages for both of them, I don't see any big differences that would indicate an issue.