Tough SEO problem, Google not caching page correctly
-
My website is
http://www.mercimamanboutique.com/
The cached version of the French site,
cache:www.mercimamanboutique.com/fr-fr/
is showing incorrectly.
The German version:
cache:www.mercimamanboutique.com/de-de/
is showing correctly.
I have resubmitted sitelinks and asked Google to re-index the website many times. The German version always gets cached properly, but the French version never does. This is frustrating me; any idea why? Thanks.
-
It seems you have a system which redirects users to the default page.
When I visit http://www.mercimamanboutique.com/ it has a canonical of http://www.mercimamanboutique.com/fr-fr/; when I switch to German and go back to the same URL, the canonical is http://www.mercimamanboutique.com/de-de/.
The site exists on both https and http - it may be better to redirect everything to https (although this is probably not related to the issue you're encountering).
Dirk
Update: I also noticed that the main rel alternate doesn't exist - http://www.mercimamanboutique.com/ is redirected to http://www.mercimamanboutique.com - it's better not to use URLs that are redirected, but the final destination URLs instead.
-
Hi Dirk,
By IP are you referring to our server IP address?
It is a strange issue, since the French version has all of the correct-language sitelinks; only the German version displays mixed languages. I will make the modifications you suggested earlier and check again after a few days; hopefully this fixes the problem.
-
If you check the Q&A, you're not the only one to encounter this problem. In some of those cases it was caused by IP-detection systems which automatically selected the country website based on IP (which caused problems because Googlebot mainly crawls from a US IP address) - I assume this is not the case with your site?
It's quite possible that it will solve the issue. The easiest way to check is to make the modifications and see after a few days whether the problem is solved. If not, there is probably another issue.
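The failure mode described above can be sketched in a few lines (the function names and logic here are purely illustrative, not the site's actual code): pure IP detection sends Googlebot, which crawls from US IP addresses, to the default version every time, while picking the locale from the browser's Accept-Language header degrades more gracefully for crawlers.

```python
# Illustrative sketch only - not the site's actual redirect logic.

SUPPORTED = {"fr": "/fr-fr/", "de": "/de-de/"}

def locale_by_ip(country_code):
    """IP-only detection: Googlebot crawls from US IP addresses,
    so it always lands on the default and may never be shown the
    /fr-fr/ or /de-de/ content it is trying to cache."""
    return SUPPORTED.get(country_code.lower(), "/")

def locale_by_header(accept_language):
    """Header-based detection: Googlebot sends no French/German
    preference and gets the default, while real French and German
    browsers still reach their own version."""
    for part in accept_language.split(","):
        lang = part.split(";")[0].strip().lower()[:2]
        if lang in SUPPORTED:
            return SUPPORTED[lang]
    return "/"

print(locale_by_ip("us"))                  # Googlebot -> "/"
print(locale_by_header("fr-FR,fr;q=0.9"))  # French browser -> "/fr-fr/"
```

Either way, the least risky pattern for crawlers is not to force-redirect at all and let the hreflang annotations do the routing.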
Dirk
-
Thank you Dirk - by fixing the hreflang tags, would this also fix the problem with the sitelinks appearing in the correct language? Currently, for the German version of the website, we are seeing both English and German sitelinks, and we are now unable to demote the English sitelinks since Google removed that feature from Webmaster Tools. See the screenshot below.
http://i.imgur.com/OBz4qqW.jpg
Thank you again!
-
Sorry for the late reply - you also have to add the self-referencing version; check the example on https://support.google.com/webmasters/answer/189077?hl=en (each version of the page should carry the same block of hreflang URLs).
Apart from that - reciprocal means that if page A has an hreflang to page B, then B should have an hreflang back to A. I don't really see why the tool reports this error, as the pages are cross-referencing each other - it could be caused by the missing self-referencing version.
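The reciprocal rule can be sketched as the kind of check those testing tools run (the example.com URLs and data shape below are purely illustrative): with the two pages cross-referencing each other but not themselves, the only errors reported are the missing self-references - which matches the symptom above.

```python
# Sketch of an hreflang reciprocity/self-reference check.
# Data shape: page URL -> {hreflang value: target URL found on that page}

PAGES = {
    "http://example.com/fr-fr/": {"de-de": "http://example.com/de-de/"},
    "http://example.com/de-de/": {"fr-fr": "http://example.com/fr-fr/"},
}

def hreflang_errors(pages):
    """Report missing self-references and non-reciprocal annotations."""
    errors = []
    for url, targets in pages.items():
        if url not in targets.values():
            errors.append("missing self-reference on %s" % url)
        for target in targets.values():
            if url not in pages.get(target, {}).values():
                errors.append("%s -> %s is not reciprocal" % (url, target))
    return errors

# The two pages reference each other, so the only errors reported
# are the two missing self-references:
for e in hreflang_errors(PAGES):
    print(e)
```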
There is a nice tool to generate the hreflang tags - http://www.aleydasolis.com/en/international-seo-tools/hreflang-tags-generator/ - you could try it and compare the generated version with what you actually have on your page.
Dirk
-
You have an issue with your hreflang tags, which seems to confuse Google. You can use these tools to check your implementation: https://flang.dejanseo.com.au/ or https://technicalseo.com/seo-tools/hreflang/ - both indicate problems. The main issue is that the self-referencing hreflang tag is missing (check https://support.google.com/webmasters/answer/189077?hl=en). You could also add an x-default URL - for the languages/countries that are not specified - pointing to one of the versions.
Personally, I would remove the strict country limitations on the de/fr versions - why send Austrian visitors to the .com version rather than to /de-de/? I would rather use hreflang="de" in this case, and the same for fr (unless you don't ship to those countries).
On each of your pages you should have the full set of tags. Example for your home page:
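A minimal sketch of such a block, assuming the URLs from this thread and the broader hreflang="fr"/"de" values suggested above (use "fr-fr"/"de-de" instead if you keep the country targeting):

```html
<!-- Identical block on every version of the page, self-reference included -->
<link rel="alternate" hreflang="fr" href="http://www.mercimamanboutique.com/fr-fr/" />
<link rel="alternate" hreflang="de" href="http://www.mercimamanboutique.com/de-de/" />
<link rel="alternate" hreflang="x-default" href="http://www.mercimamanboutique.com/" />
```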
Hope this helps,
Dirk
-
Could you describe the differences you are seeing? When I look at the cached pages for both, I don't see any big differences that would indicate an issue.