Tough SEO problem, Google not caching page correctly
-
My website is
http://www.mercimamanboutique.com/
The cached version of the French version,
cache:www.mercimamanboutique.com/fr-fr/
is showing incorrectly.
The German version:
cache:www.mercimamanboutique.com/de-de/
is showing correctly.
I have resubmitted sitelinks and asked Google to re-index the site many times. The German version always gets cached properly, but the French version never does. This is frustrating; any idea why? Thanks.
-
It seems you have a system that redirects users to a default page.
When I visit http://www.mercimamanboutique.com/ it has a canonical of http://www.mercimamanboutique.com/fr-fr/; when I switch to German and go back to the same URL, the canonical becomes http://www.mercimamanboutique.com/de-de/.
The site is also reachable over both https and http; it may be better to redirect everything to https (although this is probably not related to the issue you're encountering).
Dirk
Update: I also noticed that the main rel="alternate" doesn't exist - http://www.mercimamanboutique.com/ is redirected to http://www.mercimamanboutique.com - it's better not to use URLs that redirect, but to point at the final destination URL.
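To illustrate the canonical problem described above (the canonical on "/" changing after the visitor switches language): this is the pattern you get when the canonical is computed from session state instead of from the requested URL. A minimal sketch, with hypothetical helper names - this is not the site's actual code:

```python
def canonical_from_session(path: str, session_lang: str) -> str:
    # Buggy pattern: the same URL emits a different canonical per visitor,
    # depending on which language they last selected.
    return f"http://www.mercimamanboutique.com/{session_lang}/"

def canonical_from_path(path: str, default_lang: str = "fr-fr") -> str:
    # Safer pattern: one URL, one canonical, regardless of who is asking.
    lang = path.strip("/").split("/")[0]
    if lang in ("fr-fr", "de-de"):
        return f"http://www.mercimamanboutique.com/{lang}/"
    return f"http://www.mercimamanboutique.com/{default_lang}/"

print(canonical_from_session("/", "de-de"))  # varies with the session
print(canonical_from_path("/"))              # always the same for "/"
```

Googlebot carries no session, so a session-derived canonical makes it see inconsistent signals from crawl to crawl.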
-
Hi Dirk,
By IP are you referring to our server IP address?
It is a strange issue, since the French version has all of the correct language sitelinks; only the German version displays mixed languages. I will make the modifications you suggested earlier and check again after a few days; hopefully that fixes the problem.
-
If you check the Q&A, you're not the only one to encounter this problem. In some of these cases it was caused by IP-detection systems that automatically selected the country website based on IP (which caused problems because Googlebot mostly crawls from a US IP address) - I assume this is not the case with your site?
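The IP-detection pitfall can be sketched as follows - hypothetical logic, not your site's actual code, just to show why geo-only selection hides locale versions from a US-based crawler:

```python
def locale_by_ip_only(ip_country: str) -> str:
    # Problematic: every US visitor -- Googlebot included -- is funnelled
    # to a single version, so the others are never crawled as intended.
    return {"FR": "fr-fr", "DE": "de-de"}.get(ip_country, "fr-fr")

def locale_by_request(path: str, ip_country: str) -> str:
    # Safer: honour the locale already present in the URL; only fall back
    # to geo-detection when the URL carries no locale.
    first = path.strip("/").split("/")[0]
    if first in ("fr-fr", "de-de"):
        return first
    return {"FR": "fr-fr", "DE": "de-de"}.get(ip_country, "fr-fr")

print(locale_by_ip_only("US"))             # the only version Googlebot sees
print(locale_by_request("/de-de/", "US"))  # de-de stays reachable from a US IP
```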
It's quite possible that it will solve the issue. The easiest way to check is to make the modifications and see after a few days whether the problem is solved. If not, there is probably another issue.
Dirk
-
Thank you Dirk. By fixing the hreflang tags, would this also fix the problem of the sitelinks appearing in the correct language? Currently, for the German version of the website, we are seeing both English and German sitelinks, and we are now unable to demote the English ones since Google removed the demote feature from Webmaster Tools. See the screenshot below.
http://i.imgur.com/OBz4qqW.jpg
Thank you again!
-
Sorry for the late reply - you also have to add the self-referencing version; check the example at https://support.google.com/webmasters/answer/189077?hl=en (each version of the page should carry the same block of hreflang URLs).
Apart from that, reciprocal means that if page A has an hreflang pointing to page B, then B should have an hreflang pointing back to A. I don't really see why the tool reports this error, as the pages do cross-reference each other - it could be caused by the missing self-referencing version.
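The reciprocity and self-reference rules above can be checked mechanically. A minimal sketch (hypothetical helper, with your two URLs as sample data - the hreflang values here are illustrative):

```python
def hreflang_errors(pages: dict[str, dict[str, str]]) -> list[str]:
    # `pages` maps each page URL to its declared hreflang -> URL entries.
    errors = []
    for url, alternates in pages.items():
        if url not in alternates.values():
            errors.append(f"{url}: missing self-referencing hreflang")
        for lang, alt in alternates.items():
            back = pages.get(alt, {})
            if url not in back.values():
                errors.append(f"{url} -> {alt}: no return hreflang")
    return errors

pages = {
    "http://www.mercimamanboutique.com/fr-fr/": {
        "de-DE": "http://www.mercimamanboutique.com/de-de/",
    },
    "http://www.mercimamanboutique.com/de-de/": {
        "fr-FR": "http://www.mercimamanboutique.com/fr-fr/",
    },
}
for e in hreflang_errors(pages):
    print(e)  # both pages cross-reference fine but lack the self-reference
```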
There is a nice tool to generate hreflang tags - http://www.aleydasolis.com/en/international-seo-tools/hreflang-tags-generator/ - you could try it and compare the generated version with what you actually have on your page.
Dirk
-
You have an issue with your hreflang tags, which seems to confuse Google. You can use these tools to check your implementation: https://flang.dejanseo.com.au/ or https://technicalseo.com/seo-tools/hreflang/ - both indicate issues. The main issue is that the self-referencing hreflang tag is missing (check https://support.google.com/webmasters/answer/189077?hl=en). You could also add an x-default URL - for the languages/countries that are not specified - pointing to one of the versions.
Personally, I would remove the strict country limitations on the de/fr versions - why send Austrian visitors to the .com version and not to /de-de? I would rather use hreflang="de" in this case, and the same for fr (unless you don't ship to those countries).
On each of your pages you should have the full set of tags. Example for your home page:
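A sketch of what that shared tag block could look like, generated from the URLs mentioned in this thread - the hreflang values and the x-default target are assumptions to verify against your own site:

```python
ALTERNATES = {
    "fr-FR": "http://www.mercimamanboutique.com/fr-fr/",
    "de-DE": "http://www.mercimamanboutique.com/de-de/",
    "x-default": "http://www.mercimamanboutique.com/",
}

def hreflang_block(alternates: dict[str, str]) -> str:
    # Every version of the page emits the identical block,
    # including its own (self-referencing) entry.
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_block(ALTERNATES))
```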
Hope this helps,
Dirk
-
Could you describe the differences you see? When I look at the cached pages for both of them, I don't see any big differences that would indicate an issue.