Google cache of my website shows another website
-
Hello,
Some time ago I asked a question here because my homepage had disappeared from Google for our main keyword. One of the problems that showed up was the Google cache.
If you look at the cache of www.conseilfleursdebach.fr, you will see that it shows the content of www.lesfleursdebach.be. Both websites are ours, but one focuses on France and the other on Belgium.
There used to be flags on the page to switch to the other country's site, but in the meantime I have removed all links from the .fr to the .be and vice versa. This has been going on since January.
Does anyone have an idea of what could cause this and, above all, what to do?
Kind regards,
Tine
-
The problem is finally resolved. We rewrote some of the main content on the homepage and changed the menu, so the two websites look different now. We had to wait another few weeks for Google to update, but as of today everything is back to normal. Thanks for the help.
-
Thanks for your answer. I'm glad your problem is resolved.
I'm not convinced that the duplicate website is the real problem, because a lot of websites work like this, with a domain for each country. That's why you can also set a target country in the webmaster console, no?
We have the same setup for Belgium and Holland, and the same for Germany and Austria.
How can I be sure this is the problem? (A rough way to check is sketched at the end of this post.)
It is of course possible to combine the websites, but we set them up like this because we felt people prefer to order from a domain of their own country...
We will consider it anyway...
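For the rough check mentioned above, a small Python sketch that fetches both homepages and compares their visible text could help. This is only a sketch: the 80% threshold is an arbitrary illustration, not anything Google publishes.

```python
# Rough sketch: fetch both homepages and compare their visible text.
# The 0.8 threshold is an arbitrary illustration, not a Google metric.
import difflib
import re
import urllib.request

def visible_text(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts/styles
    text = re.sub(r"<[^>]+>", " ", html)                      # drop remaining tags
    return re.sub(r"\s+", " ", text).strip()

fr = visible_text("https://www.conseilfleursdebach.fr/")
be = visible_text("https://www.lesfleursdebach.be/")

ratio = difflib.SequenceMatcher(None, fr, be).ratio()
print(f"Visible-text similarity: {ratio:.0%}")
if ratio > 0.8:
    print("The two homepages are near-duplicates as plain text.")
```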
-
I posted the first reply about how similar my problem was to yours. Mine was resolved over a month ago. I'm very sure this is a duplicate content issue. It doesn't matter if they are on different servers, in different markets, etc. When you "look" at the sites, they "look" the same. Google was taking a default Plesk server page for my domain and showing other sites (three in total) that also had the same default page. Different servers, different owners and different industries.
I think if you continue to ignore this advice you will have this issue forever. You either need to make the sites different (or at least the homepages), or combine them into a single site and map your existing link juice from one to the other with 301 redirects. You would then serve the different data based on geolocation (a rough sketch follows below).
Depending on your website, servers, support team, etc., this could be a very difficult job.
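As a very rough sketch of what "combine into one site and serve by geolocation" could look like, assuming Flask and the geoip2 package with a local GeoLite2 country database (the domain handling and path mapping are hypothetical illustrations, not your actual setup):

```python
# Minimal sketch of a combined site: 301-redirect the old .be domain
# into the .fr site and pick country content by GeoIP. Assumptions:
# Flask and geoip2 are installed and a GeoLite2-Country.mmdb file
# exists locally; the path mapping below is hypothetical.
import geoip2.database
from geoip2.errors import AddressNotFoundError
from flask import Flask, redirect, request

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

# Hypothetical mapping of old .be paths to their .fr equivalents,
# so existing link juice is passed on via permanent redirects.
OLD_BE_PATHS = {"/producten": "/produits", "/contact": "/contact"}

@app.before_request
def fold_old_domain():
    # Requests still hitting the old Belgian domain get a 301.
    if request.host == "www.lesfleursdebach.be":
        target = OLD_BE_PATHS.get(request.path, "/")
        return redirect("https://www.conseilfleursdebach.fr" + target, code=301)

@app.route("/")
def home():
    # Choose country-specific content by visitor IP; default to France.
    # (Behind a proxy you would read the forwarded client IP instead.)
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except AddressNotFoundError:
        country = "FR"
    return "Belgian storefront" if country == "BE" else "French storefront"
```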
-
Hi,
I wanted to share an update on this topic.
Last week we moved both websites, www.conseilfleursdebach.fr and www.lesfleursdebach.be, to HTTPS. Shortly after that, the homepage of www.conseilfleursdebach.fr was back in Google. I thought the problem was solved: homepage back, correct cache, higher positions.
Unfortunately, as of today the homepage is gone again and the cache for www.conseilfleursdebach.fr is showing www.lesfleursdebach.be again.
I also saw in Google Search Console that a lot of links go between the two websites. The strange thing is that I can't find those links... (see screenshot)
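To hunt for those links concretely, a rough Python sketch like the one below could scan a few pages of the .fr site and report any anchor pointing at the .be domain. The list of paths is a hypothetical sample, not a full crawl:

```python
# Rough sketch: scan a few pages of the .fr site and report any links
# that point at the .be domain. PATHS is a hypothetical sample; a real
# check would crawl the whole site (and then the reverse direction).
import re
import urllib.request

SITE = "https://www.conseilfleursdebach.fr"
OTHER_DOMAIN = "lesfleursdebach.be"
PATHS = ["/", "/contact"]  # hypothetical sample of pages to inspect

for path in PATHS:
    html = urllib.request.urlopen(SITE + path, timeout=10).read().decode("utf-8", errors="ignore")
    for href in re.findall(r'href=["\']([^"\']+)["\']', html):
        if OTHER_DOMAIN in href:
            print(f"{path} -> {href}")
```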
This problem has been going on for months now. Does nobody have a solution?
Kind regards,
Tine
-
Hi Remko,
Thanks for your answer. Yes, the geotargeting is set correctly.
Changing the IP address (and with it, the server?) might be a bit trickier to test, but maybe it's the only way out?
Kind regards, Tine
-
Hello,
Thanks for your answer. I went through your topic, but I had already checked most of those things and they gave no solution.
It would be strange for Google to suddenly see it as duplicate content, as the websites have been running like this for years and the targeting is set to different countries.
I hope you get answers soon. I will follow your topic as well.
Good luck.
-
Hi Tine,
Have you already updated the geotargeting in GWT? You could also try a different IP and update your content and site structure. Maybe you could also use one domain to target both France and Belgium, instead of separate domains.
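A quick first check before going that far: see whether the two domains currently resolve to the same IP. A tiny sketch using only the Python standard library:

```python
# Tiny sketch: check whether the two domains resolve to the same IP.
import socket

fr_ip = socket.gethostbyname("www.conseilfleursdebach.fr")
be_ip = socket.gethostbyname("www.lesfleursdebach.be")
print(f".fr -> {fr_ip}, .be -> {be_ip}")
print("Same IP." if fr_ip == be_ip else "Different IPs.")
```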
-
Hello,
I just noticed your question; it's very similar to mine:
My best guess at the moment is duplicate content. I think Google has updated how it detects duplicate material and is now picking one of the duplicates as the main source. Your two sites are "similar". Take a look at my question, as it lists the steps I have checked so far. In my situation Google is picking a site that I don't own, on a very different IP address. I'm at the stage now where I'm trying to track down a real-life Google engineer to get the question directly in front of them.
David