Google cache for my website shows another website
-
Hello,
Some time ago, I asked a question here because my homepage had disappeared from Google for our main keyword. One of the problems that showed up was the Google cache.
If you look at the cache of the website www.conseilfleursdebach.fr, you'll see that it shows the content of www.lesfleursdebach.be. Both websites are ours, but one focuses on France and the other on Belgium.
Previously there were flags on the page to switch to the other country, but I have since removed all links from the .fr site to the .be site and vice versa. This has been the case since January.
Does anyone have an idea of what could cause this and, most of all, what to do?
Kind regards,
Tine
-
The problem is finally resolved. We rewrote some of the main content on the homepage and changed the menu, so the two websites look different now. We had to wait a few more weeks for Google to update, but as of today everything is back to normal. Thanks for the help.
-
Thanks for your answer, and glad that your problem is resolved.
I'm not convinced that the duplicated website is the real problem, because a lot of websites work like this, with a domain for each country. That's why you can also set a target country in the webmaster console, no?
We have the same setup for Belgium and the Netherlands, and the same for Germany and Austria.
How can we be sure this is the problem?
It's of course possible to combine the websites, but we set it up like this because we felt people prefer to order on a domain from their own country...
We will consider it anyway...
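For reference, the standard way to signal to Google that two country-specific pages are deliberate alternates rather than duplicates is rel="alternate" hreflang markup. A minimal sketch for the two homepages in this thread (assuming both are in French, one targeting France and one Belgium — the exact URLs and language codes would need to match your setup):

```html
<!-- The same block goes in the <head> of BOTH homepages: -->
<link rel="alternate" hreflang="fr-fr" href="http://www.conseilfleursdebach.fr/" />
<link rel="alternate" hreflang="fr-be" href="http://www.lesfleursdebach.be/" />
```

Each page must reference itself as well as its alternate, and the annotations must be reciprocal across both sites for Google to honor them.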
-
I posted the first reply about how similar my problem was to yours. Mine resolved over a month ago. I'm very sure this is a duplicate content issue. It doesn't matter if they are on different servers, in different markets, etc. When you "look" at the sites, they "look" the same. Google was taking a default PLESK server page for my domain and showing other sites (3 in total) that also had the same default page. Different servers, different owners, and different industries.
I think if you continue to ignore this advice you will have this issue forever. You either need to make the sites different (or at least the home pages), or combine them into a single site and map your existing link juice from one to the other with 301 redirects. You then need to serve the different content based on geolocation.
Depending on your website, servers, support team, etc., this could be a very difficult job.
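If you did go the consolidation route, the redirect mapping could look something like this .htaccess sketch (a hypothetical example assuming an Apache server with mod_rewrite, using the domain names from this thread):

```apache
# Hypothetical sketch: permanently redirect every URL on the .be domain
# to the same path on the .fr domain, preserving link equity via 301s.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?lesfleursdebach\.be$ [NC]
RewriteRule ^(.*)$ http://www.conseilfleursdebach.fr/$1 [R=301,L]
```

Path-for-path redirects like this (rather than sending everything to the homepage) are what preserve the existing rankings of individual pages.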
-
Hi,
I wanted to share an update on this topic.
Last week, we switched both websites, www.conseilfleursdebach.fr and www.lesfleursdebach.be, to HTTPS. Shortly after that, the homepage of www.conseilfleursdebach.fr was back in Google. I thought the problem was solved: homepage back, correct cache, higher position.
Unfortunately, as of today, the homepage is gone again and the cache of www.lesfleursdebach.be is showing again for www.conseilfleursdebach.fr.
I also saw in Google Search Console that a lot of links point between the two websites. The strange thing is, I can't find those links... (see screenshot)
This problem has been going on for months now. Does nobody have a solution?
Kind regards,
Tine
-
Hi Remko,
Thanks for your answer. Yes, the geotargeting is set correctly.
Changing the IP address (and thus moving to a different server?) might be a bit trickier to test. But maybe it's the only way out??
Kind regards, Tine
-
Hello,
Thanks for your answer. I went through your topic, but I had already checked most of those things and they brought no solution.
It would be strange for Google to suddenly see it as duplicate content, as the websites have run like this for years and each is set to target a different country.
I hope you get answers soon. I will follow your topic as well.
Good luck.
-
Hi Tine,
Did you already update the geotargeting in GWT? You could also try a different IP address and update your content and site structure. Another option is to use one domain targeting both France and Belgium, instead of separate domains.
-
Hello,
I just noticed your question; it's very similar to mine:
My best guess at the moment is duplicate content. I think Google has updated how it detects duplicate material and is now picking one of the pages as the main source. Your two sites are "similar". Take a look at my question, as it lists the steps I have checked so far. In my situation it's picking a site that I don't own, with very different IP addresses. I'm now at the stage where I'm trying to track down a real-life Google engineer to put the question directly in front of them.
David