Duplicate content pages on different domains, best practice?
-
Hi,
We run directory sites on different domains for different countries (the country name is part of each domain name), and the same static pages appear on each one. We actually have several such pages, but I'll use one static page as an example for the sake of simplicity.
So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html, and so on across 6-7 sites. The faq.html pages from any two countries show 94% similarity when checked for duplicate content. We'd like an alternative to the canonical tag, because the content doesn't belong to just one of these sites; it belongs to all of them. A second option would be to deindex all but one country. It's syndicated content, but we can't link back to the source because there is none.
Thanks for taking the time to read this.
-
Using a canonical tag is NOT the solution, because the FAQ pages on every site that canonicalizes to another domain will be dropped from the index.
So only do that if you really don't care about the traffic those answers can generate for your sites (as you can imagine, this is an ironic suggestion...).
Just use hreflang. In recent months Google has become quite smart at understanding that hreflang means you consider those pages relevant enough for their geo-targeted audiences, so it won't filter them out even if they are substantially identical between country versions.
That said, try to differentiate the FAQ pages: localize the language more thoroughly (UK English, for instance, is slightly different from American English), or even offer a local number for phone inquiries and a localized email address for questions via email.
In general, it is not a good idea to use a cross-domain canonical in international SEO; it should be reserved for exceptional cases.
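For reference, a minimal sketch of what the reciprocal tags would look like in the <head> of the FAQ pages (the language codes here are placeholders, since the actual markets weren't specified):

<!-- identical block on http://firstcountry.com/faq.html and http://secondcountry.com/faq.html -->
<link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" />
<link rel="alternate" hreflang="en-gb" href="http://secondcountry.com/faq.html" />

The same full set, including the self-referencing tag, must appear on every country version; annotations that aren't reciprocal get ignored.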
-
To make things easier, you can implement hreflang via your sitemap.xml using this tool by Mediaflow: http://www.themediaflow.com/tool_hreflang.php.
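For a sense of the output, here is a minimal sketch of a sitemap entry carrying hreflang annotations (the domains and language codes are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://firstcountry.com/faq.html</loc>
    <!-- every alternate is listed, including this entry's own URL -->
    <xhtml:link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://secondcountry.com/faq.html" />
  </url>
  <url>
    <loc>http://secondcountry.com/faq.html</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://secondcountry.com/faq.html" />
  </url>
</urlset>

Each country's URL gets its own <url> entry, and each entry repeats the full set of alternates.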
-
If your site is template-based so you can easily put code in the header (WordPress, Joomla, most CMSs; anything but page-by-page static HTML, usually), you can insert the tags directly by referencing the current page's path, like this (the domains below are placeholders; this example assumes a PHP template):
" hreflang="x-default" />
" hreflang="en-au" />
" hreflang="en-us" />
" hreflang="en-nz" />This works on Apache servers - this starts with the domain and then request_URI pulls in the page you're on so /about, or /faq and adds the appropriate hreflang tag to that.
Also, when you're done implementing hreflang, test it using Flang.
-
As the other users have pointed out, the rel="alternate" hreflang tag would be the most suitable option. I am in a pickle myself with a very similar issue.
Note that the alternate tag is applied at the page level, so every page should point to the corresponding URL of its copy on each of the other country domains.
So your homepage (.com) could have alternate tags along these lines (the domains and language codes are placeholders):
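<link rel="alternate" hreflang="en-us" href="http://example.com/" />
<link rel="alternate" hreflang="en-au" href="http://example.com.au/" />
<link rel="alternate" hreflang="en-nz" href="http://example.co.nz/" />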
But on your FAQ page, the alternates would be:
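<link rel="alternate" hreflang="en-us" href="http://example.com/faq.html" />
<link rel="alternate" hreflang="en-au" href="http://example.com.au/faq.html" />
<link rel="alternate" hreflang="en-nz" href="http://example.co.nz/faq.html" />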
You'll have to rinse and repeat on all 3 sites and for every single page.
Tedious if you ask me! Does anyone know an easier way to add alternate tags across 3 or 4 sites without doing it manually?
The advantage of implementing those, however, is that you are not canonicalising to one domain, which means all your domains stand a chance of performing well in their regions (e.g. a search on Google Australia will show the .com.au website).
Again, does anyone have a better approach to this, or seen or heard of one? Apart from canonical, of course.
-
Hreflang tags are great, and I would highly suggest implementing them. Something that confused me when I first started using them is that the full set of tags should be on every domain, including a self-referencing tag for the page's own domain.
For example, firstcountry.com/faq.html should have tags for (language codes below are placeholders):
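<link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" /> <!-- the self-referencing tag -->
<link rel="alternate" hreflang="en-gb" href="http://secondcountry.com/faq.html" />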
and so on.
You can check that these have been implemented correctly in Google Webmaster Tools under "Search Traffic" -> "International Targeting".
-
I would start by implementing hreflang tags:
https://support.google.com/webmasters/answer/189077?hl=en
Hreflang should take care of this type of issue, as Google will associate the right country domain with the content. You may see some overlap for a while; we've seen hreflang take a bit longer than we'd like to get fully established, but once it is, it usually works well.
Short of that, you have three options: 1) change the content on all sites to be (somewhat) unique; 2) deindex all but one, as you said; or 3) use a canonical, as you said.
Options 1, 2 and 3 all have problems, which is why I would start with hreflang.