Duplicate content pages on different domains, best practice?
-
Hi,
We run directory sites on separate domains for different countries (each domain includes the country name), and each site carries the same static pages. There are several of these pages, but I'll use one as an example for the sake of simplicity.
So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html, and so on across 6-7 sites. The faq.html pages show about 94% similarity when checked for duplicate content. We'd like an alternative to the canonical tag, because the content doesn't belong to just one of these sites; it belongs to all of them. A second option would be to deindex all but one country. It's syndicated content, but we can't link back to the source because there is none.
Thanks for taking the time in reading this.
-
Using canonical IS NOT the solution, because if you use it, the FAQ pages on the canonicalized websites will be deindexed.
So only do it if you really don't care about the traffic those answers can generate for your sites (as you can imagine, this is an ironic suggestion...).
Just use hreflang. In recent months Google has become quite smart at understanding that it means you consider those pages relevant for their geo-targeted audiences, so it won't filter them out even when they are substantially identical across country versions.
That said, try to differentiate the FAQ pages: localize the language better (e.g., UK English differs slightly from American English), or offer a local phone number and a localized email address for inquiries.
In general, using the cross-domain canonical is not a good idea in international SEO, and it should be reserved for exceptional cases.
-
To make things easier, you can implement hreflang via your sitemap.xml using this tool by Mediaflow: http://www.themediaflow.com/tool_hreflang.php.
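If you'd rather generate the sitemap entries yourself, here is a minimal sketch of the sitemaps.org format with `xhtml:link` alternates (this is not the Mediaflow tool's output; the hreflang-to-domain map reuses the asker's placeholder domains and is purely illustrative):

```python
# Build hreflang sitemap entries for one page across country domains.
# The domain/hreflang mapping below is illustrative, not real data.
from xml.etree import ElementTree as ET

ET.register_namespace("", "http://www.sitemaps.org/schemas/sitemap/0.9")
ET.register_namespace("xhtml", "http://www.w3.org/1999/xhtml")

SM = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
XH = "{http://www.w3.org/1999/xhtml}"

DOMAINS = {
    "en-au": "http://firstcountry.com",
    "en-us": "http://secondcountry.com",
    "en-nz": "http://thirdcountry.com",
}

def hreflang_sitemap(path):
    urlset = ET.Element(SM + "urlset")
    for lang, domain in DOMAINS.items():
        url = ET.SubElement(urlset, SM + "url")
        ET.SubElement(url, SM + "loc").text = domain + path
        # Every <url> entry lists ALL alternates, including itself.
        for alt_lang, alt_domain in DOMAINS.items():
            link = ET.SubElement(url, XH + "link")
            link.set("rel", "alternate")
            link.set("hreflang", alt_lang)
            link.set("href", alt_domain + path)
    return ET.tostring(urlset, encoding="unicode")

print(hreflang_sitemap("/faq.html"))
```

One sitemap like this per domain (each listing all the alternates) is enough; you don't need on-page tags as well if the sitemap covers every page.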
-
If your site is template-based so you can easily drop code into the header (WordPress, Joomla, most CMSs; anything except page-by-page HTML), you can insert the tags directly, building the URL in the page itself like this (domains here are placeholders):
<link rel="alternate" href="http://www.example.com<?php echo $_SERVER['REQUEST_URI']; ?>" hreflang="x-default" />
<link rel="alternate" href="http://www.example.com.au<?php echo $_SERVER['REQUEST_URI']; ?>" hreflang="en-au" />
<link rel="alternate" href="http://www.example.com<?php echo $_SERVER['REQUEST_URI']; ?>" hreflang="en-us" />
<link rel="alternate" href="http://www.example.co.nz<?php echo $_SERVER['REQUEST_URI']; ?>" hreflang="en-nz" />
This works on Apache servers: each tag starts with the domain, and REQUEST_URI pulls in the path of the page you're on (/about, /faq, etc.), so the appropriate hreflang tag is added to every page automatically.
Also, when you're done implementing hreflang, test it using Flang.
-
As the other users have pointed out, the alternate hreflang tag is the ideal approach here. I'm in a pickle myself with a very similar issue.
Note that the alternate tag is applied at the page level, so every page should point to the corresponding URL of its copy on each of the other country domains.
So your homepage (.com) could have alternate tags along these lines (illustrative, using the placeholder domains from the question):
<link rel="alternate" hreflang="en-us" href="http://firstcountry.com/" />
<link rel="alternate" hreflang="en-au" href="http://secondcountry.com/" />
But on your FAQ page, the alternates would be:
<link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" />
<link rel="alternate" hreflang="en-au" href="http://secondcountry.com/faq.html" />
You'll have to rinse and repeat on all 3 sites and for every single page.
Tedious if you ask me! Does anyone know an easier way to go around adding alternate tags to 3 or 4 sites without doing it manually?
The advantage of implementing those, however, is that you are not canonicalising to one domain, which means all your domains stand a chance of performing well in their regions (e.g. a search on Google Australia will show the .com.au website).
Again, does anyone have a better approach to this or seen / heard of one? Apart from canonical of course.
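One way to avoid doing it manually is to generate the per-page tag block from a single domain map in each site's template. A minimal sketch (the hreflang/domain map is illustrative, not anyone's real configuration):

```python
# Generate the <link rel="alternate"> block for a given page path.
# Because every country version of a page carries the same block,
# one function covers all sites. Domains below are placeholders.
DOMAINS = {
    "en-us": "http://www.example.com",
    "en-au": "http://www.example.com.au",
    "en-nz": "http://www.example.co.nz",
}

def alternate_tags(path):
    lines = []
    for lang, domain in sorted(DOMAINS.items()):
        lines.append(
            '<link rel="alternate" hreflang="%s" href="%s%s" />'
            % (lang, domain, path)
        )
    return "\n".join(lines)

print(alternate_tags("/faq.html"))
```

Hook that into the template's head section with the current request path, and every page on every domain gets a complete, consistent set of alternates without hand-editing.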
-
Hreflang tags are great; I would highly suggest implementing them. One thing that confused me when I first started using them is that the full set of tags should appear on every domain, including a tag pointing to the page itself.
For example, firstcountry.com/faq.html should have tags for:
http://firstcountry.com/faq.html
http://secondcountry.com/faq.html
and so on for each country domain.
You can check that these have been implemented correctly in Google Webmaster Tools under "Search Traffic" -> "International Targeting"
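The usual failure mode that report surfaces is missing return links: if page A lists B as an alternate but B doesn't list A back, Google ignores the annotation. A small sketch of a reciprocity checker over hypothetical crawl output (the page-to-alternates maps below are made up for illustration):

```python
# Check that hreflang annotations are reciprocal: if page A lists B as
# an alternate, B must list A back, or the annotation is ignored.
def find_missing_return_links(annotations):
    """annotations: {page_url: {hreflang: alternate_url}}"""
    missing = []
    for page, alts in annotations.items():
        for lang, target in alts.items():
            if target == page:
                continue  # self-referencing tag, nothing to confirm
            if page not in annotations.get(target, {}).values():
                missing.append((page, target))
    return missing

# Hypothetical crawl data using the asker's placeholder domains.
crawl = {
    "http://firstcountry.com/faq.html": {
        "en-au": "http://firstcountry.com/faq.html",
        "en-us": "http://secondcountry.com/faq.html",
    },
    "http://secondcountry.com/faq.html": {
        "en-us": "http://secondcountry.com/faq.html",
        # Missing the en-au return link -> flagged below.
    },
}

print(find_missing_return_links(crawl))
# -> [('http://firstcountry.com/faq.html', 'http://secondcountry.com/faq.html')]
```

Running something like this over a crawl of all your country sites catches broken pairs before Google's report does.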
-
I would start by implementing hreflang tags:
https://support.google.com/webmasters/answer/189077?hl=en
Hreflang should take care of this type of issue, as Google will associate the right country domain with the content. You may see some overlap for a while; in our experience hreflang takes a bit longer than we'd like to fully take effect, but once it does, it usually works well.
Short of that, you have three options: 1) change the content on all sites to be (somewhat) unique; 2) deindex all but one, as you said; 3) use a canonical, as you said.
All three have problems, which is why I would start with hreflang.