Duplicate content on websites for multiple countries
-
I have a client who has a website for their U.S.-based customers. They are currently adding a Canadian dealer and would like a second website with much of the same info as their current website, but with Canadian contact info, etc. What is the best way to do this without creating duplicate content that will get us penalized? If we create websites at ABCcompany.com and ABCcompany.ca, or something like that, will that get us around the duplicate content penalty?
-
"duplicate content is normally not a penalty."
This also depends on just how much there is. If you duplicate a page, that is identical information.
You can get around this problem, Jon, by using hreflang markup, which also works across domains. Just remember that it works on a per-URL basis, so you would need to add it to each of the URLs with duplicate content.
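As a rough sketch of what that looks like (using the hypothetical ABCcompany.com / ABCcompany.ca domains from the question), each page's <head> would carry tags along these lines:

<link rel="alternate" hreflang="en-us" href="http://www.abccompany.com/" />
<link rel="alternate" hreflang="en-ca" href="http://www.abccompany.ca/" />
<link rel="alternate" hreflang="x-default" href="http://www.abccompany.com/" />

Each URL lists every country/language version of itself, including a self-referencing tag, and the Canadian pages need the matching set pointing back; hreflang annotations that aren't reciprocal are ignored.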
Have a read of this article from Google on how to use the markup.
https://support.google.com/webmasters/answer/189077?hl=en
I hope that helps.
-Andy
-
The first thing to understand is that "duplicate content" normally does not incur a penalty. It's just that if two of your pages have identical information, only one of them will usually appear in search results. It's not a penalty per se; it's just Google's way of not serving redundant pages in search results. (Note: you can get penalized if you aggressively plagiarize other websites' content and put it on your own, but that is something different.)
In regard to your specific question:
1. Matt Cutts says in this video that what you describe is generally not a problem (because you're not a spammer trying to game the system).
2. I'd review the international SEO best practices described here by Google. Google says you shouldn't worry too much about it either, but I'd be sure to follow all of those guidelines -- setting geo-targeting for each domain in Webmaster Tools, for example -- to "tell" Google that you've got two different TLDs targeting two different countries.
So, having two sites with similar content at .com and .ca should be fine.
Good luck! I hope everything's clear.
Related Questions
-
What type of website is best for SEO?
I need a new website for my health insurance business. What type is best for SEO? Many thanks
Web Design | laurentjb
-
No meta description pulling through in SERPs with React website - requesting indexing and submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:

1. If Google didn't reindex anything when I submitted the sitemap, what might be wrong with my sitemap?
2. Is submitting each URL manually bad, and if so, why?
3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?

Any other suggestions?
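As a quick sanity check on the React issue (a hedged sketch; the title and description text here are hypothetical), view the raw HTML response for one of the pages (view-source, not the rendered DOM) and confirm the tag is present before any JavaScript runs:

<head>
  <title>Example Page Title</title>
  <meta name="description" content="A concise, page-specific summary that crawlers can read without executing JavaScript." />
</head>

If the tag only shows up after React renders in the browser, server-side rendering or prerendering is the usual way to make it visible to crawlers that don't execute JavaScript.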
Web Design | DigitalMarketingSEO
-
Website Redesign and Migration to Squarespace killed my Ranking
My old website was dated, ugly, impossible to update, and a mess between hard-coded pages and WP, but we were ranking #1 in organic search for our keywords. I just redesigned the website using Squarespace. I kept most of the same text on the pages (for keywords) and kept the same meta tags and title tags for each page as much as possible. Once I was satisfied that I had done as much on-page optimization as I could, I changed the IP address in our DNS records so that the domain would point to our new website on the Squarespace host. And our new website was live! ...Then I watched in dismay as our ranking fell into oblivion. I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice. Is this the case? And, if so, how do I fix it? Our website URL is www.kanataskinclinic.ca Thanks
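If missing redirects are the culprit, Squarespace's built-in URL mappings panel (Settings > Advanced > URL Mappings) can issue 301s from the old paths without server access. A minimal sketch, with hypothetical old and new paths:

/old-treatments.html -> /treatments 301
/about-us.php -> /about 301

Each line maps one old path to its new equivalent, so pages that earned links under the old structure pass their equity to the new URLs.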
Web Design | StillLearning
-
Should I be using shortcodes for my page content?
Hello, I have a question. Sorry if this has been answered before. Recently I decided to do a little facelift on my main website pages. I wanted to make my testimonials prettier, and I found a great testimonials plugin that works through shortcodes. I love how it looks, but I just realised that when I use images in shortcodes, they are not picked up by search engines 😞 only the text is. Image search visibility is pretty important to me, and I'm not sure if I should stick with my plain design and upload images manually with all the alt and title tags, or whether there is a way to adjust the shortcode so it exposes images to search engines. You can see an example here: https://a-fotografy.co.uk/maternity-photographer-edinburgh/ Let me know your thoughts guys. Regards, Armands
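For what it's worth, a hedged sketch of why this often happens (file names and alt text below are hypothetical): image search can index an image referenced in an <img> element with descriptive alt text, but images a plugin injects as CSS backgrounds are generally not indexed as images:

<!-- Indexable: a real image element with descriptive alt text -->
<img src="/images/maternity-session.jpg" alt="Maternity photographer Edinburgh outdoor session" />

<!-- Not indexable as an image: the same file injected as a CSS background -->
<div class="testimonial" style="background-image: url('/images/maternity-session.jpg');"></div>

If the plugin's shortcode outputs the second form, that would explain what you're seeing.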
Web Design | A_Fotografy
-
Fixing render-blocking JavaScript and CSS in above-the-fold content
We don't have a responsive design site yet, and our mobile site is built through Dudamobile. I know it's not the best, but I'm trying to do whatever we can until we get around to redesigning it. Is there anything I can do about the following PageSpeed Insights errors, or are they just a function of using Dudamobile?

Eliminate render-blocking JavaScript and CSS in above-the-fold content: Your page has 3 blocking script resources and 5 blocking CSS resources. This causes a delay in rendering your page. None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.

Remove render-blocking JavaScript:
http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
http://mobile.dudamobile.com/…ckage.min.js?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…pts/blogs.js?version=2015-04-02T13:36:04

Optimize CSS delivery of the following:
http://fonts.googleapis.com/…:400|Great+Vibes|Signika:400,300,600,700
http://mobile.dudamobile.com/…ont-pack.css?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…kage.min.css?version=2015-04-02T13:36:04
http://irp-cdn.multiscreensite.com/kempruge/files/kempruge_0.min.css?v=6
http://irp-cdn.multiscreensite.com/…mpruge/files/kempruge_home_0.min.css?v=6

Thanks for any tips, Ruben
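If Dudamobile exposes the template HTML at all (a big if), the usual pattern those errors point at looks something like this hedged sketch; the stylesheet path is a placeholder:

<!-- Defer non-critical JavaScript so it doesn't block first paint -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js" defer></script>

<!-- Inline the critical above-the-fold CSS directly in the head -->
<style>
  /* minimal styles needed to render the first screen */
  body { margin: 0; font-family: sans-serif; }
</style>

<!-- Load the full stylesheet without blocking render -->
<link rel="preload" href="/css/full-styles.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/css/full-styles.css"></noscript>

On a hosted builder you typically can't apply this yourself, which is why these warnings are often just a function of the platform.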
Web Design | KempRugeLawGroup
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: consider an ecommerce site that has multiple sites around the world that are all variations of the same thing, just in different languages. Now let's say some of these exist on a normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using:

<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />

would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations in as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | DRSearchEngOpt
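For reference, a hedged sketch of what one complete <url> entry might look like with the self-referencing alternate that Google recommends (same example URLs as in the question):

<url>
  <loc>http://www.example.co.uk/en_GB/</loc>
  <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
  <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
  <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
</url>

The <urlset> root element also needs the xhtml namespace declared (xmlns:xhtml="http://www.w3.org/1999/xhtml") for the xhtml:link elements to validate.
-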
Multiple Local Schemas Per Page
I am working on a mid-size restaurant group's site. The new site (in development) has a drop-down of each of the locations. When you hover over a location in the drop-down, it shows that business's info (NAP). Each of the locations in the nav list uses schema.org markup. I think this would be confusing for search robots: every page has 15 address schemas, and the individual restaurant page's own NAP sits below all of the locations' schema/NAP in the DOM. Have any of you dealt with multiple schemas per page or a similar structure?
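One common approach (a sketch, not a definitive fix): mark up only the page's own location and leave the nav drop-down as plain HTML, for example with a single JSON-LD block per location page. All values below are hypothetical:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro - Downtown",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>

That keeps one unambiguous entity per page instead of 15 competing address schemas in the navigation.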
Web Design | JoshAM
-
Site-wide footer links or single "website credits" page?
I see that you have already answered this question before, back in 2007 (http://www.seomoz.org/qa/view/2163), but I wanted to ask your current opinion on the same question: should I add a site-wide footer link on my client websites pointing to my website, or should I create a "website credits" page on my client's site, add this to the footer, and then link from within that page out to my website?
Web Design | eseyo