Duplicate content on sites from different countries
-
Hi, we have a client whose UK and US websites currently share a lot of duplicate content.
Both websites are geotargeted to their specific location (via Google Webmaster Tools) and have the appropriate local domain extension.
Is duplicate content a major issue here, given that the two sites serve different countries and geographic regions of the world?
Any statement from Google about this?
Regards,
Bill
-
You could use hreflang in this instance. It is suitable for content on separate domains, as this FAQ link attests. I would steer clear of using rel=canonical alongside hreflang here. Check out this previous thread on Moz where the use of hreflang and canonical was discussed by me and others.
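To make that concrete, here is a minimal sketch of reciprocal hreflang annotations, using hypothetical example.co.uk and example.com domains. The same set of link elements goes in the <head> of both versions of the page:

    <!-- Hypothetical URLs; the identical block appears on the UK and the US page -->
    <link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/widgets/" />
    <link rel="alternate" hreflang="en-US" href="https://www.example.com/widgets/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />

The x-default line is optional; it tells Google which version to show searchers who match neither region.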
-
The issue with using rel=canonical in this situation is that Google treats that directive much like a 301 redirect. If you canonical an entire site to another, you will end up devaluing one of the sites.
-
Hi Bill,
Google says this shouldn't be an issue "as long as the content is for different users in different countries." But if you're not careful about it, your rankings can suffer if search engine crawlers treat the content on both sites as duplicates, since most of them are looking for original content.
To be on the safe side and help search engines understand the relationship, you can include a rel="canonical" link in pages that have the same or similar content. Once detected, Google will only index one of the versions in its search results. Here's a guide from Google that will help you: https://support.google.com/webmasters/answer/139394
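As an illustration (hypothetical URL), the canonical link element is a single line in the <head> of each duplicate page, pointing at the version you want indexed:

    <!-- Placed on the duplicate page; href points at the preferred version -->
    <link rel="canonical" href="https://www.example.com/widgets/" />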
Hope that helps!
-
Hi Bill, here you go...
http://googlewebmastercentral.blogspot.in/2010/03/working-with-multi-regional-websites.html
and here: https://support.google.com/webmasters/answer/182192?hl=en
Hope those help.
Best,
Devanur Rafi.
Related Questions
-
Possible duplicate content issue
Hi, here is a rather detailed overview of our problem; any feedback or suggestions are most welcome.

We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in search engines for the past few years.

Here is where we have the problem: our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush, and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console.

Here are a few of the assumptions we've made and things we've checked:

- Crawl or technical issues: nothing serious found
- Bad backlinks: no new spammy backlinks
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to target the US (we suspect this to be the issue, but more below)
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not impact things such as title, H1, URL or body content; we replaced some call-to-action blocks from a custom one to one that was built into the framework (div)
- Manual or algorithmic penalties: nothing reported by Search Console
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place

Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this). During this process, an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the duplicate content page on the UK site, thereby dropping it off the index.

What further supports this theory is that a search on Google.co.uk returns the US version of the page. With country targeting on (i.e. only return pages from the UK), the UK version of the page is not returned. Also, a site: query from Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe that it's a duplicate content filter issue due to incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page.

What we have done to counter this is as follows:

- Turned on geotargeting for the US site
- Ensured that the title of the UK page says UK and not US
- Edited both pages to trigger a new last-modified date and so the two pages share fewer similarities
- Recreated a sitemap and resubmitted it to Google
- Re-crawled and requested a reindex of the whole site
- Fixed a few of the smaller issues

If our theory is right and our actions do help, I believe it's now a waiting game while Google re-crawls and reindexes. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of the senior management are very nervous given the impact of losing this page, and are keen to get a second opinion on the matter.

Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
Intermediate & Advanced SEO | Clickmetrics
-
installed PageSpeed Module on our server but no difference to site
Hi
I have been searching for an answer for a while now and couldn't find it, so maybe someone has had a similar problem. We have installed the PageSpeed Module on our server. The administrator has said it is active and has run the test below:

[root@mydomain ~]# curl -D- https://www.mydomain.com/ | head -10
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
14 102k 14 15029 0 0 40506 0 0:00:02 --:--:-- 0:00:02 64780
HTTP/1.1 200 OK
Server: nginx/1.6.0
Date: Fri, 10 Apr 2015 11:28:43 GMT
Content-Type: text/html
Content-Length: 104885
Connection: keep-alive
Set-Cookie: ci_session=BGANYlg8VmsPLgN1AWABMldkAGUGLVZwVmhQdQd0CGIEaFI6VgkEOQdmUSYHbQZyXz9TZVE4Vm4CIwxnB2hYbAZrAGUHZQg%2BUjUFOgRlUWAEYg05WDxWMg82A2ABOQEzV2IAaQZsVjBWPFA2BzEIaAQ%2FUjBWNwRmBztRJgdtBnJfP1NnUTpWbgIjDDoHflhSBjwAMgdjCHlSNAVwBHdRJwQ6DStYM1ZgD2YDPAF4ATJXZABmBiFWMVY%2FUD4HKQg5BDRSelZnBGAHIFE%2FByUGO180U2ZRMFZ2AnQMIAdrWH8GAgA3B2AIblI%2FBXcEJlE%2BBHINYlg4VmAPZwM8AXgBYFchAC0GY1YsVjpQKAc2CDIEKVJjVnYEeAd6UTwHYAZeXzNTYlEnViYCZAw3B2ZYbAYpAHsHawhiUj8FdgR8USgEZg02WHxWeA91A2oBMwFhVzcAKgZ9Vm9WIlAxBykIOgQ%2BUnpWYQRwB0xRVwcFBi5fNlN4UTtWYgIvDGEHIFg%2BBn0AFAdmCHhSOAVgBCRRQARCDRtYKVYrDzkDbwE4ASxXZQBxBj1WLVY%2BUCYHawhiBGVSPVYyBD4HLVE1B3gGMF89U3ZRZlY9AmMMIAd9WGUGbwB5BzYIJVJlBS0ENlEnBDoNK1gzVmAPZgM8AXgBb1c1ACwGe1ZcVmxQZQdzCGIEcVI9ViIEKQcgUT8HPwY7XzRTYlE4VmwCNwxlBztYPgZvAGUHPAh4UmsFOgQ%2BUScEdA0rWGxWIw8KA2IBOwF3VzUAfQY0VnBWN1A2Bz0IKQQlUm9WKw%3D%3D; expires=Fri, 10-Apr-2015 13:28:43 GMT; path=/
Set-Cookie: ci_session=a%3A0%3A%7B%7D; expires=Thu, 10-Apr-2014 21:28:43 GMT; path=/
Set-Cookie: ci_session=BWEFalk4UWwJKFIq; expires=Fri, 10-Apr-2015 13:28:43 GMT; path=/
X-Mod-Pagespeed: 1.9.32.3-4448

But there doesn't seem to be any difference to the site's speed, or any change in the Google speed test recommendations. I do not have much knowledge of servers, but the server company has assured me it is active and all the filters are on, so I am not sure why I am not seeing anything different. If anyone has any advice on this it would be great. Thanks, E
Intermediate & Advanced SEO | Direct_Ram
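One way to sanity-check whether the module is actually rewriting pages: mod_pagespeed honors a ModPagespeed=off query parameter, so comparing the downloaded page size with and without it should show a difference (a sketch, substituting your real domain for mydomain.com):

    # Sizes should differ if mod_pagespeed is rewriting the page
    curl -so /dev/null -w '%{size_download}\n' 'https://www.mydomain.com/?ModPagespeed=off'
    curl -so /dev/null -w '%{size_download}\n' 'https://www.mydomain.com/'
-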
How to Set Up Canonical Tags to Eliminate Duplicate Content Error
Google Webmaster Tools, under HTML Improvements, is showing duplicate meta descriptions for 2 similar pages. The 2 pages are for a building address. The URL has several pages because there are multiple property listings for this building. The URLs in question are: www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan/page/3 www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan How do I correct this error using canonical tags? Do I enter the URL of the 1st page under "Canonical URL" under "Advanced" to show Google that these pages are one and the same? If so, do I enter the entire URL into this field (www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan) or an abbreviated version (/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan)? Please see attached images. Thanks!! Alan
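For reference, Google recommends the full absolute URL in canonical tags; on the /page/3 URL the tag would be a single link element in the <head>, sketched here with the URL from the question:

    <link rel="canonical" href="http://www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan" />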
Intermediate & Advanced SEO | Kingalan1
-
Duplicate Content in News Section
Our client's site is in the hunting niche. According to Webmaster Tools there are over 32,000 indexed pages. In the news section there are 300-400 news posts where, over the course of about 5 years, they manually copied relevant press releases from different state natural resources websites (e.g. http://gfp.sd.gov/news/default.aspx). This content is relevant to the site's visitors but it is not unique. We have since begun posting unique news posts, but I am wondering if anything should be done with these old news posts that aren't unique? Should I use the rel="canonical" tag or a noindex tag for each of these pages? Or do you have another suggestion?
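If noindex is the route taken, a minimal sketch of the tag that would go in the <head> of each copied press-release page:

    <!-- Keeps the page available to visitors but out of the search index -->
    <meta name="robots" content="noindex" />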
Intermediate & Advanced SEO | rise1
-
Duplicate content that looks unique
OK, bit of an odd one. The SEOmoz crawler has flagged the following pages up as duplicate content. Does anyone have any idea what's going on? http://www.gear-zone.co.uk/blog/november-2011/gear$9zone-guide-to-winter-insulation http://www.gear-zone.co.uk/blog/september-2011/win-a-the-north-face-nuptse-2-jacket-with-gear-zone http://www.gear-zone.co.uk/blog/july-2011/telephone-issues-$9-2nd-july-2011 http://www.gear-zone.co.uk/blog/september-2011/gear$9zone-guide-to-nordic-walking-poles http://www.gear-zone.co.uk/blog/september-2011/win-a-the-north-face-nuptse-2-jacket-with-gear-zone https://www.google.com/webmasters/tools/googlebot-fetch?hl=en&siteUrl=http://www.gear-zone.co.uk/
Intermediate & Advanced SEO | neooptic
-
How do I fix the error duplicate page content and duplicate page title?
On my site www.millsheating.co.uk I have the error message as per the question title. The conflict is coming from these two pages, which are effectively the same page: www.millsheating.co.uk www.millsheating.co.uk/index I have added an htaccess file to the root folder as I thought (hoped) it would fix the problem, but it doesn't appear to have done so. This is the content of the htaccess file:

Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^millsheating.co.uk
RewriteRule (.*) http://www.millsheating.co.uk/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.millsheating.co.uk/ [R=301,L]
AddType x-mapp-php5 .php
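One thing worth noting (a sketch, assuming Apache mod_rewrite): the rules above only match /index.html, while the duplicate URL reported is the extensionless /index, so a variant that catches both might look like:

    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index(\.html)?\ HTTP/ [NC]
    RewriteRule ^index(\.html)?$ http://www.millsheating.co.uk/ [R=301,L]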
Intermediate & Advanced SEO | JasonHegarty
-
Accepting RSS feeds. Does it = duplicate content?
Hi everyone, for a few years now I've allowed school clients to pipe their news RSS feeds to their public accounts on my site. The result is a daily display of the most recent news happening on their campuses that my site visitors can browse. We don't republish the entire news item; just the headline and the first 150 characters of the article, along with a "Read more" link for folks to click if they want the full story over on the school's site. Each item has its own permanent URL on my site. I'm wondering if this is a wise practice. Does this fall into the territory of duplicate content even though we're essentially providing a teaser for the school? What do you think?
Intermediate & Advanced SEO | peterdbaron
-
Should I robots block site directories with primarily duplicate content?
Our site, CareerBliss.com, primarily offers unique content in the form of company reviews and exclusive salary information. As a means of driving revenue, we also have a lot of job listings in our /jobs/ directory, as well as educational resources in /career-tools/education/. The bulk of this information comes from feeds and exists on other websites (duplicate). Does it make sense to go ahead and robots-block these portions of our site? My thinking is that in doing so, it will help reallocate our site authority, helping the /salary/ and /company-reviews/ pages rank higher, and this is where most people are finding our site via search anyway. e.g. http://www.careerbliss.com/jobs/cisco-systems-jobs-812156/ http://www.careerbliss.com/jobs/jobs-near-you/?l=irvine%2c+ca&landing=true http://www.careerbliss.com/career-tools/education/education-teaching-category-5/
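For illustration, blocking those two directories from crawling is a few lines in robots.txt (a sketch; note that Disallow blocks crawling, and already-known URLs can still appear in the index if linked elsewhere):

    User-agent: *
    Disallow: /jobs/
    Disallow: /career-tools/education/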
Intermediate & Advanced SEO | CareerBliss