.com and .co.uk duplicate content
-
Hi Mozzers,
I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag pointing to the .co.uk version? Rewriting the titles?
-
Just a quick question: the client in question, in their wisdom, decided to put the US website live without telling me, and our UK rankings have dropped significantly. Do you think the tag will start to fix this?
-
It is unlikely, because Google normally gives preference to the original site for a fairly long period of time. With Google there are no certainties, but they get this right in almost all the cases I have seen.
The only users you should see decline on your site are non-UK visitors, as the x-default annotation tells Google they should be sent to the .com.
Many huge companies have adopted this process, along with thousands of smaller sites, and I think Google has ironed out most of the issues over the last two years. You are more likely to see a slow uptake on the new domain than any drop on the original.
Hope that helps.
-
Hi Gary,
Thanks for the help. As a UK website we primarily want to rank in the UK, but we obviously want to rank in the US as well. Is launching the .com website (which is brand new) likely to affect our UK rankings, or should they be unaffected?
Thanks again,
Karl
-
The actual page you want to look at is https://support.google.com/webmasters/answer/189077
hreflang is the tag you should implement.
I have had long chats with John Mueller at Google about this.
Your setup should be something like this on all pages on both sites:
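A minimal sketch, assuming www.example.co.uk and www.example.com are placeholders for the two live domains and both serve English content:

<!-- hypothetical hreflang markup for one page; swap the placeholder domains for the real URLs -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page/" />

Every page needs its own set of these in the <head>, with the href values pointing at the equivalent page on the other domain (not just the homepage), and the annotations must be reciprocal across both sites.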
Within about 7 days, depending on the size of your website, the .com should start appearing instead of the .co.uk in your US results. For me it happened within an hour!
Setting your .com as the x-default will be better than setting your .co.uk. The .co.uk is already a region-specific TLD and will generally not rank well outside the UK in other search engines, even if the hreflang tags say otherwise.
This will let Google decide where to send traffic to, based on its algorithm and data.
If you use a canonical tag instead, you will be suggesting/pushing US users to the original .co.uk content rather than the US site.
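In other words, the pattern to avoid on the .com pages looks like this (placeholder domain again):

<!-- avoid this on the .com: a cross-domain canonical declares the .co.uk copy the preferred version, so US searchers keep getting sent to the UK site -->
<link rel="canonical" href="http://www.example.co.uk/page/" />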
-
OK, thanks for the help. I'll have a look into it and see what it says. The .com website is up now and they are hell-bent on it staying! I did recommend a /US structure, but they preferred the .com!
Anyway, thanks for the advice!
-
Hiya,
The alternate tag is a good start, but you may want to do some more reading; I'll put some links below. It's also worth trying to create unique content, or using a structure like www.example.com/us, which may be an easier short-term option until you've got enough content for a standalone .com site (there's a rough sketch of that setup after the links).
http://moz.com/community/q/duplicate-content-on-multinational-sites
https://support.google.com/webmasters/answer/182192#3
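On the /us structure suggested above: hreflang works the same way with subfolders, so a rough sketch of that setup (placeholder URLs, and assuming the UK site stays on the .co.uk) might be:

<!-- hypothetical subfolder setup: US content lives under /us/ on the .com -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/page/" />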
I always find it nicer to formulate your own answers and learn a bit along the way, so I hope the above helps you do that.
-
Thanks Chris,
So would you implement the rel="alternate" hreflang tag then?
-
A similar question was posted not so long ago and there are some great points in it worth a look: http://moz.com/community/q/international-web-site-duplicate-content
Florin Birgu brings up some fantastic points, and I'll bet they answer your question. If you're still stuck, let us know and I'm sure we can help you.
-
Related Questions
-
Duplicate Content on a Page Due to Responsive Version
What are the implications if a web designer codes the content of the site twice into the page in order to make the site responsive? I can't share the URL, I'm afraid, but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a WordPress site. Is Google clever enough to distinguish between the two versions and treat them individually? Or will Google think that the content has been repeated on the same page?
Technical SEO | Wagada
-
JavaScript tabbed navigation and duplicate content
I'm working on a site that has four primary navigation links, and under each is a tabbed navigation system for second-tier items. The primary link page loads content for all tabs, which are JavaScript-controlled. Users click a primary navigation item such as "Our Difference" (http://www.holidaytreefarm.com/content.cfm/Our-Difference) and see several options, with each tab's content in a separate section. Each second-tier tab is also available via sitemap/direct link (e.g. http://www.holidaytreefarm.com/content.cfm/Our-Difference/Tree-Logistics) without the JS navigation, so the content on that page is specific to the tab, not all tabs. In this scenario, will there be duplicate content issues? And what is the best way to remedy them? Thanks for your help!
Technical SEO | Total-Design-Shop
-
UK and US Targeting Simultaneously - Domain Setup and Duplicate Content?
I have a site that will be targeting both the US and the UK. However, it will need to display slightly different content to the two. Should I use a .co.uk and a .com, or uk.themainsite.com for the UK, or themainsite.com/UK? This would of course mean setting up multiple-country targeting within Google Webmaster Tools. Am I likely to run into duplicate content issues?
Technical SEO | james406
-
Duplicate content - WordPress image attachment
I have run my SEOmoz campaign on my WordPress site and found duplicate content. However, all of this duplicate content was my logo or images rather than text content, at addresses like /?attachment_id=4, for example. How should I resolve this? Thank you.
Technical SEO | htmanage
-
404s and duplicate content
I have real estate websites that add new pages when new listings come on the market and then delete pages when the property is sold. My concern is that a significant number of 404s are created, and the listing pages that are added will be the same as those of others in my market who use the same IDX provider. I could switch to an IDX provider that uses an iframe, which doesn't create new pages, but when I used an iframe before, my time on site was 3 min with 2.5 pages per visit, and now it's 6+ min with 7.5 pages per visit. The new listing pages add fresh content daily, so which is better: fresh content and stronger on-site metrics (with the 404s), or fewer 404s and no duplicate content but weaker on-site metrics? Any thoughts on this issue? Any advice would be appreciated.
Technical SEO | AnthonyLasVegas
-
How do I deal with duplicate content on the same domain?
I'm trying to find out if there's a way we can combat similar content on different pages of the same site without having to rewrite the whole lot. Any ideas?
Technical SEO | indurain
-
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context, which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those link to the same chapter in the book of James, yet the first two will highlight verse 5 and verses 5-10 respectively. This is a good user experience because, in other sections of our site and on blogs throughout the world, webmasters link to specific verses so the reader can see the verse in the context of the rest of the chapter. Another Bible site has separate HTML pages for each individual verse and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, realizing that we can't revert to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO
-
Mapping Internal Links (Which are causing duplicate content)
I'm working on a site that is throwing off a lot of duplicate content for its size. A lot of it appears to be coming from bad links within the site itself, which were created when it was ported over from static HTML to ExpressionEngine (by someone else). I'm finding EE an incredibly frustrating platform to work with, as it appears to redirect 404s on sub-pages to the page directly above each sub-page without actually returning a 404 response. It's very weird. Does anyone have any recommendations for software to clearly map out a site's internal link structure, so that I can find which bad links are pointing to the wrong pages?
Technical SEO | BedeFahey