.com and .co.uk duplicate content
-
hi mozzers
I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag pointing to the .co.uk version? Rewriting the titles?
-
Just a quick follow-up: the client in question, in their wisdom, decided to put the US website live without telling me, and our UK rankings have dropped significantly. Do you think the tag will start to fix this?
-
It's unlikely the duplicate site is what caused the drop, because Google normally gives preference to the original for a fairly long period of time. With Google there are no certainties, but they get this right in almost all cases I have seen.
The only visitors you should see decline on the .co.uk site are non-UK visitors, as the x-default annotation tells Google they should be sent to the .com.
Many huge companies have adopted this setup, as have thousands of smaller sites, and I think Google has ironed out most of the issues over the last two years. You are more likely to see slower uptake on the new domain than on the original, rather than the other way around.
Hope that helps
-
Hi Gary,
Thanks for the help. As a UK website we primarily want to rank in the UK, but we obviously also want to rank in the US. Is launching the .com website (which is brand new) likely to affect our UK rankings, or should they be unaffected?
Thanks again,
Karl
-
The actual page you want to look at is https://support.google.com/webmasters/answer/189077
hreflang is the tag you should implement.
I have had long chats with John Mueller at Google about this.
Your setup should be something like this on all pages on both sites.
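A minimal sketch, assuming the .co.uk targets UK English, the .com targets US English, and the .com is the x-default (example.co.uk and example.com are placeholder domains; each page pair needs its own matching set of URLs):

<!-- Hypothetical hreflang annotations, placed in the <head> of every
     corresponding page on BOTH sites -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page/" />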
Within about seven days, depending on the size of your website, the .com should start to appear in place of the .co.uk in your US-based results. For me it happened within an hour!
Setting your .com as the x-default will be better than setting your .co.uk. The .co.uk is already a region-specific TLD and will generally not rank well in search results outside the UK, even if the hreflang annotations say otherwise.
This lets Google decide where to send traffic based on its own algorithm and data.
If you use a canonical tag instead, you will be suggesting/pushing US users to the original .co.uk content rather than the US site.
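To make that concrete, a hypothetical cross-domain canonical like this on the .com pages (again, placeholder URLs) would consolidate everything back to the original and effectively hide the US site from US searchers:

<!-- What NOT to do on the .com pages in this scenario -->
<link rel="canonical" href="http://www.example.co.uk/page/" />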
-
OK, thanks for the help. I'll look into it and see what it says. The .com website is up now and they are hell-bent on it staying! I did recommend a /us subdirectory but they preferred the .com!
Anyway thanks for the advice!
-
Hiya,
The alternate tag is a good start, but you may want to do some more reading; I'll put some links below. It's worth trying to make unique content, or using a structure like www.example.com/us, which may be an easier short-term option until you've got enough content to support a separate .com site.
http://moz.com/community/q/duplicate-content-on-multinational-sites
https://support.google.com/webmasters/answer/182192#3
I always find it nicer to formulate your own answers and learn a bit along the way, so I hope the above helps you do that.
-
Thanks Chris,
So would you implement the rel="alternate" hreflang tag then?
-
A similar question was posted not so long ago and there are some great points in it worth a look - http://moz.com/community/q/international-web-site-duplicate-content
Florin Birgu brings up some fantastic points and I'll bet they answer your question. If you're still stuck, let us know and I'm sure we can help you.