Geographic site clones and duplicate content penalties
-
We sell wedding garters. Niche, I know!
We have a site (weddinggarterco.com) that ranks very well in the UK and sells a lot to the USA despite its rudimentary currency handling (Shopify makes US customers check out in GBP, which doesn't help conversions).
To improve this I built a clone (theweddinggarterco.com) and have faked a kind of location selector at the top right. Needless to say, a lot of the content on this site is VERY similar to the UK version. My questions are...
1. Is this likely to stop me ranking the USA site?
2. Is this likely to harm my UK rankings?
Any thoughts very welcome! Thanks. Mat
-
Well, I may be biased because this is what I wanted to hear, but personally I think this is spot on, particularly the Kissmetrics article from a later response. I have already set the geo-targeting and will also sort out the hreflang tags.
I plan to leave both sites on .com domains; in the UK, .coms are just as 'normal' as .co.uks. All content has been updated to US English and with specific, relevant info, so I think it's just down to the usual link building and adding content to get it to rank.
I genuinely appreciate all the responses, fantastically useful, thank you!
Mat
-
Hi Dave,
Because it's a bot that's examining the site, you need the hreflang and geo-targeting. Algorithms are not perfect and mistakes do happen, but I am convinced that in the long run you win by staying close to the guidelines (and certainly by putting the interests of your visitors/customers first).
Personally, I think this whole duplicate content issue is a bit overrated (and I am not the only one - check this post on Kissmetrics). In most cases, when Google finds duplicate content it will just pick one of the sites to show in the results and not show the others, unless the duplication has a clear intent to spam. Panda is mainly about thin and/or low-quality content, or content duplicated from other sites (without hreflang/geo-targeting etc.), so I would consider the risk in this case rather low.
There was a discussion on the Google product forums which is quite similar to this one (Burberry had a massive traffic drop on its US site), and the answer from JohnMu from Google was quite similar to the answer I gave: use geo-targeting and hreflang.
rgds,
Dirk
-
I do agree that, taking the guidelines verbatim, you could make a good case. My concern is that it's not some guy at Google sitting down, judging sites and asking, "Does this violate the guidelines?" It's a bot, and as I'm sure everyone here can attest, Panda and Penguin aren't perfect. Just ask Barry Schwartz of the very credible SE Roundtable about getting hit with a Panda false positive on content issues and the traffic it cost him. Or you can read his post on it here.
Or maybe I'm just paranoid. That could well be.
-
Hi,
I tend to disagree with the answers above. If you check the "official" Google point of view, it states: "This (=duplicate content) is generally not a problem as long as the content is for different users in different countries."
So - you should make it obvious that the content is for different users in different countries.
1. Use Webmaster Tools to set the target geography:
- set weddinggarterco.com to UK
- set theweddinggarterco.com to US
You could also consider putting weddinggarterco.com on weddinggarter.co.uk and redirecting weddinggarterco.com to the .co.uk version (currently the redirection is the other way round). That way you could leave theweddinggarterco.com without a specific geo-target (useful if you also want to target countries like AU).
2. Use hreflang on both sites (on all the pages). You can find a generator here and a tool to check whether it's properly implemented here. Other interesting articles on hreflang can be found here and here. A minimal example of the markup is sketched just after this list.
3. It seems you have already adapted a few pages to be more tailored to the US market (shipping, prices); I'm not sure if you have already put the content in US English.
4. I imagine the sites are hosted in the UK. Make sure that the US version loads fast enough: check both versions on webpagetest.org with US and UK IPs and see if there is a difference in load times. If you're not using one already, consider a CDN.
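To make point 2 concrete, here is a rough sketch of what the annotations could look like on the two homepages (the exact URLs and protocol are illustrative; adjust them to however the sites actually resolve):

    <!-- in the <head> of BOTH the UK homepage (weddinggarterco.com) and the US homepage (theweddinggarterco.com) -->
    <link rel="alternate" hreflang="en-gb" href="http://weddinggarterco.com/" />
    <link rel="alternate" hreflang="en-us" href="http://theweddinggarterco.com/" />
    <!-- optional: pick one version as the fallback for visitors from everywhere else -->
    <link rel="alternate" hreflang="x-default" href="http://weddinggarterco.com/" />

The same pair is repeated on every page, with each page pointing at its exact counterpart on the other domain; the annotations must be reciprocal or Google will ignore them.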
If you do all of the above, you normally should be fine. Hope this helps,
Dirk
-
Hi there.
You could face a duplicate content issue. What you can do is use hreflang and/or canonical links. That should keep things in order and help make sure your rankings don't drop.
Cheers.
-
There are always exceptions to rules, but to be safe I would highly recommend blocking the new US site (theweddinggarterco.com) until you can get some real unique content on it. It stands a high chance of being devalued itself (almost certain) and may impact the original UK site (and really... why risk it).
If the scenario were mine I'd have simply built in customized pricing and other relevant information based on IP, but if that's not your area (and fair enough, as that can get a bit complicated) then the redirection you're doing now to just get people to the right site is the logical option. I'd block the new site in your robots.txt and put the noindex,nofollow meta in there for good measure, then start working on some good unique content; and if you won't have time for that, just enjoy your UK rankings.
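For reference, a minimal sketch of that blocking setup on the new US site (domain assumed from the thread):

    # robots.txt on theweddinggarterco.com
    User-agent: *
    Disallow: /

    <!-- plus, in the <head> of every page -->
    <meta name="robots" content="noindex,nofollow" />

One caveat: while robots.txt is blocking crawling, Google can't actually see the noindex tag, so some people leave crawling open until the pages have dropped out of the index and only add the Disallow afterwards.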