Geographic site clones and duplicate content penalties
-
We sell wedding garters - niche, I know!
We have a site (weddinggarterco.com) that ranks very well in the UK and sells a lot to the USA despite its rudimentary currency functions (Shopify makes US customers check out in GBP, which is not helpful for conversions).
To improve this I built a clone (theweddinggarterco.com) and have faked a kind of location selector top right. Needless to say, a lot of the content on this site is VERY similar to the UK version. My questions are...
1. Is this likely to stop me ranking the USA site?
2. Is this likely to harm my UK rankings?
Any thoughts very welcome! Thanks. Mat
-
Well, I may be biased because this is what I wanted to hear, but personally I think this is spot on - particularly the Kissmetrics article from a later response. I have already set up geo-targeting and will also sort out the hreflang tags.
I plan to leave both sites on .com domains - in the UK, .coms are just as 'normal' as .co.uks. All content has been updated to US English and with specific, relevant info, so I think it's just down to the usual link building and adding content to get it to rank.
I genuinely appreciate all the responses, fantastically useful, thank you!
Mat
-
Hi Dave,
Because it's a bot that's examining the site, you need the hreflang and geo-targeting. Algorithms are not perfect, and mistakes do happen, but I am convinced that in the long run you win by staying close to the guidelines (and certainly by putting the interests of your visitors/customers first).
Personally, I think this whole duplicate content issue is a bit overrated (and I am not the only one - check this post on Kissmetrics). In most cases, when Google finds duplicate content it will just pick one of the sites to show in the results and not show the others, unless the duplication has a clear intent to spam. Panda is mainly about thin and/or low-quality content, or content duplicated from other sites (without hreflang/geo-targeting, etc.), so I would consider the risk in this case rather low.
There was a discussion on the Google product forums which is quite similar to this one (Burberry had a massive traffic drop on its US site) - and the answer from JohnMu of Google was quite similar to the answer I gave: use geo-targeting and hreflang.
rgds,
Dirk
-
I do agree that, with the guidelines taken verbatim, you could make a good case. My concern is that it's not some person at Google sitting down, judging sites and asking, "Does this violate the guidelines?" - it's a bot, and as I'm sure everyone here can attest ... Panda and Penguin aren't perfect. Just ask Barry Schwartz of the very credible SE Roundtable about getting hit with a Panda false positive on content issues and about the traffic it cost. Or you can read his post on it here.
Or maybe I'm just paranoid. That could well be.
-
Hi,
I tend to disagree with the answers above. If you check the "official" Google point of view, it states: "This (= duplicate content) is generally not a problem as long as the content is for different users in different countries."
So - you should make it obvious that the content is for different users in different countries.
1. Use Webmaster Tools to set the target geography:
- set weddinggarterco.com to UK
- set theweddinggarterco.com to US
You could also consider putting weddinggarterco.com on weddinggarter.co.uk and redirecting weddinggarterco.com to the .co.uk version (currently the redirection is the other way round). This way you could leave theweddinggarterco.com without a specific geo-target (useful if you also want to target countries like AU).
2. Use hreflang on both sites (on all the pages) - a rough sketch of what the tags could look like is below, after this list. You can find a generator here and a tool to check whether it's properly implemented here. Other interesting articles on hreflang can be found here and here.
3. It seems you have already adapted a few pages to be more tailored to the US market (shipping, prices) - I'm not sure whether you have already put the content into US English.
4. I imagine the sites are hosted in the UK. Make sure that the .com version loads fast enough - check both versions on webpagetest.org with US and UK IPs and see if there is a difference in load times. If you're not using one already, consider a CDN.
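To make point 2 concrete, here is a minimal sketch of what the hreflang annotations could look like in the head of the two homepages (the exact URLs and the x-default choice are assumptions; every page on each site needs its own equivalent set pointing at its counterpart):

On https://weddinggarterco.com/ (UK version):
<link rel="alternate" hreflang="en-gb" href="https://weddinggarterco.com/" />
<link rel="alternate" hreflang="en-us" href="https://theweddinggarterco.com/" />
<link rel="alternate" hreflang="x-default" href="https://weddinggarterco.com/" />

On https://theweddinggarterco.com/ (US version), the same three tags are repeated:
<link rel="alternate" hreflang="en-gb" href="https://weddinggarterco.com/" />
<link rel="alternate" hreflang="en-us" href="https://theweddinggarterco.com/" />
<link rel="alternate" hreflang="x-default" href="https://weddinggarterco.com/" />

The annotations must be reciprocal: each page references itself and its alternate, and both sites carry the matching set (with page-level URLs on deeper pages), otherwise Google ignores them.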
If you do all of the above, you normally should be fine. Hope this helps,
Dirk
-
Hi there.
You can face a duplicate content issue. What you can do is use hreflang and/or canonical links. That should sort it out and help ensure that your rankings don't drop.
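To be precise about the canonical part (a sketch only, with the URL assumed from the domains discussed above): for two country-targeted sites you would normally give each page a self-referencing canonical on its own domain, e.g. on the US homepage:

<link rel="canonical" href="https://theweddinggarterco.com/" />

A cross-domain canonical pointing from one site to the other would do the opposite of what's wanted here - it asks Google to index only one of the two versions - so the country separation itself is left to hreflang and geo-targeting.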
Cheers.
-
There are always exceptions to rules, but for safety I would highly recommend blocking the .com site until you can get some real unique content on it. It stands a high chance of taking its own devaluation (almost certain) and may impact the .co.uk site (and really ... why risk it?).
If the scenario were mine, I'd have simply built in customized pricing and other relevant information based on IP, but if that's not your area (and fair enough, as that can get a bit complicated), then the redirection you're doing now just to get people to the right site is the logical option. I'd block the .com in your robots.txt and put the noindex,nofollow meta in there for good measure, then start working on some good unique content - and if you won't have time for that, just enjoy your UK rankings.
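For reference, "block it in robots and add the meta" would look roughly like this (a sketch, assuming the site being blocked is the duplicate .com). robots.txt at the root of that site:

User-agent: *
Disallow: /

And in the head of every page on it:

<meta name="robots" content="noindex,nofollow" />

One caveat worth knowing: if a page is disallowed in robots.txt, Googlebot can't crawl it to see the meta tag, so already-indexed URLs may linger. A common approach is to serve the noindex meta first and only add the robots.txt disallow once the pages have dropped out of the index.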
Related Questions
-
Semi-duplicate content yet authoritative site
So I have 5 real estate sites. One of those sites is of course the original, and it has more/better content on most of its pages than the other sites. I used to be top ranked for all of the subdivision names in my town. Then when I did the next 2-4 sites, I had some sites doing better than others for certain keywords, and I have 3 of those sites that have basically the same URL structures (besides the actual domain), and they aren't getting very many visits. I have a couple of agents that work with me that I loaned my sites to, to see if that would help since it would be a different name. My same YouTube video is on each of the respective subdivision pages of my site and theirs. Also, their content is just rewritten content from mine, at about the same length. I have looked around and seen a few of my competitors who only have one site; their URL structures aren't good at all, and their content isn't good at all, yet a good bit of their pages rank higher than my main site, which is very frustrating to say the least, since they are actually copycats of my site. I sort of started the precedent of content: mapping the neighborhood, noting how far that subdivision is from certain landmarks, and then shooting a video of each. They have pretty much done the same thing and are now ahead of me. What sort of advice could you give me? Right now, I have two sites that are almost duplicates in terms of template and subdivisions, although I did change the content as best I could, and that site is still getting pretty good visits. I originally did it to try and dominate the first page of the SERPs, and then Penguin and Panda came out and seemed to figure that game out. So now I would still like to keep all the sites, but I'm assuming that would entail making them all unique, which seems tough seeing as my town has the same subdivisions. Curious as to what the suggestions would be, as I have put a lot of time into these sites. If I post my site, will it show up in the SERPs? Thanks in advance
-
Duplicate content issue with pages that have navigation
We have a large consumer website with several sections that have navigation spanning several pages. How would I prevent those pages from getting duplicate content errors, and how would I best handle SEO for them? For example, we have about 500 events with 20 events showing on each page. What is the best way to prevent all the subsequent navigation pages from getting duplicate content and duplicate title errors?
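For instance, would something along these lines be the right approach (just a sketch with made-up URLs): give every paginated page a unique, numbered title and declare the sequence with rel prev/next, e.g. on page 3 of the events listing:

<title>Events - Page 3 of 25 | Example Site</title>
<link rel="prev" href="https://www.example.com/events/page/2/" />
<link rel="next" href="https://www.example.com/events/page/4/" />

That would at least make the titles unique and mark the pages as part of one series rather than copies of each other - or is there a better way?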
-
Launching a new website. Old inherited site cannot be saved after lifted penalty. When should we kill the old site and how?
Background information: A website that we inherited was severely penalized, and after the penalty was revoked the site still never resurfaced in rankings or traffic. Although it is a dramatic action, we have decided to launch a completely new version of the website. Everything will be new, including the imagery, branding, content, domain name, hosting company, registrar account, Google Analytics account, etc. Our question is: when do we pull the plug on the old site, and how do we go about doing it? We had heard advice that we should make sure we run both sites at the same time for 3 months, then deindex the old site using a noindex meta robots tag. We are cautious because we don't want the old website to be associated in any way, shape or form with the new website. We will purposely not be 301 redirecting any URLs from the old website to the new one. What would you do if you were in this situation?
-
Duplicate Internal Content on E-Commerce Website
Hi, I find my e-commerce pharmacy website is full of little snippets of duplicate content. In particular: a delivery info widget repeated on all the product pages, and product category information repeated across product pages (e.g. all medicines belonging to a certain category of medicines have identical side effects, and I also include a generic snippet about the condition the medicine treats). Do you think it will harm my rankings to do this?
-
Dealing with close content - duplicate issue for closed products
Hello, I'm dealing with some issues. Moz analysis is telling me that I have duplicate content on some of my product pages. My issue is that it concerns very similar products: the IT products are from the same range, and just the name and PDF are different. Do you think I should use canonical URLs? Or would it be better to rewrite about 80 descriptions (though the descriptions would be almost the same)? Best regards.
-
Trying to advise on what seems to be a duplicate content penalty
So a friend of a friend was referred to me a few weeks ago after his Google traffic fell off a cliff. I told him I'd take a look at it and see what I could find, and here's the situation I encountered. I'm a bit stumped at this point, so I figured I'd toss this out to the Moz crowd and see if anyone sees something I'm missing. The site in question is www.finishlinewheels.com. In mid-June, looking at the site's Webmaster Tools, impressions went from around 20,000 per day down to 1,000. Interestingly, some of their major historic keywords like "stock rims" had basically disappeared while some secondary keywords hadn't budged. The owner submitted a reconsideration request and was told he hadn't received a manual penalty. I figured it was the result of either an automated filter/penalty from bad links, a horribly slow server, or possibly a duplicate content issue. I ran the backlinks on OSE and Majestic and pulled the links from Webmaster Tools. While there aren't a lot of spectacular links, there also doesn't seem to be anything that stands out as terribly dangerous. Lots of links from automotive forums and the like - low authority and such, but in the grand scheme of things their links seem relevant and reasonable. I checked the site's speed in Analytics and WMT as well as some external tools, and everything checked out as plenty fast enough. So that wasn't the issue either. I tossed the home page into Copyscape and found the site brandwheelsandtires.com, which had completely ripped the site - thousands of the same pages with every element copied, including the phone number and contact info. Furthering my suspicions, the Internet Archive showed its first appearance was in mid-May, shortly before his site took the nosedive (still visible at http://web.archive.org/web/20130517041513/http://brandwheelsandtires.com). THIS, I figured, was the problem. Particularly when I started doing exact match searches for text on the finishlinewheels.com home page like "welcome to finish line wheels" and it was nowhere to be found. I figured the site had to be sandboxed. I contacted the owner and asked if this was his and he said it wasn't. So I gave him the contact info, he contacted the site owner and told them it had to come down, and the owner apparently complied because it was gone the next day. He also filed a DMCA complaint with Google, and they responded after the site was gone and said they didn't see the site in question (seriously, the guys at Google don't know how to look at their own cache?). I then had the site owner send them a list of cached URLs for this site, and since then Google has said nothing. I figure at this point it's just a matter of Google running its course. I suggested he revise the home page content and build some new quality links, but I'm still a little stumped as to how/why this happened. If it was seen as duplicate content, how did this site with no links and zero authority manage to knock out a site that had been around for 7 years and ranked well for hundreds of terms? I get that it doesn't have a ton of authority, but this other site had none. I'm doing this pro bono at this point, but I feel bad for this guy as he's losing a lot of money at the moment, so any other eyeballs that see something that I don't would be very welcome. Thanks Mozzers!
-
Duplicate content from development website
Hi all - I've been trawling for duplicate content and I stumbled across a development URL, set up by a previous web developer, which nearly mirrors the current site (a few content and structure changes since then, but otherwise it's all virtually the same). The developer didn't take it down when the site was launched. I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do? Thanks, Luke
-
Handling Similar page content on directory site
Hi all, SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and wanted to know the best way to go about this. I was thinking I could put a rel="nofollow" on all the links to those pages, but I'm not sure if that is the correct way to do this. Since the folders are deep within the site and not under one main folder, it would mean I would have to do a disallow for many folders if I did this through robots.txt. The other thing I am thinking of is doing a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these duplicate pages from my SEO report and from the search engine index? Thanks!