How best to handle (legitimate) duplicate content?
-
Hi everyone, I'd appreciate any thoughts on this (bit long, sorry).
I'm working on 3 sites selling the same thing. The main difference between each site is the physical location/target market area (think North, South, and West as an example).
Now, say these 3 sites all sell Blue Widgets, and so all on-page optimisation has been done for this keyword.
These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.).
The sites share the same template/look and feel too, AND are accessed via the same IP - just for good measure.
So, on to questions/thoughts.
1 - Is it enough to try and get creative with on-page changes to 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I'd like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites, or block them via robots.txt (see the snippet at the end of this post) - seems a bit drastic though.
3 - I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 of the 3 Blue Widgets pages redundant?
4 - Is there anything HTML-wise I could do to pull site 1's content into sites 2 and 3, without cloaking or anything nasty like that?
I think 1 is the first thing to do. Anything else? Many thanks.
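For concreteness, option 2 would mean something like this in the head of the Blue Widgets page on sites 2 and 3 (just a sketch; the robots.txt path is made up):

    <!-- Option 2 sketch: keep the duplicate Blue Widgets page out of the index -->
    <meta name="robots" content="noindex, nofollow">

or the robots.txt alternative on each duplicate site:

    # Placeholder path for the duplicate page(s)
    User-agent: *
    Disallow: /blue-widgets/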
-
I think your header links will look spammy.
Also, you're sharing out your PageRank to your duplicate sites! I would either remove the links or nofollow them (are the links of value to your visitors? If not, get rid!).
-
Great help here folks, thanks.
One last question if I may - each of the 3 sites links to the other 2 in the header (on every page), so I've got hundreds of cross-referencing links.
Any value in making them rel="nofollow"? I don't necessarily want to remove them.
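i.e. changing the header links to something like this (domain names are placeholders):

    <!-- Header cross-links with nofollow; domains are made up -->
    <a href="http://north-site.example/" rel="nofollow">North site</a>
    <a href="http://south-site.example/" rel="nofollow">South site</a>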
-
IIS7 supports a type of mod_rewrite. But even if you can't use that, you should have access to ASP or .NET, and you can easily use those to do your 301s.
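For instance, with the URL Rewrite module installed on IIS7, a site-wide 301 to the main site could look roughly like this in web.config - a sketch only, and site1.example is a placeholder:

    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <!-- 301 every URL on this site to the equivalent URL on site 1 -->
            <rule name="Redirect to main site" stopProcessing="true">
              <match url="(.*)" />
              <action type="Redirect" url="http://site1.example/{R:1}" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>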
-
IIS has no problems doing 301s, and if you can use PHP, ASP or anything similar, you can just manually put a 301 on each page if that fails.
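A manual per-page 301 in PHP, for example, is just a couple of lines at the very top of the page, before any output is sent - a sketch, with a placeholder target URL:

    <?php
    // Permanent redirect to the matching page on site 1 (URL is a placeholder)
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://site1.example/blue-widgets.html");
    exit;
    ?>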
No rel=canonical solution will result in all 3 sites ranking, as far as I am aware.
Your best option is usually one site with geo-targeted pages. If it has to be 3 sites, then the only real option is to make all that content unique, on unique IPs etc., which at the end of the day is 3x the work or more.
-
No problem, best of luck and let us know how you get on!
-
Thanks for all the replies, everyone. Tricky, isn't it?
Moving to 1 site is probably the best medium/long-term option. The 3-sites thing is historical, in that sites 2 and 3 were purchased (physically) by the owner over the last few years.
The biggest problem with going totally new is that (AFAIK anyway, according to the hosting company) I can't 301 the old sites to this new site due to the shared hosting issue (using IIS as well, not Apache), so perhaps getting them split out is the proper interim measure. (I might be able to do something via WMTools with this, I guess.)
I'll do some more research into cross-domain canonical usage and attempt the on-page rewrite, as well as talking to the client about moving the sites to unique hosts.
Thanks again.
-
Why is it hard to restate the content in a different way? Reword it. If it's products, then change the order and write unique content at the bottom. By East/West/North/South, exactly what types of regions are you talking about, and why do you need three sites to accomplish this instead of one site with geo-targeted LPs?
-
You can certainly use the canonical; however, you probably won't rank from domains 2 and 3, as you're telling Google not to attribute the content to those domains.
I'm still missing the bit where having three regionalised sites is beneficial to your visitors. Why not make one general site with the products and then do some geo-targeted pages? (That's what I would do; it makes for a much simpler task.)
Best of luck whichever way you go, but come back and let us know what happens.
-
The benefit to the user is that they will need to visit the physical premises to view/purchase, and as such they wouldn't click on, say, the North site (even if it ranked in the top 2 or 3) if they were in the South.
Are you (both) saying it'd be OK to put a rel=canonical pointing to domain1/page.html on domains 2 and 3 (i.e. different domain names)?
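i.e. something like this in the head of the Blue Widgets page on domains 2 and 3 (domain and filename are placeholders):

    <!-- Cross-domain canonical pointing at the original page on domain 1 -->
    <link rel="canonical" href="http://domain1.example/blue-widgets.html">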
Thanks.
-
How is this "for good measure"?
"The sites share the same template/look and feel too, AND are accessed via the same IP - just for good measure"
Make them as unique and separate as possible: different templates, different hosting, different email contact, different contact info on the domain registrations; write content on the page and geo-target the wording.
-
What is the benefit to the user of individual sites for North, South and West?
Are you not just creating a lot of work for yourself? Especially since, as you state, you "would like to rank well for Blue Widgets generally", which ultimately means each site is competing against the others.
I would rethink the strategy. You're more likely to rank 'generally' for your chosen terms if you focus your efforts on one site, and perhaps use canonical tags on the other two to ensure Google knows who to attribute the content to.
-
There aren't too many options here. Geotargeting (even locally) tends to produce duplicate content. The only option, really, is to canonical all your products to one place. If you do it right, you might be able to rank all three sites for your keyword.
You can try #1, but, as you said, it's hard to restate the same content in a non-duplicated way.