How best to handle (legitimate) duplicate content?
-
Hi everyone, appreciate any thoughts on this. (bit long, sorry)
I'm working on 3 sites selling the same thing... the main difference between each site is physical location/target market area (think North, South, and West as an example).
Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword.
These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.).
The sites share the same template/look and feel too AND are accessed via the same IP - just for good measure.
So - to questions/thoughts.
1 - Is it enough to try and get creative with on-page changes to 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I'd like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites, though that seems a bit drastic (or block them via robots.txt) - see the sketch after this list.
3 - I could even link (via internal navigation) from sites 2 and 3 to site 1's Blue Widgets page, and thus make 2 of the Blue Widgets pages redundant?
4 - Is there anything HTML-coding-wise I could do to pull in site 1's content to sites 2 and 3, without cloaking or anything nasty like that?
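As a hedged illustration of option 2, a minimal sketch of the robots meta tag that would go in the head of the Blue Widgets page on sites 2 and 3 (noindex alone would also work if you still want the page's links crawled):

```html
<!-- Option 2 sketch: placed in the <head> of the Blue Widgets page on
     sites 2 and 3, this asks search engines not to index the page or
     follow its links. -->
<meta name="robots" content="noindex, nofollow" />
```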
I think 1 is the first thing to do. Anything else? Many thanks.
-
I think your header links will look spammy.
Also, you're sharing out your PageRank to your duplicate sites! I would either remove the links or nofollow them (are the links of value to your visitors? If not, get rid!).
-
Great help here folks, thanks.
One last question if I may - each of the 3 sites links to the other 2 in the header (on every page), so I've got x00 cross-referencing links.
Any value in making them rel="nofollow"? I don't necessarily want to remove them.
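For reference, a hedged sketch of what one of those header links might look like with nofollow added - the URL and anchor text are placeholders, not the actual sites:

```html
<!-- Sketch: a cross-site header link with rel="nofollow", so it no longer
     passes PageRank to the sister site. The URL is a placeholder. -->
<a href="http://www.north-site.example/" rel="nofollow">Blue Widgets North</a>
```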
-
IIS7 supports a URL Rewrite module similar to Apache's mod_rewrite. But even if you can't use that, you should have access to ASP or .NET and can easily use those to do your 301s.
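As an illustration, a minimal sketch of a site-wide 301 using the URL Rewrite module's web.config syntax - the domain is a placeholder, and this assumes the module is installed on the server:

```xml
<!-- Hypothetical web.config fragment: permanently (301) redirects every
     request on this site to the same path on site 1. Assumes the IIS
     URL Rewrite module is available. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="301 to main site" stopProcessing="true">
          <match url="(.*)" />
          <action type="Redirect" url="http://www.site1.example/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```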
-
IIS has no problems doing 301s, and if you can use PHP, ASP or anything similar, you can just manually put a 301 on each page if that fails.
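For example, a minimal sketch of a manual 301 at the top of a classic ASP page - the target URL is a placeholder for the page's new home on the main site:

```asp
<%
' Hypothetical sketch: a hand-placed 301 at the top of one classic ASP page.
' Replace the Location URL with the real target page on the main site.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.site1.example/blue-widgets.asp"
Response.End
%>
```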
No rel=canonical solution will result in all 3 sites ranking, as far as I am aware.
Your best option is usually one site with geo-located pages. If it has to be 3 sites, then the only real option is to make all that content unique, on unique IPs etc., which at the end of the day is 3x the work or more.
-
No problem, best of luck and let us know how you get on!
-
Thanks for all the replies, everyone. Tricky, isn't it?
Moving to 1 site is probably the best medium/long-term option. The 3-sites thing is historical, in that sites 2 and 3 were purchased (physically) by the owner over the last few years.
The biggest problem with going totally new is that (AFAIK anyway, according to the hosting company) I can't 301 the old sites to this new site due to the shared hosting issue (we're on IIS as well, not Apache), so perhaps getting them split out is a proper interim measure. (I might be able to do something via WMTools with this, though, I guess.)
Will do some more research into cross-domain canonical use and attempt the on-page rewrite, as well as talking to the client about moving the sites to unique hosts.
Thanks again.
-
Why is it hard to restate the content in a different way? Reword it. If it's products, then change the order and write unique content at the bottom. By east/west/north/south, exactly what types of regions are you talking about, and why do you need three sites to accomplish this instead of one with geo-targeted LPs?
-
You can certainly use the canonical; however, you probably won't rank from domains 2 and 3, as you're telling Google not to attribute the content to those domains.
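To illustrate, a hedged sketch of the cross-domain canonical tag that would go in the head of the Blue Widgets page on domains 2 and 3 - the URL is a placeholder for the actual page on domain 1:

```html
<!-- Sketch: placed on domains 2 and 3, this tells Google to attribute the
     Blue Widgets content to the copy on domain 1. The URL is a placeholder. -->
<link rel="canonical" href="http://www.domain1.example/blue-widgets.html" />
```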
I'm still missing the bit where having three regionalised sites is beneficial to your visitors. Why not make one general site with the products and then do some geo-targeted pages? (That's what I would do - it makes for a much simpler task.)
Best of luck whichever way you go, but come back and let us know what happens.
-
The benefit to the user is that they will need to visit the physical location to view/purchase, and as such wouldn't click on, say, the North site (even if it was in the top 2 or 3) if they were in the South.
Are you (both) saying it'd be OK to use rel=canonical pointing at domain1/page.html on domains 2 and 3? (i.e. across different domain names)
Thanks.
-
How is this 'for good measure'?
"The sites share the same template/look and feel too AND are accessed via the same IP - just for good measure :)"
Make them as unique and separate as possible: different templates, different hosting, different email contact, different contact info on the domain registration. Write content on the page and geo-target the wording.
-
What is the benefit to the user of individual sites for North, South, and West?
Are you not just creating a lot of work for yourself, especially since, as you state, you "would like to rank well for Blue Widgets generally", which ultimately means each site is competing against the others?
I would rethink my strategy. You're more likely to rank 'generally' for your chosen terms if you focus your efforts on one site and perhaps use canonical tags on the other two to ensure Google knows who to attribute the content to.
-
There aren't too many options here. Geo-targeting (even locally) tends to produce duplicate content. The only option, really, is to canonical all your products to one place. If you do it right, you might be able to rank all three sites for your keyword.
You can try #1 but, as you said, it's hard to restate the same content in a non-duplicated way.
Related Questions
-
Best way to "Prune" bad content from large sites?
I am in process of pruning my sites for low quality/thin content. The issue is that I have multiple sites with 40k + pages and need a more efficient way of finding the low quality content than looking at each page individually. Is there an ideal way to find the pages that are worth no indexing that will speed up the process but not potentially harm any valuable pages? Current plan of action is to pull data from analytics and if the url hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them and I want to make sure we don't lose that link juice. But, assuming we just no index the pages we should still have the authority pass along...and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on best way to prune content on sites with hundreds of thousands of pages efficiently? Â Also, is there a benefit to no indexing the pages vs deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | | atomiconline0 -
Duplicating relevant category content in subcategories - good or bad for Google ranking?
On a travel-related site I have city categories with city-related information. Would you recommend for or against duplicating some of that relevant city-related content on subcategory pages? For visitors it would be useful, and Google would have more context about the topic of our page. But my main concern is how this may be perceived by Google, and especially whether it may make it more likely we get penalized for thin content. We were already hit at the end of June by Panda/Phantom, and we are working on adding more unique content, but this is something we could do additionally and basically instantaneously. I just do not want to make things worse.
Intermediate & Advanced SEO | lcourse
-
Penalties for duplicate content
Hello! We have a website with various city tours and activities listed on a single page (http://vaiduokliai.lt/). The list changes depending on filtering (birthday in Vilnius, bachelor party in Kaunas, etc.), but the URL doesn't change; the content changes dynamically. We need to make the URL visible for each category, then optimize it for different keywords (for example, "city tours in Vilnius" for a list of tours and activities in Vilnius, with an appropriate URL like /tours-in-Vilnius). The problem is that activities very often overlap across different categories, so there will be a lot of duplicate content on different pages. In such a case, how severe could the penalty for duplicate content be?
Intermediate & Advanced SEO | jpuzakov
-
Do search engines consider this duplicate or thin content?
I operate an eCommerce site selling various equipment. We get product descriptions and various info from the manufacturers' websites offered to dealers. Part of that info is in the form of user guides and operation manuals written by the manufacturer, downloaded in PDF format and then uploaded to our site. We also embed and link to videos hosted on the manufacturers' respective YouTube or Vimeo channels. This is useful content for our customers. My questions are: Does this type of content help our site by offering useful info, or does it hurt our SEO because it is thin and/or duplicate content? Or do the original content publishers get all the benefit? Is there any benefit to us publishing this stuff? And what exactly is considered "thin content"?
Intermediate & Advanced SEO | MichaelFactor
-
How to set up canonical tags to eliminate a duplicate content error
Google Webmaster Tools, under HTML Improvements, is showing duplicate meta descriptions for 2 similar pages. The 2 pages are for a building address; the URL has several pages because there are multiple property listings for this building. The URLs in question are www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan/page/3 and www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan. How do I correct this error using canonical tags? Do I enter the URL of the 1st page under "Canonical URL" under "Advanced" to show Google that these pages are one and the same? If so, do I enter the entire URL into this field (www.metro-manhattan.com/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan) or an abbreviated version (/601-west-26th-street-starrett-lehigh-building-contains-executive-office-space-manhattan)? Thanks!! Alan
Intermediate & Advanced SEO | Kingalan1
-
WordPress and duplicate content
Hi, I have recently installed WordPress and started a blog, but now loads of duplicate pages are cropping up for tags, authors, dates, etc. How do I do the canonical thing in WordPress? Thanks, Ian
Intermediate & Advanced SEO | jwdl
-
Moving some content to a new domain - best practices to avoid duplicate content?
Hi. We are setting up a new domain to focus on a specific product and want to use some of the content from the original domain on the new site and remove it from the original. The content is appropriate for the new domain and will be irrelevant for the original domain, and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice here to avoid duplicate content and a potential Panda penalty?
Intermediate & Advanced SEO | Citybase
-
How to remove duplicate content which is still indexed but no longer linked to?
Dear community, a bug in the tool we use to create search-engine-friendly URLs (sh404sef) changed our whole URL structure overnight, and we only noticed after Google had already indexed the pages. Now we have a massive duplicate content issue, causing a harsh drop in rankings. Webmaster Tools shows over 1,000 duplicate title tags, so I don't think Google understands what is going on.
Right URL: abc.com/price/sharp-ah-l13-12000-btu.html
Wrong URL: abc.com/item/sharp-l-series-ahl13-12000-btu.html (created by mistake)
After that, we changed all URLs back to the "right URLs" and set up a 301 redirect for all "wrong URLs" a few days later. A massive number of pages is still in the index twice. As we no longer link internally to the "wrong URLs", I am not sure if Google will re-crawl them very soon. What can we do to solve this issue and tell Google that all the "wrong URLs" now redirect to the "right URLs"? Best, David
Intermediate & Advanced SEO | rmvw