How to manage duplicate content?
-
I have a real estate site that contains a large amount of duplicate content. The site contains listings that appear both on my client's website and on my competitors' websites (which have better domain authority). It is critical that the content is there, because buyers need to be able to find these listings to make enquiries.
The result is that I have a large number of pages that contain duplicate content in some way, shape or form. My search results pages are really the most important ones, because they are the ones targeting my keywords. I can differentiate these to some degree, but the actual listings themselves are duplicates.
What strategies exist to ensure that I'm not suffering as a result of this content?
Should I:
-
Make the duplicate content noindex. Yes, my results pages will have some degree of duplicate content, but each result only displays a 200-character summary of the advert text, so I'm not sure whether that counts. Would reducing the amount of visible duplicate content improve my rankings as a whole?
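For context, noindexing a page comes down to a single robots meta tag in the page's head (or the equivalent X-Robots-Tag HTTP header); using "follow" keeps the page's outbound links crawlable even while the page itself stays out of the index:

```html
<!-- in the <head> of each duplicate listing page -->
<meta name="robots" content="noindex, follow">
```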
-
Link back to the client's site to indicate that they are the original source.
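If pointing search engines at the client as the original source is acceptable, one option beyond a plain visible link is a cross-domain rel=canonical in the listing page's head, which search engines treat as a strong hint about which copy is primary. The client URL below is made up for illustration:

```html
<!-- on the duplicated listing page; the href is illustrative -->
<link rel="canonical" href="https://www.client-site.example/listings/12345">
```

The trade-off is that canonicalizing to the client's domain effectively concedes the ranking for that listing to them.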
Any suggestions?
-
Some things to check:
If your search results are being indexed, is there only one URL per keyword combination?
Avoid showing the same content on multiple URLs, or restrict the SE bots to just one of them per keyword. E.g., check you don't have a structure like this:
domain.name/search/location=xyz
domain.name/search/location=xyz&keyword=abc
domain.name/location/keyword
The same applies to the detail pages of your real estate listings, i.e. don't let the SEs see content on all of these URLs:
domain.name/location/listing_id
domain.name/keyword/listing_id
domain.name/listing_id
domain.name/listing_slug
Rather than making duplicate content noindex, I would prefer to redirect it to a common URL if possible.
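A minimal sketch of that consolidation, assuming numeric listing IDs; the domain and URL patterns are illustrative. Every duplicate variant is mapped onto one canonical URL, which the server can then 301-redirect to:

```python
import re

# Map the duplicate listing-URL variants onto one canonical URL so each
# variant can be 301-redirected. Domain and patterns are illustrative.
CANONICAL = "https://domain.name/location/{listing_id}"

PATTERNS = [
    re.compile(r"^/location/(?P<listing_id>\d+)$"),
    re.compile(r"^/keyword/(?P<listing_id>\d+)$"),
    re.compile(r"^/(?P<listing_id>\d+)$"),
]

def canonical_url(path: str):
    """Return the canonical URL for a listing path, or None if no match."""
    for pattern in PATTERNS:
        m = pattern.match(path)
        if m:
            return CANONICAL.format(listing_id=m.group("listing_id"))
    return None
```

A web framework's request handler can call this and issue a 301 whenever the canonical URL differs from the requested path.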
To reduce duplicate content issues with client material, add extra info to the pages. E.g. some things to try are:
- adding region/suburb info to search results pages
- pre-parse listings and extract key info on features/facilities/etc., then display that key info in a features box or similar, so that both the HTML and the content differ substantially from the client sites.
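A sketch of that pre-parsing step, assuming plain-text descriptions; the field names and regexes here are illustrative, and a real parser would need to handle far messier input:

```python
import re

# Illustrative patterns for pulling key facts out of a raw listing
# description so they can be rendered in a separate "features" box.
FEATURE_PATTERNS = {
    "bedrooms": re.compile(r"(\d+)\s*bed(?:room)?s?", re.I),
    "bathrooms": re.compile(r"(\d+)\s*bath(?:room)?s?", re.I),
    "parking": re.compile(r"(\d+)\s*(?:car|garage|parking)", re.I),
}

def extract_features(description: str) -> dict:
    """Return whichever features could be found in the description text."""
    features = {}
    for name, pattern in FEATURE_PATTERNS.items():
        m = pattern.search(description)
        if m:
            features[name] = int(m.group(1))
    return features
```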
-
The results pages do have unique meta tags that are dynamically constructed (due to the large number of pages) for on-page SEO, and the results pages are rewritten to static URLs for indexing.
My results pages actually don't do too badly, all things considered, but I'm not sure whether the duplicate content negatively impacts the whole site via some site-wide ratio of unique to duplicate content.
I encourage users to create a unique 200-character summary for the search results, which does help, but with over 10,000 listings I think it may be a challenge to get a copywriter to cover it all. Another downside is taking on new clients who may have a portfolio of hundreds of properties. To get them on board, we either create a crawler to retrieve listings from their site or use an XML document generated by them and distributed to our site and those of our competitors.
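For the XML route, ingestion can be a small parser; the tag names used here (listing, id, summary) are assumptions, since every client feed will differ:

```python
import xml.etree.ElementTree as ET

def parse_listing_feed(xml_text: str) -> list:
    """Parse a hypothetical client feed into a list of listing dicts."""
    root = ET.fromstring(xml_text)
    listings = []
    for node in root.findall("listing"):
        listings.append({
            "id": node.findtext("id", default=""),
            "summary": node.findtext("summary", default=""),
        })
    return listings
```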
I'm hoping we won't be punished for the duplicate content; we just won't rank for it, which is fine. But that's just a guess.
We can write unique content through our copywriter, but would it not be better to create a few paragraphs of unique content for each results page? Granted, it will take a long time to cover all the pages, but the focus would be on improving the ratio of unique to duplicate content.
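If the goal is to track that ratio, one rough way to measure it is to compare word n-gram "shingles" between a page's text and the source listing; a sketch (the 5-word shingle size is an arbitrary choice):

```python
def shingles(text: str, n: int = 5) -> set:
    """Return the set of word n-grams (shingles) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_ratio(page_text: str, source_text: str) -> float:
    """Fraction of the page's shingles that also appear in the source."""
    page = shingles(page_text)
    if not page:
        return 0.0
    return len(page & shingles(source_text)) / len(page)
```

A ratio near 1.0 means the page is almost entirely the client's text; pages with plenty of added unique copy will score lower.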
-
Is there a way you can add some unique content? Perhaps employ a copywriter to write unique blurbs about the listings? Certainly some details will always be the same for you and your competitors, as housing listings will always need to include certain facts.
For your search results pages, have you optimized their titles and meta descriptions? This may also help differentiate your pages from your competitors'.