Handling of Duplicate Content
-
I just recently signed up for the moz.com system.
The initial report for our web site shows that we have lots of duplicate content. The web site is real estate based and we load IDX listings from other brokerages into our site.
Even though these listings look alike, they are not the same. Each has its own photos, description and address. So why do they appear as duplicates? I would assume it is because they are all too closely related. They are primarily lots for sale, and it looks like lazy agents with 4 or 5 lots have entered the same description for each one.
Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system.
How should one manage duplicate content like this? Or should we ignore it?
Out of 1,500+ listings on our web site, it shows that 40 of them are duplicates.
-
Obviously Dirk is right, but again, you will lose the opportunity to rank in search engines for the related key phrases, and if you have worked in the real estate industry before, you will have an idea of how difficult it is to rank and of the advantages of ranking for that particular term.
In my opinion, on-page duplication kicks in when a page is 60 to 70% identical to another page on the website, and that is exactly what is happening in your case. I agree that you cannot change the descriptions, but you can add a section to the page that explains more about the property: a custom box where you can include your own written content (see the sketch at the end of this post).
I agree it's a lot of work at your end, but at the end of the day you will get the chance to rank well for those important key phrases that can bring you a great amount of conversions.
Just a thought!
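To make the custom box idea concrete, here is a minimal sketch, assuming a Python step in the page build; the field names (mls_number, description) and the sample blurb are hypothetical, not part of any real IDX feed:

```python
# Merge the untouched IDX description with a hand-written blurb per listing,
# so each page carries some unique copy alongside the shared feed content.
# Field names and sample data are made up for illustration.

custom_blurbs = {
    "MLS123456": (
        "This corner lot backs onto the creek and is a short walk from the "
        "new elementary school - ideal for a first build."
    ),
}

def build_listing_copy(listing: dict) -> str:
    """Return the page copy: IDX description first, custom section after."""
    parts = [listing["description"]]  # IDX text, left exactly as supplied
    blurb = custom_blurbs.get(listing["mls_number"])
    if blurb:
        parts.append(blurb)           # the 'custom box' content
    return "\n\n".join(parts)

print(build_listing_copy({
    "mls_number": "MLS123456",
    "description": "Lot for sale. Great opportunity.",
}))
```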
-
Nice idea - I have already started this. I just now have to include it for each listing. Thanks!!
-
You could point a canonical to the original source (in fact, that is the way Google prefers it). It's a great solution if it's you who is syndicating the content. However, if you do that, you lose any opportunity to get ranked on that content. A quick sketch of what the tag could look like is below.
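As an illustration only (the REALTOR.ca URL pattern here is made up; use whatever the original listing URL actually is in your feed), a tiny Python sketch that emits the cross-domain canonical element for a syndicated listing page:

```python
# Emit a <link rel="canonical"> element pointing at the original listing.
# The example URL is hypothetical.
import html

def canonical_tag(original_url: str) -> str:
    """Return the canonical link element to place in the page <head>."""
    return f'<link rel="canonical" href="{html.escape(original_url, quote=True)}" />'

print(canonical_tag("https://www.realtor.ca/real-estate/12345678/example-listing"))
# -> <link rel="canonical" href="https://www.realtor.ca/real-estate/12345678/example-listing" />
```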
Google's view (source: https://support.google.com/webmasters/answer/66359?hl=en):
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results."
The big problem with duplicate content across different domains is that it's up to Google to decide which site is going to be displayed. This could be the site that is syndicating the content, but it could also be the site with the highest authority.
In your case, if possible, I would try to enrich the content you syndicate with content from other sources. Examples could be interesting stats on the neighbourhood, such as average age, income, nearby schools, number of houses sold and average price, or other types of content that might interest potential buyers. This way your content becomes more unique and probably more interesting (and engaging) for your visitors (and for Google).
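A minimal sketch of that enrichment idea, assuming the stats come from your own data sources; the neighbourhood name, numbers and field names below are invented for illustration:

```python
# Append neighbourhood stats to a syndicated listing so the page is no longer
# just the shared IDX description. All data here is hypothetical; in practice
# it could come from census data, school boards, or your own sold-price records.

NEIGHBOURHOOD_STATS = {
    "Riverside": {
        "avg_age": 41,
        "median_income": 72000,
        "nearby_schools": 3,
        "homes_sold_last_year": 57,
        "avg_sale_price": 312000,
    },
}

def enriched_copy(listing: dict) -> str:
    description = listing["description"]  # IDX text, unchanged
    stats = NEIGHBOURHOOD_STATS.get(listing["neighbourhood"])
    if not stats:
        return description
    extra = (
        f"About {listing['neighbourhood']}: average resident age {stats['avg_age']}, "
        f"median household income ${stats['median_income']:,}, "
        f"{stats['nearby_schools']} schools nearby, "
        f"{stats['homes_sold_last_year']} homes sold last year at an average of "
        f"${stats['avg_sale_price']:,}."
    )
    return f"{description}\n\n{extra}"

print(enriched_copy({
    "neighbourhood": "Riverside",
    "description": "Lot for sale. Great opportunity.",
}))
```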
Hope this helps,
Dirk
-
Pretty much everyone has the same feed. Would it be wise to include the original source? Since we are getting the data from REALTOR.ca, should we point the canonical to where the listing comes from? I am new to this stuff, so I am hoping that I am getting this right.
Thanks T
-
Hi,
This is a question that gets asked quite often on Moz Q&A. Pages that have a big chunk of source code in common are sometimes considered duplicates, even if the content is quite different. Moz recently published a post on its dev blog about how it identifies duplicates (it's quite technical, but still interesting to read: https://moz.com/devblog/near-duplicate-detection/).
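To give a feel for why pages with a shared description score as duplicates, here is a deliberately simplified shingle-and-Jaccard sketch; it is not Moz's actual implementation, just an illustration of the general near-duplicate idea the post describes:

```python
# Compare two pages by the overlapping word n-grams ("shingles") they share.
# Sample texts are invented; real detection (e.g. Moz's) is more sophisticated.

def shingles(text: str, size: int = 4) -> set:
    """Return the set of overlapping word n-grams in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "123 Main St. Beautiful building lot, close to schools, services and shopping."
page_b = "456 Oak Ave. Beautiful building lot, close to schools, services and shopping."
print(round(jaccard(page_a, page_b), 2))
# -> 0.5: half the shingles are shared, even though the addresses differ
```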
If only the address and image are different but the description is identical, the page will probably be considered a duplicate by the Moz bot. If it's only 40 out of 1,500 listings, I wouldn't worry too much about it, especially because you are unable to change the content anyway.
I would be more worried if other real estate companies used the same feed and hence provided exactly the same content on their sites - not only the 40 you mention, but the full listing.
rgds
Dirk
-
Related Questions
-
Best way to fix duplicate content issues
Another question for the Moz Community. One of my clients has 4.5k duplicate content issues. For example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues are coming from product pages. My initial thoughts are to set up 301 redirects in the first instance and if the issue persists, add canonical tags. Is this the best way of tackling this issue?
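A hedged sketch of that two-step approach (the site in the question appears to run on PHP, so this Python/Flask example only illustrates the idea; the route handling is hypothetical): 301 the parameterised URL to the clean one, and let the clean page carry a self-referencing canonical as a safety net.

```python
# Illustration only: permanent-redirect the parameterised URL to the clean one,
# and serve a self-referencing canonical on the clean page. Routes are made up.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/index.php")
def legacy_router():
    # e.g. /index.php?route=blog/blog/listblog&year=2017  ->  /blog
    if request.args.get("route", "").startswith("blog"):
        return redirect("/blog", code=301)  # 301 consolidates the duplicate URLs
    return redirect("/", code=301)

@app.route("/blog")
def blog():
    canonical = '<link rel="canonical" href="http://www.example.co.uk/blog" />'
    return f"<head>{canonical}</head><body>Blog listing</body>"
```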
Technical SEO | Laura-EMC
-
Duplicate Page Content
Hello, After crawling our site Moz is detecting high-priority duplicate page content for our product and article listing pages. For example, http://store.bmiresearch.com/bangladesh/power and http://store.bmiresearch.com/newzealand/power are being listed as duplicate pages although they have separate URLs, page titles and H1 tags. They have the same product listed, but I would have thought the differentiation in other areas would be sufficient for these not to be deemed duplicate pages. Is it likely this issue will be impacting our search rankings? If so, are there any recommendations as to how this issue can be overcome? Thanks
Technical SEO | carlsutherland
-
Duplicate content on charity website
Hi Mozers, We are working on a website for a UK charity – they are a hospice and have two distinct brands, one for their adult services and another for their children’s services. They currently have two different websites which have a large number of pages that contain identical text. We spoke with them and agreed that it would be better to combine the websites under one URL – that way a number of the duplicate pages could be reduced, as they are relevant to both brands. What seemed like a good idea initially is beginning to not look so good now. We had planned to use CSS to load different style sheets for each brand – depending on the referring URL (adult / child), the page would display the appropriate branding. This will work well up to a point. What we can’t work out is how to style the page if it is the initial landing page – the brands are quite different and we need to get this right. It is not such an issue for the management-type pages (board of trustees etc.) as they govern both identities. The issue is the donation and fundraising pages – they need to be found, and we are concerned that users will be confused if one of those pages is the initial landing page and they are served the wrong brand. We have thought of making one page the main page and using rel canonical on the other one, but that will affect its ability to be found in the search engines. Really not sure what the best way to move forward would be; any suggestions / guidance would be much appreciated. Thanks, Fraser
Technical SEO | fraserhannah
-
Tags, Categories, & Duplicate Content
Looking for some advice on a duplicate content issue that we're having, which definitely isn't unique to us. We are allowing all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is detecting all of that as duplicate content, which is obvious since it is the same content that is on our blog posts. We decided in the past to keep these pages the way they are, as it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz. We are wondering if we should noindex these pages and whether that could cause a positive change, but we're worried it might cause a big negative change as well. Have you confronted this issue? What did you decide and what were the results? Thanks in advance!
Technical SEO | bradhodson
-
Duplicate content / title caused by CAPITALS
What is the best way to stop duplicate content warnings (and Google classing pages as duplicate content) when they are caused by CAPITALS (i.e. www.domain.com/Directory and www.domain.com/directory)? I try to always use lower case (unless it is a place name, in which case I capitalise the first letter), but it looks like I have slipped up and got some mixed up, and other sites will also be linking to the capitalised versions. Thanks Jon
Technical SEO | jonny512379
-
Duplicate content
I have just run a report in SEOmoz on my domain and have noticed that there are duplicate content issues. The issues are: www.domainname/directory-name/ and www.domainname/directory-name/index.php. All my internal links and external links point to the first URL, as I prefer this style because it looks clear and concise; however, doing this has created duplicate content, as within the site itself there is an index.php page inside /directory-name/ to show the page. Could anyone give me some advice on what I should do please? Kind Regards
Technical SEO | Paul78
-
How to resolve duplicate content and title errors?
Hello, I'm new to this resource and to SEO. I have taken the time to read other posts but am not entirely sure about the best way to resolve the issues I am experiencing, so I am hoping for a helpful hand. My site diagnostics advise me that most of my errors relate to duplicate content and duplicate page titles. Most of these errors seem to relate to our ecommerce product pages. A little about us first: we manufacture our own line of unique products and retail them over the internet, and they can only be purchased through our website, so it's not so important to make our product pages stand out from competitors'. An example of one of our product pages can be found here: http://www.nabru.co.uk/product/Sui+2X2+Corner+Sofa In terms of SEO we are focusing on improving the rankings of our category pages, which compete much more with our competitors, but we would also like our product pages to be found via a Google search by those potential customers who are at the late stage of a buying cycle. So my question: whilst it would be good to add more content to the product pages (user reviews, individual product descriptions, etc. - and we have good intentions to do this over time, which unfortunately is limited), is there an easy way to fix the duplicate content issues, ensure our products can be found and ensure that the main focus is on our category pages? Many thanks.
Technical SEO | jannkuzel
-
Canonical Link for Duplicate Content
A client of ours uses some unique keyword tracking for their landing pages, where they append certain metrics in a query string and pull that information out dynamically to learn more about their traffic (similar to Google's UTM tracking). Nonetheless, these query strings are now being indexed as separate pages in Google and Yahoo and are being flagged as duplicate content/title tags by the SEOmoz tools. For example:
Base Page: www.domain.com/page.html
Tracking: www.domain.com/page.html?keyword=keyword#source=source
Now both of these are being indexed even though it is only one page. So I suggested placing a canonical link tag in the header pointing back to the base page to start discrediting the tracking URLs. But this means that the base pages will be pointing to themselves as well - would that be an issue? Is there a better way to solve this issue without removing the query tracking altogether? Thanks - Kyle Chandler
Technical SEO | kchandler
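A hedged sketch of the approach described in this question (the URLs mirror the hypothetical domain.com examples above): every tracked variant canonicalises to the clean base URL, and the base page pointing at itself is generally considered fine, as that is just a self-referencing canonical.

```python
# Strip query string and fragment so tracked URLs canonicalise to the base page.
# Example URLs are the hypothetical ones from the question.
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url: str) -> str:
    """Return the canonical link element for any variant of the page URL."""
    parts = urlsplit(url)
    base = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{base}" />'

print(canonical_for("http://www.domain.com/page.html?keyword=keyword#source=source"))
print(canonical_for("http://www.domain.com/page.html"))  # base page points to itself, which is fine
```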