SEO effect of content duplication across hub of sites
-
Hello,
I have a question about a website I have been asked to work on. It is for a real estate company that is part of a larger company. Along with several other (rival) companies, it has a website of property listings that receives a feed of properties from a central hub site - so there is lots of potential for page, title, and meta content duplication (if it isn't already occurring) across the whole network of sites.
In my early investigation, I don't see any of these sites ranking well at all in Google for the expected search phrases. Before I start working on things that might improve their rankings, I wanted to ask you a few questions:
1. How would such duplication (if it is occurring) affect the SEO rankings of these sites individually, or the whole network/hub collectively?
2. Is it possible to tell whether such a site has been "burnt" for SEO purposes, and in particular whether any damage stems from the duplication?
3. If such a site or the network has been totally burnt, are there any remedies that can significantly improve the site's SEO rankings, or is the only/best option to start again from scratch with a brand new site, ensuring unique content and new meta descriptions?
Thanks in advance,
Graham
-
Duplicate content doesn't tend to burn a website unless there is aggressive scraping going on along with other black-hat signals. It sounds like the bigger question you're asking is how the site can be made to have unique content when it, along with many others, is pulling the same MLS content. This was asked about a year ago here: http://moz.com/community/q/real-estate-mls-listings-does-google-consider-duplicate-content, and the general consensus remains the same: find a way to make your content unique.
-
Hi Graham,
Here are a few insights that hopefully help you out:
-
The site could be penalized individually or as a hub depending on the severity of the duplicate content.
-
Use a tool like the Panguin Tool from Barracuda Digital to check for a penalty:
http://www.barracuda-digital.co.uk/panguin-tool/
-
Real estate is notorious for duplicate content. If you are doing SEO for a real estate website, your first move should be to restrict search engines' access to the duplicated listings, then identify other types of content you can use to rank. Also, remove as much boilerplate as possible and write something unique, at least a couple of sentences, about each of your listings. Add unique content to pages where duplicate content exists, and add entirely new pages and frequent blog posts rich in original content.
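One quick sanity check is to measure how close each page's listing text is to the raw feed before and after rewriting. A minimal sketch using only Python's standard library (the sample strings below are hypothetical):

```python
# A rough check of how close a rewritten listing is to the raw feed text.
# The sample strings are hypothetical examples.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

feed = "Charming 3-bed home with garden and garage in a quiet street."
rewrite = "A charming three-bedroom house on a quiet street, with a garden and a garage."

print(similarity(feed, feed))     # identical text scores 1.0
print(similarity(feed, rewrite))  # well below 1.0 means meaningfully reworded
```

Anything scoring near 1.0 against the feed is effectively a duplicate; the lower the ratio across the network's pages, the more unique the content.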
-
Related Questions
-
How to solve this issue and avoid duplicated content?
My marketing team would like to serve up 3 pages of similar content: www.example.com/one, www.example.com/two and www.example.com/three; however, the challenge is that they'd like to have only one page with three different titles and images based on the user's entry point (one, two, or three). To avoid duplicated pages, how would you suggest this best be handled?
Intermediate & Advanced SEO | JoelHer0
-
Is SEO as Effective on AJAX Sites?
Hey Everyone, I had a potential client contact me about doing SEO for their site, and I see that it's an AJAX site where all the content is rendered dynamically. I've been doing SEO for years but never had a client with an AJAX site. I did a little research and see how you can set up alternative pages (or snapshots, as Google calls them) with the actual content so the pages are crawlable and will get indexed, but I'm wondering if that is as effective as optimizing static HTML pages, or if Google treats AJAX page alternatives as less trustworthy/valuable. Also, does having the site in AJAX affect link building and social sharing? With the link structure, it seems there could be some issues with pointing links and passing link juice to internal pages. Thanks! Kurt
Intermediate & Advanced SEO | Kurt_Steinbrueck1
-
Partial duplicate content and canonical tags
Hi - I am rebuilding a consumer website, and each product page will contain a unique product image and a sentence or two about the product (and we tend to use a lot of the same words in different ways across products). I'd like to have a tabbed area below the product info that talks about the overall product line, and this content would be duplicated across all the product pages (a "Why use our products" type of thing). I'd also have this duplicate content living on its own URLs so it can be found alone in the SERPs. Question is, do I need to add the canonical tag to this page, since there's partial duplicate content on the product pages? And if I did that, would my product pages go unindexed? I understand how to handle completely duplicated content; it's the partial duplicate that I'm having difficulty figuring out.
Intermediate & Advanced SEO | Jenny10
-
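For reference, a canonical hint is just a link element in the head of the duplicating page pointing at the preferred URL - a minimal sketch with hypothetical URLs:

```html
<!-- In the <head> of a page that duplicates the "Why use our products" content -->
<link rel="canonical" href="http://www.example.com/why-use-our-products" />
```

Note that rel=canonical is a page-level signal: it tells search engines the whole page is a duplicate of the target, so it isn't a good fit for pages that only partially overlap - a canonicalized product page may indeed drop out of the index in favour of the target.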
Site revamp for neglected site - modifying site structure, URLs and content - is there an optimal approach?
A site I'm involved with, www.organicguide.com, was at one stage (long ago) performing reasonably well in the search engines. It was ranking highly for several keywords. The site has been neglected for some considerable period of time. A new group of people are interested in revamping the site, updating content, removing some of the existing content, and generally refreshing the site entirely. In order to go forward with the site, significant changes need to be made. This will likely involve moving the entire site across to wordpress. The directory software (edirectory.com) currently being used has not been designed with SEO in mind and as a result numerous similar pages of directory listings (all with similar titles and descriptions) are in google's results, albeit with very weak PA. After reading many of the articles/blog posts here I realize that a significant revamp and some serious SEO work is needed. So, I've joined this community to learn from those more experienced. Apart from doing 301 redirects for pages that we need to retain, is there any optimal way of removing/repairing the current URL structure as the site gets updated? Also, is it better to make changes all at once or is an iterative approach preferred? Many thanks in advance for any responses/advice offered. Cheers MacRobbo
Intermediate & Advanced SEO | macrobbo0
-
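For the 301s, a common approach on an Apache host is pattern-based redirects in .htaccess - a sketch assuming hypothetical old edirectory-style URLs and new WordPress permalinks (adapt the patterns to the real URL structures):

```apache
# Map an old listing URL pattern to the new WordPress structure
# (the paths here are hypothetical examples)
RedirectMatch 301 ^/listing/([0-9]+)/.*$ https://www.organicguide.com/directory/$1/

# One-off redirects for individual pages worth keeping
Redirect 301 /oldpage.php https://www.organicguide.com/new-page/
```

Doing the redirects in one batch at launch, rather than iteratively, keeps crawlers from indexing a half-migrated structure.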
How can we improve the seo on our site?
Hello everyone. I have been reading through this site for a while and tried to put together everything I have learned so far. Would any of you mind looking at our site and providing pointers on areas we can still improve, or anything I completely missed? I appreciate any feedback you can give! Our site is faithology.com. Thanks again! Brandon
Intermediate & Advanced SEO | BMPIRE0
-
Duplicate content mess
One website I'm working with keeps a HTML archive of content from various magazines they publish. Some articles were repeated across different magazines, sometimes up to 5 times. These articles were also used as content elsewhere on the same website, resulting in up to 10 duplicates of the same article on one website. With regards to the 5 that are duplicates but not contained in the magazine, I can delete (resulting in 404) all but the highest value of each (most don't have any external links). There are hundreds of occurrences of this and it seems unfeasible to 301 or noindex them. After seeing how their system works I can canonical the remaining duplicate that isn't contained in the magazine to the corresponding original magazine version - but I can't canonical any of the other versions in the magazines to the original. I can't delete the other duplicates as they're part of the content of a particular issue of a magazine. The best thing I can think of doing is adding a link in the magazine duplicates to the original article, something along the lines of "This article originally appeared in...", though I get the impression the client wouldn't want to reveal that they used to share so much content across different magazines. The duplicate pages across the different magazines do differ slightly as a result of the different Contents menu for each magazine. Do you think it's a case of what I'm doing will be better than how it was, or is there something further I can do? Is adding the links enough? Thanks. 🙂
Intermediate & Advanced SEO | Alex-Harford0
-
Do onsite content updates have an effect on SERPs?
Hi, Some might see this as a very (VERY) basic question but I wanted to drill down into it anyway. Onsite content: let's say you have a service website with a blog attached, and the blog gets updated every other day with 500 words of relevant content containing anchor text links back to a relevant page on the main website. Forgetting about social signals and natural links built from the quality content, will adding the content with anchor text links be more beneficial than using that content to generate links through guest blogging? 10 relevant articles onsite with anchor links, or 10 guest posts on other websites? I guess some might say 5 onsite and 5 guest posts.
Intermediate & Advanced SEO | activitysuper0
-
HTTPS Duplicate Content?
I just received an error notification because our website is available at both http and https: http://www.quicklearn.com & https://www.quicklearn.com. My tech tells me this isn't actually a problem - is that true? If not, how can I address the duplicate content issue?
Intermediate & Advanced SEO | QuickLearnTraining0
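The usual remedy for http/https duplication is a site-wide 301 from one scheme to the other. A minimal sketch for an Apache host (assuming mod_rewrite is enabled; other servers have equivalents):

```apache
# .htaccess: force every http request over to https with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.quicklearn.com/$1 [R=301,L]
```

This consolidates both versions onto one URL per page, so search engines no longer see two copies of the site.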