What are some of the negative effects of having duplicate content from other sites?
This could include republishing several articles from another site with permission.
I republish article content from other websites. This is always done with permission and often the other sites ask me to do it because they have a message that they want to place in front of my visitors. I value this content because my visitors consume it.
Unfortunately, these pages do not perform well in search even though I optimize them so that they do not directly compete for the same keywords as the originals. Why? There are usually several other sites running the same content, and they compete with each other in the SERPs and split the search traffic.
I incur labor costs in posting the content and acquiring nice images to make it look good on my site. Sometimes those costs are not fully recovered because of the pages' reduced traffic. Also, link juice flows into these pages that might be more valuable if sent to other pages.
I view these as costs to serve my visitors and the industry niche of my site.
One other risk is the duplicate content itself. Updates like Panda could reduce the rankings of sites that contain lots of duplicate content, although I have not seen that yet in my niche. The quality of the content is some of the highest on the web, and the sites that republish it are almost all authoritative.
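When republishing with permission, a common mitigation is to ask the original publisher to agree to a cross-domain rel=canonical pointing back at the source article, which consolidates ranking signals on the original instead of splitting them across the republished copies. A minimal sketch of generating such a tag — the helper name and URL are illustrative, not part of the answer above:

```python
from xml.sax.saxutils import quoteattr

def build_canonical_tag(original_url: str) -> str:
    """Build a cross-domain rel=canonical link element pointing at the
    original article, so the republished copy defers to the source."""
    return f'<link rel="canonical" href={quoteattr(original_url)} />'

# The republishing site would place this in the <head> of its copy.
tag = build_canonical_tag("https://original-site.example/article")
print(tag)
```

Whether cross-domain canonicals make sense here depends on the goal: if the republisher still wants its copies to rank independently, this tag would work against that.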
Related Questions
Who gets punished for duplicate content?
What happens if two domains have duplicate content? Do both domains get punished for it, or just one? If so, which one?
Technical SEO | Tobii-Dynavox
Duplicate Content Mystery
Hi Moz community! I have an ongoing duplicate content mystery here, and I'm hoping someone can answer my question. We have an ecommerce site that has a variety of product pages and category pages. There are rel canonicals in place, along with parameters in GWT, and there are also URL rewrites. Here are some scenarios; maybe you can give insight as to what exactly is going on and how to fix it. All the duplicates look to be coming from category pages specifically. For example:
Technical SEO | Ecom-Team-Access
This link rewrites from: http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html?cat=407&color=152&price=20- to: http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html The rel canonical tag looks like this: <link rel="canonical" href="http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html" /> The CONTENT is different, but the URLs are the same. It thinks that the product category view is the same as the all-products view, even though there is a canonical in there telling it which one is the original. Some of them don't have anything to do with each other. Take a look: Link identified as duplicate: http://www.incipio.com/cases/smartphone-cases/htc-smartphone-cases/htc-windows-phone-8x-cases.html?color=27&price=20- Link this is a duplicate of: http://www.incipio.com/cases/macbook-cases/macbook-pro-13in-cases.html Any idea as to what could be happening here?
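One way to sanity-check a setup like this is to compute, for each parameterized URL, what its canonical form should be by stripping the faceted-filter parameters (cat, color, price in the example URLs above) and confirming the on-page canonical tag matches. A sketch of that normalization, assuming the filter-parameter names are known in advance:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Faceted-navigation filter parameters seen in the example URLs above.
FILTER_PARAMS = {"cat", "color", "price"}

def canonical_url(url: str) -> str:
    """Drop faceted-filter query parameters so every filtered view of a
    category maps back to the same canonical category URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url(
    "http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html"
    "?cat=407&color=152&price=20-"
))
```

Comparing this computed canonical against what the crawler reports as the "duplicate of" URL would show whether the rewrites or the canonical tags are the ones misfiring.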
Fullsite=true coming up as duplicate content?
Hello, I am new to the fullsite=true method of linking from a mobile site to its desktop version, and have recently found that about 50 of the instances in which I added fullsite=true to links from our blog show as duplicates of the pages they point to. Could someone tell me why this would be? Do I need to add some sort of rel=canonical to the main page (non-fullsite=true), or how should I approach this? Thanks in advance for your help! L
Technical SEO | lfrazer
Headers & Footers Count As Duplicate Content
I've read a lot of information about duplicate content across web pages and was interested in finding out about how that affected the header and footer of a website. A lot of my pages have a good amount of content, but there are some shorter articles on my website. Since my website has a header, footer, and sidebar that are static, could that hurt my ranking? My only concern is that sometimes there's more content in the header/footer/sidebar than the article itself since I have an extensive amount of navigation. Is there a way to define to Google what the header and footer is so that they don't consider it to be duplicate content?
Technical SEO | CyberAlien
Is this duplicate content when there is a link back to the original content?
Hello, My question is: Is it duplicate content when there is a link back to the original content? For example, here is the original page: http://www.saugstrup.org/en-ny-content-marketing-case-infografik/. But that same content can be found here: http://www.kommunikationsforum.dk/anders-saugstrup/blog/en-ny-content-marketing-case-til-dig, but there is a link back to the original content. Is it still duplicate content? Thanks in advance.
Technical SEO | JoLindahl91
What to do about similar content getting penalized as duplicate?
We have hundreds of pages that are getting flagged as duplicate content because they are so similar. However, they are different content. The background is that the pages are names, and when you click on each name it has its own URL. What should we do? We can't canonicalize any of the pages because they are different names. Thank you!
Technical SEO | bonnierSEO
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context, which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight verse 5 and verses 5-10 respectively. This is a good user experience because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another Bible site has separate HTML pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, realizing that we can't revert to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO
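The sitemap described above — one URL per chapter (and potentially per verse) — can be sketched as follows. This is a minimal illustration, not the site's actual build process; the base URL pattern is taken from the example links above, and the chapter counts are supplied by the caller:

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(base: str, books: dict) -> str:
    """books: mapping of book slug -> number of chapters.
    Emits one <url> entry per chapter; verse-level URLs (e.g.
    .../james/1.5) could be appended the same way if each verse
    is given its own unique title and meta description."""
    urlset = ET.Element("urlset", xmlns=NS)
    for book, chapters in books.items():
        for ch in range(1, chapters + 1):
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = f"{base}/{book}/{ch}"
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap("http://lds.org/scriptures/nt", {"james": 5})
print(xml)
```

Note the sitemap only helps discovery; it does not tell Google which of several highlighting variants of a chapter URL is canonical — a rel=canonical on the variant URLs pointing at the plain chapter URL would handle that part.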
Up to my you-know-what in duplicate content
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3 and 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google, and the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google. The dupe content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Get rid of the subdomain version and sometimes link between two obviously related sites, or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword on the front of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages? Thanks.
Technical SEO | Hondaspeder
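Whichever version is kept in a www vs. non-www split like the one above, the usual first step is a site-wide 301 from every alias host to a single canonical host. A sketch of the mapping logic, assuming www is chosen as canonical — the host names here are hypothetical placeholders, not from the question:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example-forum.com"   # hypothetical: pick one version and stick to it
ALIASES = {"example-forum.com", "forum.other-domain.example"}  # non-www and subdomain variants

def redirect_target(url: str):
    """Return the canonical-host URL to 301-redirect to, or None if
    the request is already on the canonical host (no redirect needed)."""
    parts = urlsplit(url)
    if parts.netloc not in ALIASES:
        return None
    return urlunsplit(("http", CANONICAL_HOST, parts.path, parts.query, parts.fragment))

print(redirect_target("http://example-forum.com/threads/123?page=2"))
```

In practice this lives in the web server config (e.g. a rewrite rule) rather than application code, but the mapping is the same: preserve path and query, swap only the host, and answer with a 301 so the indexed alias pages consolidate onto one version over time.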