Is all duplicate content bad?
-
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1:
We develop software products. We send out a 500-1,000 word description of each product to various download sites so that they can add it to their product listings. So there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this reason?
CASE 2:
In the above case, the product description does not match any content on our website. However, there are several software download sites that copy and paste the content from our website as the product description. So in this case, the duplicate content matches our website.
How does Google view this? Did Google penalize us for this reason?
Along with all the download sites, there are also software piracy & crack sites that have the duplicate content.
So, should I remove duplicate content only from the software piracy & crack sites or also from genuine download sites?
Does Google reject all kinds of duplicate content? Or does it depend on who hosts the duplicate content?
Confused. Please help.
-
It is tricky. As Michael said, it is important to get your content indexed first, which can help identify you as the source. Google doesn't always do a great job of that. Generally, I don't worry too much about Case 1, but in your case it can be tougher. The problem is that many download sites can have very high authority and could start outranking you for these product descriptions. If that happens, it's unlikely you'd be penalized, but you could be filtered out or knocked down the rankings, which might feel like a penalty.
Here's the thing with Case 1, though. If these download sites are simply outranking you, but you're distributing your product, is it so awful? I think you have to look at the trade-off through the lens of your broader business goals.
Case 2 is tougher, since there's not a lot you can do about it, short of DMCA takedowns. You've got to hope Google sorts it out. Again, getting in front of it and getting your content in the index quickly is critical.
If you were hit by Panda, I'd take a hard look at anything on your own site that could be harming you. Are you spinning out variations of your own content? Are you creating potentially duplicate URLs? Are you indexing a ton of paginated content (internal search results, for example)? You may find that the external duplicates are only part of your Panda problem - if you can clean up what you control, you'll be much better off. I have an extensive duplicate content write-up here:
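As a rough illustration of that "clean up what you control" step, here's a minimal sketch of how you might flag pages on your own site that serve near-identical body text. The URLs, library choices, and exact-match comparison are my assumptions for the example, not anything prescribed in this thread:

```python
# Minimal sketch (not from the thread): fetch a handful of your own URLs and
# flag any that serve identical body text. The URLs below are placeholders.
import hashlib

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/product/widget",
    "https://www.example.com/product/widget?sessionid=123",  # duplicate-URL culprit
    "https://www.example.com/search?q=widget&page=2",        # paginated internal search
]

seen = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    # Strip markup and collapse whitespace so template noise doesn't hide duplicates.
    text = " ".join(BeautifulSoup(html, "html.parser").get_text().split())
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    if digest in seen:
        print(f"Possible duplicate: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```

An exact hash only catches identical text; for near-duplicates you'd swap in a similarity measure, but even this crude check tends to surface session-ID and pagination duplicates quickly.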
-
For all new content, it is important to get indexed fast. If your site is crawled infrequently, another site may get its copy of your content indexed first and, by default, be viewed as the source. So with any new content I would post on social media as quickly as possible - G+, Twitter, etc. - to get noticed and to mark it as yours. The G+ author attribute will help.
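One way to back up that social push programmatically is to ping the search engines with an updated XML sitemap whenever new content goes live. This is only a sketch: the sitemap URL is a placeholder, and the ping endpoints are the historically documented ones, so verify they're still supported before relying on them:

```python
# Sketch only: notify search engines of an updated sitemap after publishing.
# Replace the sitemap URL with your own; the ping endpoints are the historically
# documented ones and may change, so treat them as an assumption to verify.
from urllib.parse import quote

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap=",
    "https://www.bing.com/ping?sitemap=",
]

for endpoint in PING_ENDPOINTS:
    response = requests.get(endpoint + quote(SITEMAP_URL, safe=""), timeout=10)
    print(f"{endpoint} -> HTTP {response.status_code}")
```

Pinging doesn't replace a crawlable site or the social signals mentioned above; it just shortens the window in which a scraper's copy could be discovered first.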
-
Hi Gautam,
Good questions! It's really hard to say what Google determines to be duplicate content, so this is just my hunch on your issue. In my experience, Google won't "penalize" you: you're the owner of the content, and you shouldn't be punished because other people steal or copy it, or because you have provided these sites with your content yourself - mostly because you're often not in charge of the content management on somebody else's site.
Hope this helps a bit!
Related Questions
-
Duplicate website pages indexed: Ranking dropped. Does Google check the duplicate domain association?
Hi all, Our duplicate website, which is used for testing new optimisations, got indexed and we dropped in rankings. But I am not sure whether this is the exact reason, as it happened earlier too and I didn't see much of a drop in rankings then. I have also had replies in the past saying it won't really impact the original website, only the duplicate one - but I think that rule applies to third-party websites. If our own domain has exact duplicate content, will Google know that we own the duplicate website from other ways we are associated, like IP addresses, servers, etc., and work out that the duplicate website is hosted by us? I wonder how Google treats duplicate content from third-party domains versus our own domains. Thanks
Algorithm Updates | vtmoz
-
Does Google consider the cached content of a page if it's redirected to a new page?
Hi all, If we redirect an old page to a new page, we know that the content relevancy between the source page and the new page matters to Google. I just wonder whether Google also looks at the content relevancy between the old page (from its cache) and the new page. Thanks
Algorithm Updates | vtmoz
-
Content strategy for landing pages: Topics vs Features
Hi all, We are going to create new landing pages and optimise existing pages. We are confused about how to employ content on these pages: should they be filled with content to rank for "topics" and "keywords", or jump directly into the features we are providing? If we go with the first, users may find being taught about the topic boring; if we go with the latter, it's hard to rank because there is no related content for that topic. I have seen some websites employ multiple landing pages filled with topic-related content that then link to feature pages. I need suggestions here. Thank you
Algorithm Updates | vtmoz
-
How important is fresh content?
Let's say the website you are working on has covered most of the important topics on your subject. How important is it that you continue to add content when there really may not be much left that is relevant to your users? Can a site continue to rank well if nothing new is added to it for a year but it continues to get good-quality links?
Algorithm Updates | DemiGR
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, but after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, but with little success. I have read up on this and not many people appear to agree on whether it will work. Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, to remove all legacy and all association with the spam-ridden .com. My main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first assume. This could then cause a duplicate content issue, given that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as noindex, nofollow tags, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content. So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with removing a 301 redirect, detaching legacy, and its success would be very helpful! Thank you, Denver
Algorithm Updates | ProdoDigital
-
Shared Hosting - Bad for SEO? (e.g. GoDaddy)
There were a lot of questions and data on this a few years back and nothing terribly recent, so I wanted to get the discussion going again and see if any new data has been published. Is hosting your website on a shared host like GoDaddy or Network Solutions going to hurt your rankings, because there is a chance that you could be on the same IP as spammy websites? My gut feeling is no, primarily because almost 90% of the worldwide web is on shared hosting, but I do not have a lot of data to back it up. I'd love to hear some feedback. Cheers - Kyle
Algorithm Updates | kchandler
-
Ecommerce good/bad? Showing product description on sub/category page?
Hi Mozers, I have an ecommerce furniture website, and I have been wondering for some time if showing the product descriptions on the sub/category pages helps the website. If there is more content displayed on the subcategory page, it should be more relevant, right? Or does it not matter, as it is duplicate content from the product page? I think showing the product descriptions on non-product pages is hurting my design/flow, but I worry that if I hide product content on sub/category pages my traffic will be hurt. Despite my searches I have not found an answer yet. Please take a look at my site and share your thoughts: http://www.ecustomfinishes.com/ Chris
Algorithm Updates | longdenc_gmail.com
-
Duplicate videos
We have multiple screencast videos being made that are the same except for the city names being switched out. Does anyone know if this might be frowned upon by Google in the same way that duplicate content is?
Algorithm Updates | Stevej24