Is all duplicate content bad?
-
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover.
CASE 1:
We develop software products. We send out a 500-1,000 word description of each product to various download sites so that they can add it to their product listings. As a result, there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this?
CASE 2:
In the above case, the product description does not match any content on our website. However, there are also several software download sites that copy and paste content from our website as the product description. In this case, the duplicate content does match our website.
How does Google view this? Did Google penalize us for this?
Along with all the download sites, there are also software piracy & crack sites that have the duplicate content.
So, should I try to remove the duplicate content only from the software piracy & crack sites, or also from the genuine download sites?
Does Google reject all kinds of duplicate content, or does it depend on who hosts the duplicate content?
Confused. Please help.
-
It is tricky. As Michael said, it is important to get your content indexed first, which can help identify you as the source; Google doesn't always do a great job of that. Generally, I don't worry too much about Case 1, but in your situation it can be tougher. The problem is that many download sites can have very high authority and could start outranking you for these product descriptions. If that happens, it's unlikely you'd be penalized, but you could be filtered out or knocked down the rankings, which might feel like a penalty.
Here's the thing with Case 1, though: if these download sites are simply outranking you, but they're also distributing your product, is it really so awful? I think you have to look at the trade-off through the lens of your broader business goals.
Case 2 is tougher, since there's not a lot you can do about it, short of DMCA takedowns. You've got to hope Google sorts it out. Again, getting in front of it and getting your content in the index quickly is critical.
If you were hit by Panda, I'd take a hard look at anything on your own site that could be harming you. Are you spinning out variations of your own content? Are you creating potentially duplicate URLs? Are you indexing a ton of paginated content (internal searches, for example)? You may find that the external duplicates are only part of your Panda problem - if you can clean up what you control, you'll be much better off. I have an extensive duplicate content write-up here:
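To make the "clean up what you control" step concrete, here is a minimal sketch of one way to start that kind of audit (my illustration, not the answerer's method or any official tool): fetch a set of your own URLs and flag pages whose visible text is identical. The example.com URLs are hypothetical placeholders, and a real audit would also look for near-duplicates and parameter variants, not just exact matches.

```python
# Minimal duplicate-page audit sketch. Requires the third-party packages
# "requests" and "beautifulsoup4". URLs below are placeholders.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/product-a",
    "https://www.example.com/product-a?ref=nav",   # parameter variant of the same page
    "https://www.example.com/category/page/2/",    # paginated listing page
]

def page_fingerprint(url):
    """Download a page and hash its normalized visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()                            # drop non-visible content
    text = " ".join(soup.get_text().split()).lower()
    return hashlib.md5(text.encode("utf-8")).hexdigest()

# Group URLs by fingerprint; any group with more than one URL is suspect.
groups = defaultdict(list)
for url in urls:
    groups[page_fingerprint(url)].append(url)

for fingerprint, matches in groups.items():
    if len(matches) > 1:
        print("Possible duplicates (consider rel=canonical, noindex, or consolidation):")
        for match in matches:
            print("  ", match)
```

Any group the script flags is a candidate for canonicalization, a noindex, or simply merging the pages, which is exactly the kind of on-site cleanup that matters after Panda.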
-
For all new content, it is important to get indexed fast. If your site is crawled infrequently, another site may get its copy indexed first and, by default, be viewed as the source. So with any new content, I would post on social media as quickly as possible - G+, Twitter, etc. - to get noticed and mark the content as yours. The G+ author attribute will also help.
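As a complement to the social sharing suggested above (my addition, not part of the original answer), some sites also ping search engines with their XML sitemap as soon as new content is published, so crawlers discover it sooner. A rough sketch is below; the sitemap URL is a placeholder, and the ping endpoints shown have historically been supported by Google and Bing but may since have been deprecated, so check current documentation before relying on them.

```python
# Sketch: notify search engines that the sitemap has changed after publishing.
# SITEMAP_URL is a hypothetical placeholder; endpoints may be deprecated.
import urllib.parse
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap.xml"

PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap=",
    "https://www.bing.com/ping?sitemap=",
]

def ping_sitemap(sitemap_url):
    """Send a simple GET to each ping endpoint with the sitemap URL encoded."""
    for endpoint in PING_ENDPOINTS:
        ping_url = endpoint + urllib.parse.quote(sitemap_url, safe="")
        try:
            with urllib.request.urlopen(ping_url, timeout=10) as resp:
                print(ping_url, "->", resp.status)
        except Exception as exc:
            print(ping_url, "failed:", exc)

ping_sitemap(SITEMAP_URL)
```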
-
Hi Gautam,
Good questions! It's really hard to say exactly what Google treats as duplicate content, so this is just my hunch on your issue. In my experience, Google won't 'penalize' you, since you're the owner of the content: you shouldn't become the victim of other people stealing or copying your content, and the same goes when you have provided these sites with the content yourself. After all, you're usually not in charge of the content management on somebody else's site.
Hope this helps a bit!