How to check for duplicate content with other websites?
-
Hello,
I suspect that my website may have duplicate content with other websites. Is this an important factor in SEO? And how can I check for and fix it?
Thanks,
-
If you want to check who "copied" your content, you can use Copyscape, as others here have suggested.
Or, you can use Google itself.
Pro Tip:
- set Google Search to show you 100 results per page (a quick sketch of the resulting search URL follows this list);
- tell Google to also show the results it may have filtered out for being "substantially identical" to the ones it is already showing you;
- use the Scraper extension for Chrome to scrape the Google results and export them to Google Docs, so you can start analyzing the sites that are scraping your content;
- if the content you write is copyrighted, you can ask Google to deindex the sites scraping it in order to defend your rights as the original author.
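As a rough illustration of the first two points above, here is a minimal Python sketch that builds that kind of search URL - an exact-match query for a sentence from your site, excluding your own domain, with 100 results per page and the "omitted results" filter switched off. The phrase and domain are hypothetical placeholders:

```python
# Minimal sketch: build a Google search URL for hunting down copies of your content.
# "phrase" and "your_domain" are hypothetical placeholders - swap in a distinctive
# sentence from one of your own articles and your real domain.
from urllib.parse import urlencode

phrase = "a distinctive sentence taken from one of your articles"
your_domain = "example.com"

params = {
    "q": f'"{phrase}" -site:{your_domain}',  # exact match, excluding your own site
    "num": 100,    # ask for 100 results per page
    "filter": 0,   # include results Google would normally omit as near-duplicates
}

print("https://www.google.com/search?" + urlencode(params))
```

Open the printed URL in Chrome, and the Scraper extension can then export the result set to Google Docs as described above.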
-
"If Google thinks that your content is a copy from another site, the copied page will be penalised by the Panda algorithm."
Sorry to disagree with you: if "copied" content were a problem in itself, then sites like Techmeme would be out of the index.
The problem with publishing syndicated content is not the act of republishing it, but the value you add (or don't add) while republishing another site's content. For instance, if you apply classic content curation practices, such as commenting inline or before or after the "copied" content, or if you publish it and open a discussion that generates UGC, then that copied content is not a problem.
Be aware that I am talking about content republished with the permission of the original author/publisher of the content itself.
Scraped content, which adds no value, is another matter. In that case the scrapers are seriously at risk of Panda or, simply, of being filtered out of the visible index.
Similarly, duplicated content can be a risk when it comes to product descriptions on an eCommerce or classifieds site. That content - again - can seriously lead you to a Panda penalization. That's why it is always better to rewrite the standard product descriptions, or to add more unique content that adds value, such as the site's own review of the product, users' reviews, and so on.
-
Hi,
I will have to disagree with Natan - duplicate content is not as big a deal as a lot of people make it out to be.
There is no such thing as a duplicate content penalty or de-indexation of a site based on duplicate content - it was never the case and it never will be.
I am not saying you don't have to deal with it - you do - you should - but only when appropriate.
As far as Panda is concerned, it is a ranking algorithm - you could even call it a filter - but not a penalty, and it is based on your market and competition. Yes, with low authority and strong competition providing more or less the same information, you can get caught by this Panda filter, but it's far more nuanced than that - it's not ones and zeros, black and white.
To see how "unique" your content is, and which other sites on the web hold the same content or parts of it, you can use Copyscape - as Natan mentioned - but for the rest, sorry Nate, the advice is just not right.
Cheers.
-
Hello,
Duplicate content is a key factor in SEO. If Google thinks that your content is a copy from another site, the copied page will be penalised by the Panda algorithm.
If someone copies your content and gets indexed earlier than you, then your page can rank lower than the thief's.
To help prevent that, share the content immediately on Google Plus and on other social media and social bookmarking sites.
If Google thinks that all of your content is copied - not just one page - your entire site could suffer a penalty, or even be removed from the index.
If you think that your articles are being stolen, or that you bought articles and the writer is handing you copies taken from somewhere else, you can check that with copyscape.com.
I hope this is useful and easy to understand!
Related Questions
-
Can I robots.txt an entire site to get rid of duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content, serving two separate user groups for the same product (B2B and B2C). Zendesk does not give me the option to change canonicals (nor meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option, or is there a better one? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)? Thank you.
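For what it's worth, blocking one of the two Zendesk help centers entirely comes down to a blanket robots.txt served at the root of that subdomain. A minimal sketch, assuming a hypothetical b2c-support.mysite.com is the copy you want crawlers to skip - and keep in mind robots.txt only stops crawling, it does not guarantee already-indexed URLs drop out of the index:

```
# robots.txt at the root of the duplicate Zendesk subdomain
# (hypothetical host: b2c-support.mysite.com)
User-agent: *
Disallow: /
```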
On-Page Optimization | RoxBrock
-
How best to deal with internal duplicate content
Hi, I'm having an issue with a client site and internal duplicate content. The client has a custom CMS, and when they post new content it can appear, in full, at two different URLs on the site. Short of getting the client to move CMS, which they won't do, I am trying to find an easy fix that they could do themselves. Ideally they would add a canonical on one of the versions; the CMS does allow them to view posts in HTML view, but it would be a lot of messing about, posting the page and then going back to the CMS to add the tag, and the CMS is unable to auto-generate it either. The content editors are copywriters, not programmers. Would there be a solution using WMT for this? They have the skill level to add a URL in WMT, so I'm thinking a stop-gap solution could be to noindex one of the versions using the option in Webmaster Tools. Ongoing, we will consult developers about modifying the CMS, but budgets are limited, so I'm looking for a cheap and quick solution to help until the new year. Does anyone know of a way, other than WMT, to block Google from seeing the duplicate content? We can't simply block Google from whole folders, because only a small percentage of the content in a folder would be internally duplicate. I would be very grateful for any suggestions anyone could offer. Thanks.
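For reference, the canonical fix described above is a single line in the <head> of the duplicate version of each post, pointing at the preferred URL. A minimal sketch with hypothetical URLs - the catch, as noted, is that someone or something has to put it there:

```html
<!-- In the <head> of the duplicate URL, pointing at the preferred version (hypothetical URLs) -->
<link rel="canonical" href="https://www.clientsite.example/blog/preferred-version-of-the-post/" />
```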
On-Page Optimization | daedriccarl
-
Duplicate content: Form labels and field content
I have a site with 500 pages, each with unique content; the only content that could be deemed the same is the 'Make Contact' form, which has the same labels and placeholder text on each page. Is this likely to cause any duplicate content penalties?
On-Page Optimization | deployseo
-
Duplicate content on domains we own
Hello! We are new to SEO and have a problem we have caused ourselves. We own two domains, GoCentrix.com (the old domain) and CallRingTalk.com (the new domain we want to SEO). The content was updated on both domains at about the same time, and both are identical with a few exceptions. Now that we are getting into SEO, we understand this to be a big issue. Is this a resolvable matter, and at this point what is the best approach to handle it? So far we have considered a couple of options. 1. Change the copy - but on which site? Is one flagged as the original and the other as the duplicate? 2. Robots.txt noindex, nofollow on the old one. Any help is appreciated, thanks in advance!
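A side note on option 2: robots.txt itself has no noindex or nofollow directive - those are robots meta tag values placed on the pages themselves. A minimal sketch of what that would look like on the domain you decide not to index (illustration only, not a recommendation of which domain to keep):

```html
<!-- In the <head> of each page on the domain you do not want indexed -->
<meta name="robots" content="noindex, nofollow" />
```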
On-Page Optimization | CallRingTalk
-
Removing syndicated duplicate content from website - what steps do I need to take to make sure Google knows?
Hey all, I've made the decision to cancel the service that provides my blog with regular content/posts, since having duplicate content on my site isn't doing me any favors. I'm on a WordPress system - I'll be exporting the posts so I have them for reference, and then deleting them. There are 150 or so. What steps should I take to ensure that Google learns of the changes I've made, or do I not need to do anything at all in that department? Also, I've assumed that the best decision would be to remove the content from my blog. Is that the best way to go, or should I leave it in place and start adding unique content? (My guess is that I need to remove it.) Thanks for your help, Kurt
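One small, hedged suggestion once the posts are deleted: spot-check that the old URLs now return 404 (or 410), since that status is how Google eventually learns the content is gone. A minimal Python sketch with hypothetical URLs:

```python
# Minimal sketch: confirm removed blog posts now return 404/410.
# The URLs below are hypothetical placeholders for the deleted posts.
import requests

removed_urls = [
    "https://www.example.com/blog/syndicated-post-1/",
    "https://www.example.com/blog/syndicated-post-2/",
]

for url in removed_urls:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    print(url, status)  # expect 404 or 410 once the posts are gone
```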
On-Page Optimization | KurtBullock
-
Duplicate content and the Moz bot
Hi, does our little friend at SEOmoz follow the same rules as the search engine bots when he crawls my site? He has sent back thousands of duplicate content errors, but I thought I had removed these with nofollow etc. Can you advise, please?
On-Page Optimization | JamieHibbert
-
How to avoid product lists making your site's content duplicated?
Hi there! We at Outitude recently launched an outdoor activities marketplace, and to make it easy for users to compare activities we show a list of available activities in each activity view. The problem is that, though the content is different, the first half is practically identical. Example: sailing for a full day (http://outitude.com/en/sailing/world/sailing-full-day) and sailing for half a day (http://outitude.com/en/sailing/world/sailing-half-day) - the URLs are different and their content is different, but most of it is not (the first half of the page), so that the user can compare the activity they are currently seeing with others. Questions: How can we show the activities list without it ruining the page rank? Do you advise the use of "", "" surrounding the duplicated content, aka the activities lists? Thanks in advance.
On-Page Optimization | alexmc
-
Duplicate content on my domain
I have several pages on my domain that are using the same content except for changing a paragraph or sentence. Do I need to create unique content, even though much of the information pertains to a feature and is related?
On-Page Optimization | Court_H