Duplicate Content Identification Tools
-
Does anyone have a recommendation for a good tool that can identify which elements on a page are duplicate content? I use Moz Analytics to determine which pages have duplicate content on them, but it doesn't say which pieces of text or on-page elements are actually considered to be duplicate.
Thanks in advance, Moz Community!
-
Thank you. These steps are a part of our process.
-
Here are some guidelines from Google Webmaster Help on duplicate content, with tips for resolving issues.
-
Yes, I also agree that Copyscape is better for plagiarism. I am also reviewing the canonical tags we have in place for these pages. I am trying to view the flagged pages from a few different angles to understand why they are being marked with 'duplicate content' warnings on our analytics platform, and to build a process of checks for any future warnings.
-
I use Copyscape, but it's more of a plagiarism tool than an actual duplicate content identifier. Just because a few lines of text are the same on a page doesn't mean Google will remove it from the SERPs; generally, the duplicated text has to make up a substantial portion of a webpage before it is treated as duplicate content.
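If you want a rough way to see which pieces of text two pages actually share, and what portion of a page that overlap represents, something along these lines can work. This is only a sketch, not how Moz or Google measure duplication; the URLs are placeholders and it assumes the requests and beautifulsoup4 packages are installed.

import requests
from bs4 import BeautifulSoup

def visible_blocks(url):
    # Fetch the page and return its visible text as paragraph-sized blocks.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    blocks = [b.strip() for b in soup.get_text("\n").split("\n")]
    return [b for b in blocks if len(b) > 40]  # ignore very short fragments

page_a = visible_blocks("https://www.example.com/page-a")  # placeholder URLs
page_b = visible_blocks("https://www.example.com/page-b")

shared = [block for block in page_a if block in page_b]
overlap = len(shared) / max(len(page_a), 1)

print(f"{len(shared)} shared text blocks ({overlap:.0%} of page A)")
for block in shared:
    print("-", block[:80])

Blocks that show up in the shared list are the on-page elements (repeated blurbs, boilerplate footers and so on) that a crawler sees on both URLs.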
I would first dig into Moz Analytics and see WHY you are generating duplicate content before I would worry about what part of the page is duplicate.
- Have you set canonicals on your pages?
- Does your site produce session IDs?
- Do you have pagination?
- Are you copying and pasting text from page to page to fill up your site?
Google has said time and time again that duplicate content issues are rarely a penalty. It is more about Google knowing which page it should rank and which it should not. Take a look at why you are getting the duplicate content issue, and then we can help you resolve it or give advice on what to do next.
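To spot-check the first question above (whether canonicals are set) across a few URL variants, a minimal sketch like this can help. The URLs are placeholders, and it again assumes the requests and beautifulsoup4 packages.

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/widget",                # placeholder URLs; swap in your own
    "https://www.example.com/widget?sessionid=123",  # session-ID variant
    "https://www.example.com/widget?page=2",         # pagination variant
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", attrs={"rel": "canonical"})
    canonical = link.get("href") if link else "none"
    print(f"{url} -> canonical: {canonical}")

If the session-ID and pagination variants all point back at the same canonical URL, the duplicate warnings are usually Moz flagging the variants rather than a real indexing problem.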
-
Copyscape.com will tell you if you have duplicate content. If you have a big site with loads of pages, I'd buy credits, or you'll have difficulty because it only lets you check a few pages per day (I can't remember what the limit is). With the paid version you can upload your XML sitemap(s) and it'll check all the pages in that file. The report will then highlight the bits of copy that are duplicate.
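If you would rather pull that list of pages yourself before spending credits, a short sketch like this reads the URLs out of an XML sitemap. The sitemap URL is a placeholder, and it assumes the standard sitemap namespace.

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder; point at your own sitemap
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

xml_data = requests.get(SITEMAP_URL, timeout=10).content
root = ET.fromstring(xml_data)

# Every <loc> entry is a page you would feed into Copyscape (or any other checker).
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NAMESPACE)]

print(f"Found {len(urls)} URLs")
for url in urls[:10]:
    print(url)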
Related Questions
-
Duplicate content query
I'm currently reauthoring all of the product pages on our site. Within the redesign of all the pages is a set of "why choose us?" bullet points and "what our customers say" bullet points. On every page these bullet points are the same. We currently have 18% duplicate content sitewide and I'm reluctant to push this. The products are similar but targeted at different professions, so I'm not sure whether to alter the text slightly for the bullet points on each page, remove the bullet points entirely or implement some form of canonicalisation that won't impact the profession-specific pages' ability to rank well.
On-Page Optimization | EdLongley
-
Long list of companies spread out over several pages - duplicate content?
Hi all, I am currently working with a company formation agent. They have a list of every limited company, spread over hundreds of pages. What do you guys think? Is there a need for canonicals? The website is ranking pretty well but I want to make sure there aren't any problems in the future. Here are two pages as examples: http://www.formationsdirect.com/companysearchlist.aspx?start=MULLAGHBOY+CONSTRUCTION+LIMITED&next=1# http://www.formationsdirect.com/companysearchlist.aspx?start=%40a+company+limited&next=1# Also, what about the actual company pages? See an example below: http://www.formationsdirect.com/companysearchlist.aspx?name=AMNA+CONSTRUCTION+LTD&number=06630333#.U8PW6_ldX1s Thanks in advance, Aaron
On-Page Optimization | AaronGro
-
Duplicate content shown in Google Webmaster Tools for 301-redirected URLs
Why does Google Webmaster Tools show 5 URLs that have been 301-redirected as having duplicate meta descriptions?
On-Page Optimization | Madlena
-
Duplicate Content only an Issue on a Huge Scale?
To what extent is duplicate content an issue? We have a support forum with some duplicate content because users ask the same questions. The Moz reports we receive highlight our support forum's duplicate content and page titles as a "big" issue. I'm unsure to what extent it harms our SEO, and making the support section non-crawlable would impair our level of support. It would be nice to know for sure whether we should be concerned about this, and if so, how we can do it differently. Thanks, I appreciate your help. -Allan
On-Page Optimization | Todoist
-
Dealing with thin content/95% duplicate content - canonical vs 301 vs noindex
My client's got 14 physical locations around the country but has a webpage for each "service area" they operate in. They have a Croydon location, but separate pages for London, Croydon, Essex, Luton, Stevenage and many other places (areas near Croydon) that the Croydon location serves. Each of these pages is a near duplicate of the Croydon page with the word Croydon swapped for the area name. I'm told this was an SEO tactic circa 2001. Obviously this is an issue. So the question: should I 301 redirect each of these pages to the Croydon page, or (what I believe to be the best answer) set a rel=canonical tag on the duplicate pages? Creating "real and meaningful content" on each page isn't quite an option, sorry!
On-Page Optimization | JamesFx
-
Duplicate content and the Moz bot
Hi, does our little friend at SEOmoz follow the same rules as the search engine bots when he crawls my site? He has sent thousands of errors back to me about duplicate content issues, but I thought I had removed these with nofollow etc. Can you advise, please?
On-Page Optimization | JamieHibbert
-
Does 301 generate organic content?
I manage this domain name, www.jordanhundley.com. Right now it is 301'd to www.jordanhundley.net, where I have hosted the content for almost 18 months. At this point you can only see the 301 script if you view the page source (Ctrl+U) on the .com domain. Does Google read the content beyond the script? Is the 301 website getting juice from the targeted domain? This is the script I'm using:
<html>
<head>
<title>Jordan Hundley</title>
</head>
<frameset rows="100%,*" border="0">
<frame src="http://www.jordanhundley.net" frameborder="0" />
</frameset>
<noframes></noframes>
</html>
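One way to check what the .com domain is actually returning, a real HTTP 301 versus a normal 200 page that merely frames the .net site, is a quick request that does not follow redirects. This is just a sketch and assumes the requests package is installed.

import requests

# Fetch without following redirects so the raw status code is visible.
response = requests.get("http://www.jordanhundley.com", allow_redirects=False, timeout=10)

print("Status code:", response.status_code)  # 301 or 302 means a real server-side redirect
print("Location header:", response.headers.get("Location", "none"))

if response.status_code == 200:
    print("The page returns 200, so the 'redirect' happens inside the HTML (the frameset), not via HTTP.")

A frameset that loads the .net content is not a 301 in the HTTP sense, so whatever the script looks like in the source, the status code above is what tells you what search engines are being served.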
On-Page Optimization | mPloria
-
Duplicate content
Hi everybody, I have been thrown into an SEO project for a website with a duplicate content problem because of a version with and a version without 'www'. The strange thing is that the www version has more than 10 times more backlinks but is not in the organic index. Here are my questions: 1. Should I go on using the "without www" version as the primary resource? 2. Which kind of redirect is best for passing most of the link juice? Thanks in advance, Sebastian
On-Page Optimization | Naturalmente