Duplicate content across two websites
-
Hi. I'm looking for ways to compare duplicate content across two different websites, rather than within a single site as the Moz crawler does. The tool would need to flag duplicates that appear on both site A and site B.
-
You can use Copyscape. You can also take a small piece of the copy, wrap it in quotes, and enter it into Google search. I would try the latter before the former, since Copyscape costs $0.05 per page.
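If you want to scale the quoted-search spot check across many page pairs, the comparison itself is easy to script. Below is a minimal sketch using only Python's standard library; the page texts are hypothetical placeholders for body copy extracted from site A and site B:

```python
import re
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0..1 ratio of how much copy two pages share."""
    norm = lambda t: re.sub(r"\s+", " ", t.lower()).strip()
    return SequenceMatcher(None, norm(text_a), norm(text_b)).ratio()

# Hypothetical extracts of body copy from site A and site B.
page_a = "Our widgets are hand-made from recycled steel and ship worldwide."
page_b = "Our widgets are hand-made from recycled steel and ship worldwide. Call us today."

print(f"{similarity(page_a, page_b):.2f}")  # a ratio near 1.0 suggests duplicated copy
```

In practice you would fetch each candidate pair and strip the HTML first; the ratio then gives you a rough duplicate score to sort by.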
Related Questions
-
Feedback on Content Ideation / "Skyscraper" Spreadsheet Template
Hi All - I've been getting a ton of use out of the Moz API for discovering the popularity of content, which I'm using for content ideation and to implement the Skyscraper concept. I built a spreadsheet template that combines Moz with some other APIs to apply this to new topics of my choosing, and my friends encouraged me to clean it up a bit and share it with the broader community. So, here it is - fire away! I'd love any and all feedback about the spreadsheet - it's still a prototype, so it could stand to pull back more results. For example: would you want to include Domain Authority in the results? Should it focus more or less on the social sharing elements, or let you choose the thresholds? I'd also love to know if there are other methodologies for which you'd be interested in seeing spreadsheet templates produced. Cheers! skyscraper-template.png
Moz Pro | paulkarayan
-
How can I get an audit report (using tools, etc.) immediately for any website?
I want to audit websites within a few minutes or hours (tools, free tools, techniques). Reports should be authentic and provide a full site health report, i.e. 404 errors, duplicate page content/titles, missing meta tags, etc. Kindly suggest.
Moz Pro | 1akal
-
Error in Moz duplicate content reports
Hi - I've run a Moz campaign on a client's site. Moz is saying that there are duplicate content errors, and when I look at them it shows that they all involve the non-www URLs being duplicated by the www form of the URLs. However, this is not the case - all the non-www URLs are 301 redirected to the www URLs. Is this an error in the Moz tool? Has anybody experienced something similar?
Moz Pro | rorynatkiel
-
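In cases like this it helps to verify the redirects independently before blaming the crawler. Here is a minimal Python sketch (domains hypothetical) that computes the www URL a bare-domain request should 301 to; a live check would then request the bare URL with redirects disabled and compare the Location header against this expected target:

```python
from urllib.parse import urlsplit, urlunsplit

def expected_redirect_target(url):
    """Return the www URL a bare-domain request should 301 to,
    or None if the URL is already on the www host."""
    parts = urlsplit(url)
    if parts.netloc.startswith("www."):
        return None
    return urlunsplit(parts._replace(netloc="www." + parts.netloc))

print(expected_redirect_target("http://example.com/page"))  # http://www.example.com/page
print(expected_redirect_target("http://www.example.com/"))  # None
```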
Duplicate Page Content
I found these URLs flagged under Issue: Duplicate Page Content:
- http://www.decoparty.fr/Products.asp?SubCatID=4612&CatID=139
- http://www.decoparty.fr/Products.asp?SubCatID=4195&CatID=280
- http://www.decoparty.fr/Catproducts.asp?CatID=124
Moz Pro | partyrama
-
Duplicate content analysis
Good morning everyone, I have just run a test with the SEOmoz PRO tool and got more than 2,000 duplicate content errors. How can I see which two pages have the duplicated content? My website has just been revamped and I can't find these similarities on my own (to me, there are none). Thanks for your help! Francesca
Moz Pro | astojanov
-
How can I prevent duplicate page content errors generated by the tags on my WordPress on-site blog?
When I add meta data and a canonical reference to the blog tags for my on-site blog, which uses a WordPress.org template, Roger generates duplicate content errors. How can I avoid this problem? I want to use up to 5 tags per post, each with the same canonical reference, and every campaign scan generates errors/warnings for me!
Moz Pro | ZoeAlexander
-
Why does Crawl Diagnostics report this as duplicate content?
Hi guys, we've been addressing a duplicate content problem on our site over the past few weeks. Lately, we've implemented rel canonical tags in various parts of our ecommerce store and observed the effects by tracking changes in both SEOmoz and Webmaster Tools. Although our duplicate content errors are definitely decreasing, I can't help but wonder why some URLs are still being flagged with duplicate content by the SEOmoz crawler. Here's an example, taken directly from our Crawl Diagnostics report.

URL with 4 duplicate content errors:
- /safety-lights.html

Duplicate content URLs:
- /safety-lights.html?cat=78&price=-100
- /safety-lights.html?cat=78&dir=desc&order=position
- /safety-lights.html?cat=78
- /safety-lights.html?manufacturer=514

What I don't understand is that all of the URLs with URL parameters have a rel canonical tag pointing to the 'real' URL, /safety-lights.html. So why is the SEOmoz crawler still flagging this as duplicate content?
Moz Pro | yacpro13
-
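One thing worth ruling out in a situation like this is that every parameterized URL really does emit the canonical tag. A minimal standard-library sketch for spot-checking the markup of each URL variant (the sample HTML is illustrative):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical markup fetched from one of the parameterized URLs.
html = '<head><link rel="canonical" href="/safety-lights.html"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # /safety-lights.html
```

If every variant checks out, the remaining explanation is usually crawler lag: rel canonical is a hint, and reports can keep flagging the pages for a while after the tags go live.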
Solving duplicate content errors for what is effectively the same page.
Hello,
I am trying out SEOmoz and I quite like it. I've managed to remove most of the errors on my site; however, I'm not sure how to get round this last one. If you look at my errors you will see most of them revolve around pairs like this:
- http://www.containerpadlocks.co.uk/categories/32/dead-locks
- http://www.containerpadlocks.co.uk/categories/32/dead-locks?PageSize=9999

These are essentially the same page, because the Dead Locks category does not contain enough products to span more than one page, so clicking 'View all products' on the page returns the same results. This functionality works correctly for categories with more than the 20-per-page limit. My question is, should I be:
- Removing the link to 'show all products' (which adds the PageSize query string value) if no more products would be shown?
- Putting a noindex meta tag on the page?
- Or taking some other action entirely?

Looking forward to your reply, and to you showing how effective Pro is. Many thanks,
James Carter
Moz Pro | jcarter
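A third option for cases like this is a rel canonical on the PageSize variant pointing at the plain category URL, so both versions consolidate without removing the link or deindexing anything. A minimal sketch of computing that canonical target by stripping the parameter (helper name hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_param(url, param="pagesize"):
    """Drop one query parameter (case-insensitive) so both variants
    of the page share a single canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() != param]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_param(
    "http://www.containerpadlocks.co.uk/categories/32/dead-locks?PageSize=9999"
))  # http://www.containerpadlocks.co.uk/categories/32/dead-locks
```

The returned URL is what the `href` of the canonical link tag on the PageSize page would be.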