Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12 December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Plagiarism or duplicate checker tool?
-
Do you know a plagiarism or duplicate-content checker tool that will send me an email alert if someone copies my content? I know there's a tool like this (similar to http://www.tynt.com/, though people can still remove the link from the original source), but I forgot its name. It's a snippet of source code that you have to insert on each of your webpages.
Thanks in advance!
-
[Spammy comment removed by forum moderator.]
-
Sorry, dude, but these are not what I'm looking for. Thanks anyway.
-
There are various tools available online to help protect your content. Some are free and some are paid; you can invest in one as per your requirements, and it will certainly be useful for your website content. Check the list below.
1. CopyScape
2. Plagium
3. PaperRater
4. Grammarly
5. PlagTracker
-
Hey,
Maybe it would be an idea to use Copyscape.com. They have a monitoring service called Copysentry, which emails you when copies of your content are found.
Regards,
Istvan
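For anyone who would rather roll their own monitoring than pay for a service, the check itself is not hard to sketch. Below is a minimal Python sketch (not any particular vendor's method) that compares your page against a list of suspect URLs and emails an alert when the text is very similar. The URLs, email addresses, threshold, and local mail relay are all placeholder assumptions, and finding the suspect URLs in the first place is the part a service like Copysentry does for you.

```python
import smtplib
import difflib
import urllib.request
from email.message import EmailMessage
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects the text content of an HTML page (very rough)."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def page_text(url):
    """Download a page and return its text with whitespace collapsed."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(" ".join(parser.chunks).split())


# Placeholder values -- replace with your own page, suspects, and mail settings.
MY_PAGE = "https://example.com/my-article"
SUSPECT_URLS = ["https://example.org/maybe-a-copy"]
ALERT_FROM = "alerts@example.com"
ALERT_TO = "me@example.com"
THRESHOLD = 0.8  # similarity ratio above which we treat the page as a copy

original = page_text(MY_PAGE)
for suspect in SUSPECT_URLS:
    ratio = difflib.SequenceMatcher(None, original, page_text(suspect)).ratio()
    if ratio >= THRESHOLD:
        msg = EmailMessage()
        msg["Subject"] = f"Possible copy of {MY_PAGE} ({ratio:.0%} similar)"
        msg["From"] = ALERT_FROM
        msg["To"] = ALERT_TO
        msg.set_content(f"{suspect} is {ratio:.0%} similar to {MY_PAGE}.")
        with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
            smtp.send_message(msg)
```

Run on a schedule (cron, for example), this gives a crude email alert; a real service adds the web-wide discovery of copies that this sketch leaves out.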
Related Questions
-
Best Tool for Retrieving Multiple URL Word Counts in Bulk?
I am doing some content analysis with over 200 URLs to go through! Does anybody know of, or can recommend, any bulk on-page word-count checkers that would help with the heavy lifting? Any suggestions are greatly appreciated. Thanks!
On-Page Optimization | NickG-1230
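For the bulk word-count question above, a small script can handle the heavy lifting if a ready-made tool doesn't turn up. A minimal Python sketch, assuming the 200+ URLs are listed one per line in a urls.txt file; the file names and the rough visible-text extraction are assumptions, not any specific tool's behaviour.

```python
import csv
import urllib.request
from html.parser import HTMLParser


class VisibleText(HTMLParser):
    """Counts words in page text, skipping <script> and <style> content."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.words += len(data.split())


def word_count(url):
    """Fetch a URL and return a rough count of its visible words."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    parser = VisibleText()
    parser.feed(html)
    return parser.words


# Read URLs one per line and write url,word_count rows to a CSV.
with open("urls.txt") as infile, open("word_counts.csv", "w", newline="") as outfile:
    writer = csv.writer(outfile)
    writer.writerow(["url", "word_count"])
    for line in infile:
        url = line.strip()
        if url:
            writer.writerow([url, word_count(url)])
```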
Duplicate page titles and hreflang tags
Moz is flagging a lot of pages on our site which have duplicate page titles. 99% of these are international pages with hreflang tags in the sitemap. Do I need to worry about this? I assumed that it wasn't an issue given the use of hreflang. And if that's the case, why is Moz flagging them as an issue? Thanks.
On-Page Optimization | ahyde0
How to handle duplicate pages/titles in WordPress
The WordPress blog causes problems with page titles. If you go to the second page of blog posts, there's a different URL but the same page title. For example: page 1: site/blog, page 2: site/blog/page/2. Each page gets flagged for duplicate page titles. Thanks in advance for your thoughts.
On-Page Optimization | heymarshall1
Duplicate Content for Men's and Women's Version of Site
So, we're a service where you can book different hairdressing services from a number of different salons (site being worked on). We're doing both a male and a female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative and have a bit of fun, and to also have dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female taking precedence when there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, so does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!
On-Page Optimization | LeahHutcheon0
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh0
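If you do go the robots.txt route on the secondary sites, it's worth verifying that the rules actually block crawling (keeping in mind that robots.txt stops crawling but doesn't by itself remove URLs that are already indexed). A minimal Python sketch using the standard library's robots.txt parser; the domain and paths below are placeholders.

```python
import urllib.robotparser

# Placeholder values -- point this at one of the secondary sites.
ROBOTS_URL = "https://secondary-site.example.com/robots.txt"
TEST_URLS = [
    "https://secondary-site.example.com/",
    "https://secondary-site.example.com/some-duplicate-page",
]

parser = urllib.robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for url in TEST_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked'} for Googlebot")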
Duplicate eCommerce Product Descriptions
I know that creating original product descriptions is a best practice. What I don't understand is how other sites are able to generate significant traffic while still using duplicate product descriptions on all product pages. How are they not being penalized by Google?
On-Page Optimization | mj7750
Best practice for franchise sites with duplicated content
I know that duplicated content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as a unique site and does not penalize them for the following issues:
1. All sites are hosted on the same server, and therefore share the same IP address.
2. All sites generally use the same content across their product pages (which are very, very important pages), i.e. templated content approved by corporate.
3. Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | Shipyard_Agency0
Page speed tools
Working on reducing page load time, since that is one of the ranking factors that Google uses. I've been using the Page Speed Firefox plugin (requires Firebug), which is free. Pretty happy with it, but wondering if others have pointers to good tools for this task. Thanks...
On-Page Optimization | scanlin0
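Alongside browser plugins like Page Speed, a quick script can keep an eye on raw server response times across many pages. A minimal Python sketch (this only times downloading the HTML itself, not rendering, images, or scripts, so it's a rough proxy rather than a full page-speed audit); the URLs are placeholders.

```python
import time
import urllib.request

# Placeholder URLs -- swap in the pages you want to monitor.
URLS = [
    "https://example.com/",
    "https://example.com/blog",
]

for url in URLS:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        status = resp.status
        body = resp.read()  # timing includes downloading the HTML body
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed:.2f}s for {len(body)} bytes (HTTP {status})")
```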