Prevent average users from copying content and pasting into their websites
-
Please do not respond with a "you can't stop them" comment, I understand this.
Most of our pages have content that is duplicated across multiple domains. The recent Google algorithm update focused on penalizing pages with duplicate content, and it could be one of the reasons we have been seeing traffic loss.
I'm looking for some type of JavaScript/PHP code that will help minimize this issue. It could be code that does not allow you to copy and paste without turning off JavaScript, or that pops up a dialog box saying "this content is copyright protected; anyone copying this content is subject to legal action."
I've found one script that might work: http://www.ioncube.com/html_encoder.php
My questions are still the same:
1. What is the best method to achieve my objective?
2. Will this additional code affect how web bots see our site and/or affect rankings?
I know that anyone can figure out how to get the content anyway. I am trying to mitigate the problem by providing a warning about copyright infringement and making our content more challenging to copy.
Thank you for your comments!
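For concreteness, the dialog-box deterrent described above could be sketched in a few lines of plain JavaScript. This is only a minimal sketch, not a vetted script: it assumes a standard browser environment, the warning wording is taken from the question, and (as acknowledged) it is trivially bypassed by disabling JavaScript.

```javascript
// Copy-deterrent sketch: cancel the copy and show a copyright warning.
// Deterrence only -- anyone can bypass this by disabling JavaScript.

const COPY_WARNING =
  'This content is copyright protected. Anyone copying this ' +
  'content is subject to legal action.';

function handleCopy(event) {
  // Cancel the default copy action so nothing lands on the clipboard.
  event.preventDefault();
  // Surface the warning dialog (guarded so the helper stays testable
  // outside a browser).
  if (typeof window !== 'undefined' && window.alert) {
    window.alert(COPY_WARNING);
  }
}

if (typeof document !== 'undefined') {
  document.addEventListener('copy', handleCopy);
}
```

Since this attaches a single listener at the document level, it covers all page content without touching individual elements, and it can be removed just as easily if it hurts user experience.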
-
Another thing I've seen people use is Tynt (http://www.tynt.com/). If you copy and paste something, it automatically appends a "see more from" line and the source URL to the copied text, which shows up when you paste it.
A higher-level question is HOW people are copying your content. Are they going to your website and copying and pasting it, or are they grabbing it from your RSS feed? If they're scraping it via RSS or other non-manual means, right-click tricks won't help much.
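The Tynt-style behavior described above can be sketched with a plain `copy` event listener. This is an illustration of the technique, not Tynt's actual code; the attribution wording and the `withAttribution` helper name are made up for the example.

```javascript
// Sketch of Tynt-style copy attribution: whatever the visitor copies
// gets a source line appended before it reaches the clipboard.

function withAttribution(selectedText, pageUrl) {
  // Pure helper so the attribution format is easy to test and tweak.
  return selectedText + '\n\nRead more: ' + pageUrl;
}

if (typeof document !== 'undefined') {
  document.addEventListener('copy', function (event) {
    const selection = document.getSelection().toString();
    if (!selection) return; // nothing selected; leave the copy alone

    // Replace the default clipboard payload with the attributed version.
    event.preventDefault();
    event.clipboardData.setData(
      'text/plain',
      withAttribution(selection, document.location.href)
    );
  });
}
```

Note this only rewrites the plain-text clipboard flavor; careless scrapers who paste without trimming end up publishing the link back to the source, which is the whole point.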
-
I agree with Zachary. The script is good, but there is a workaround. Post first. I would also suggest adding some links to interior pages and some branding keywords in your copy.
Scrapers are often lazy enough that they don't read the whole article.
-
If someone wants the content, they will get it. You can't stop them without making your content uncrawlable or severely limiting the user experience.
As long as you are publishing the article first, I don't see a problem with the situation. Duplicate content is a problem, sure, but for those who duplicate it. Unless Google has trouble identifying the original source, you're fine. That has happened in the past, but it's rare.
Have you noticed a measurable difference in a controlled study?
Related Questions
-
Google won't index my website because "certain conditions" weren't met
I found the answer on this -- interestingly, I had changed registrars and they didn't pull over the DNS information correctly. This caused the above issues. Once I identified this, I updated the DNS correctly -- at registrar and server -- and things worked fine.
Content Development | newbyguy
-
Any recommendations on a Content Marketing Firm?
I have used several and had some good results. I'm wondering if I can find one that is more cost-effective than the one I am using now, as they just doubled their prices.
Content Development | RoxBrock
-
Duplicate Blog Content
Hey Mozzers! I have a client who is a dentist with multiple offices, and he wants to use an identical blog post (including images, alt text, text, tags, pretty much everything) from one of his offices' websites on his other office's website, to save time and mimic the success of the original content. Everything I've researched says this is a HUGE no-no, but I'd love to hear if anyone else has tried something like this and whether they were successful (implementing rel=canonical or a 301?). Also, if he owns both sites and they both receive low traffic, will Google even notice? My biggest worry is that if I posted the content identically on his other site, it would dilute the visibility of the original post, which has been and is continuing to surpass our organic search goals. The main goal, though, is to drive traffic to BOTH sites via organic search using the same content. Would love to hear everyone's opinions on whether this is possible or unrealistic. Thanks! -D
Content Development | Derrald
-
Are press releases that end up being published as duplicate content, with links pointing back to you, bad for your site?
With all the changes to the SEO landscape in recent years, I'm a little unsure how a press release looks in the eyes of Google (and others). For instance, say you write up a 500-word press release and it gets featured on the following sites: Forbes, TechCrunch, BBC, CNN, NY Times, etc. If each of these covers your story but rewrites only 50% of the article (not saying these sites wouldn't rewrite the entire article, but for this purpose let's presume only 50% is rewritten), could it be negative for your backlink profile? I'm thinking not, as these sites have high authority, but what if, once your press release is published on these sites, 10 other smaller sites republish the stories with almost no rewriting, either straight from the press release or straight from the article on the mainstream news sites? (For clarification, this press release would be done as an article suggestion to relevant journalists, rather than a blanket press release via PR Newswire, mass mail-out, etc. Although I guess the effect with duplicate-content backlinks is the same.) You now have c. 50 articles online, all with very similar content, with links pointing back at you. Would this have a negative effect, or would each link just not carry as much value as it normally would? By now we all understand that publishing duplicate content on our own sites is a terrible idea, but does having links pointing back to yourself from duplicate (or similar) content hosted on other sites (some being highly authoritative) affect your site's SEO?
Content Development | Sam-P
-
Where to add content
Hello, in looking at GA for a client, his top 100 landing pages are all category pages, with only a small number of articles and product pages. We haven't added content to the product pages; we just rewrote descriptions for unique content, about 100-200 words per product. Does that mean we should focus on adding content to category pages first? We're thinking of totaling 500 words or so (though sometimes less) of quality content per category page. Your recommendations?
Content Development | BobGW
-
Integration of content on other sites
(via Google Translate) Hello, we are able to make agreements with sites of good quality and reputation to integrate our classified ads for the agricultural sector on those companies' websites. We fear that these sites will begin to index all of this content and start competing with us in organic rankings. On the other hand, would this generate duplicate content? What strategy should we apply in order to do this? Greetings and thanks!
Content Development | romaro (Robert)
-
Content Writers / Blog Posts
Hi there, would anyone know where I could find affordable, reliable blog post writers who would be able to produce quality posts at affordable rates? What would the accepted rate be, etc.? Regards, Stef
Content Development | stefanok
-
Displaying archive content articles in a writers bio page
My site has writers, and each has their own profile page (accessible when you click their name inside an article). We set up the code so that the bios, in addition to the actual writer photo/bio, dynamically generate links to each article he/she produces. We figured that someone reading something by Bob Smith might want to read other stuff by him. That was fine, initially. Fast forward, and some of these writers have 3, 4, even 15 pages of archives, as the archive system paginates every 10 articles (so www.example.com/bob-smith/archive-page3, etc.). My thinking is that this is a bad thing. The articles are likely already found elsewhere on the site (under the content landing page they were written for, for example), and I visualize spiders getting sucked into these archive black holes, never to return. I also assume it is just more internal mass linking (yech) and probably doesn't help the overall TOS/bounce/exit metrics. Thoughts?
Content Development | EricPacifico