How do you fight against a site that keeps rewriting your content?
-
A site keeps copying our content, rewriting it, and publishing it on their own site. How do you fight this kind of behaviour?
(Many of these copied articles rank very closely behind us, which takes some of our visits.)
I looked into filing a DMCA complaint, but because they have changed some paragraphs and so on, a DMCA takedown is hard to enforce as straightforward copying.
I think many of you have run into this problem as well. How do you handle this situation?
-
Try disallowing the scraper's user agent in your robots.txt file, and pin down its IP address so you can deny it in your .htaccess file.
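For example, a minimal sketch, assuming the scraper identifies itself with a user agent containing "BadScraper" and crawls from 203.0.113.45 (both are placeholders; pull the real values from your server logs).

In robots.txt (only well-behaved bots respect this):

# robots.txt - polite crawlers will obey this, determined scrapers usually won't
User-agent: BadScraper
Disallow: /

In .htaccess on Apache 2.4, a hard block by IP and by user agent:

# block the scraper's IP address
<RequireAll>
  Require all granted
  Require not ip 203.0.113.45
</RequireAll>

# return 403 Forbidden to the scraper's user agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadScraper [NC]
RewriteRule .* - [F,L]

The robots.txt rule is largely symbolic, so the server-level deny is what actually stops the requests.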
-
Thanks for your reply.
I added the rel=canonical tag a few days ago, and I will share our posts on social networks as soon as we publish them.
Hope it works; in my opinion, Google is wrongly giving them authority.
-
Hi Jonny
If I were you, I would use a rel=canonical tag solution, so that every page (and every new page) has a canonical tag on it.
Among other things, this will tell Google that your site is the originator of this content. Any other versions of it on your site or across the web are being used purely for user experience and therefore should not be ranked over the original.
As you will be publishing the content first, it should be crawled first by the search engines as well. To ensure that it is, I would also share your pages on social media when they go live, as it helps to index the pages much quicker.
This way, the site scraping your content should (in theory) not be able to rank for the content - or at the very least will be seen by Google as the copier of the content, while you will be seen as the originator, due to being indexed first with the canonical tag.
You can read more on canonicals with this handy Moz guide.
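For reference, the tag itself is one line in the <head> of each page, pointing at that page's own preferred URL (yoursite.com and the path are placeholders):

<link rel="canonical" href="https://www.yoursite.com/your-post-url/" />

A useful side effect of making it self-referencing is that a scraper who copies your HTML wholesale also copies a canonical tag pointing back at you, which reinforces your site as the original source.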
Hope this helps.
-
I tried to contact them via their contact form, but got no reply.
Then I wanted to contact their hosting provider directly, but they are hidden behind a proxy. Obviously, they are just aiming to steal content from others.
-
My first piece of advice would be to appeal to their human side - state your case that it's your hard work they're stealing and that duplicate content may actually hurt both of your rankings.
If that doesn't work, which I expect it won't, send a strongly worded Cease and Desist letter; you'll find some templates here:
http://minnesotaattorney.com/cease-desist-letter-template-example-sample-forms/
Ensure all of your content has a clear copyright mark.
Use the letter to make the point that legal action may be taken if they do not stop.
Good luck!
-
Related Questions
-
Content Writing Service Recommendations
I am looking to hire a content writer for our sites. Anyone familiar with a service where they manage the content on your site? Basically, they come up with topics and content ideas, then write the content. Please give me an idea of the pricing if possible. Greatly appreciate any help.
Content Development | inhouseseo
-
How do I properly sitemap a site with static pages + WordPress in its own directory?
I apologize for the awkward wording in the headline. Now to the issue: I have a site with static pages that are created as follows: url.com, url.com/page1, url.com/page2, etc. I then have WordPress installed at url.com/blog. What is the proper method for creating a comprehensive sitemap for my entire domain? I like the sitemap feature provided by the Yoast SEO plugin, but I assume it will only cover the WordPress directory (url.com/blog). Any help would be greatly appreciated!
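One way to approach this (a sketch only; https://url.com and the file names are assumptions) is a sitemap index at the domain root that lists a hand-maintained sitemap for the static pages alongside a sitemap Yoast generates for the WordPress install at /blog:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- hand-maintained file listing url.com, url.com/page1, url.com/page2, ... -->
  <sitemap>
    <loc>https://url.com/sitemap-static.xml</loc>
  </sitemap>
  <!-- a child sitemap generated by Yoast for the blog; the sitemaps protocol
       does not allow nesting Yoast's own index file inside another index -->
  <sitemap>
    <loc>https://url.com/blog/post-sitemap.xml</loc>
  </sitemap>
</sitemapindex>

Alternatively, skip the index file and simply submit the static sitemap and Yoast's /blog/sitemap_index.xml as two separate sitemaps in Google Search Console.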
Content Development | Qcmny
-
Can I share my content with other sites, either as individual posts or via RSS, or will I be encouraging duplicate content on the web and upsetting Google?
Hi, I have a new site, http://www.homeforbusiness.co.uk. I want to encourage traffic to the site by sharing some of my content with other related websites which have higher PR and traffic, in exchange for a link to my site. Is this going to upset Google regarding duplicate content, devalue my site and stop any organic rankings in the future? Equally, some high-PR sites which have a good synergy with mine, such as http://thewomensbusinessclubs.com/, allow me to add my RSS feed to their blog network. Is this a good thing to do or not, for the same reasons as above? Or can I only do the above by creating fresh content? Thanks, Elizabeth Conley
Content Development | econley
-
How can I rank using translated content?
My friend has a website with content similar to mine, but in a different language. He has allowed me to translate his content as long as I link to it in every post (the link can be nofollow). Does Google penalize me for clearly translated content? How can I make sure it ranks well? BTW, if I convince him to let me skip the link, is that better SEO-wise? Best,
Cherman
Content Development | kikocherman
-
How to best implement "metered model" on a site
Hi, I'm scratching my head over how to best implement the "metered model" on a site without users being able to game it all too easily. Has anybody in this Q&A forum implemented one before and is willing to share their best practices and findings? Currently I think raising the bar by forcing everybody to log in is a bad idea; we would still need to keep the site open for Google and other engines, and it can be tricked that way. This might also lead to a penalty (cloaking)? Using cookies might not be enough, as I think almost every internet user these days knows that this is the #1 place to look, and they are deleted in a second. Counting based on a user's IP address is also a bit critical, as this is not accurate enough. Should we just use cookies and hope for the best?
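Purely to illustrate the cookie-counting baseline discussed above (with the caveat the question itself raises: anything client-side can be cleared in seconds), a minimal sketch in TypeScript; the cookie name, the 30-day window and the limit of 5 free articles are arbitrary assumptions.

// meter.ts - naive client-side metered paywall (easily bypassed by deleting the cookie)
const FREE_ARTICLES = 5;              // assumed free allowance per 30 days
const COOKIE_NAME = "articles_read";  // assumed cookie name

function readCount(): number {
  const match = document.cookie.match(new RegExp("(?:^|; )" + COOKIE_NAME + "=(\\d+)"));
  return match ? parseInt(match[1], 10) : 0;
}

function writeCount(count: number): void {
  // the counter resets whenever the cookie expires or is deleted
  const expires = new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toUTCString();
  document.cookie = COOKIE_NAME + "=" + count + "; expires=" + expires + "; path=/";
}

export function checkMeter(): void {
  const count = readCount() + 1;
  writeCount(count);
  if (count > FREE_ARTICLES) {
    // hide the article body and reveal a subscription prompt
    document.querySelector(".article-body")?.classList.add("is-paywalled");
    document.querySelector(".subscribe-prompt")?.classList.remove("is-hidden");
  }
}

Anything more robust has to be enforced server side, tied to sessions or accounts, which is exactly the trade-off against forcing a login that the question describes.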
Content Development | jmueller
-
Duplicate Content Penalty
If roughly 30% of the textual content on our pages is non-original, can we be penalized by Google? Or are we OK as long as this non-original content is relevant to the pages?
Content Development | Quidsi
-
Can non-unique content damage my rankings?
Hi there, I run a blog at http://ablemagazine.co.uk. We produce our own editorial content for our print magazine, which means I have a great bank of uniquely written content. I can usually afford to post 1-2 completely unique articles a day. I've also been copying and pasting 2-3 articles a day from the BBC or The Guardian to keep up activity. Should I continue doing what I'm doing? Should I post exclusively unique articles? Thanks
Content Development | craven22
-
Index PDF files but redirect to the site
Hi, One of our clients has tons of PDFs (manuals, etc.) and frequently gets good rankings for the direct PDF link. While we're happy about the PDFs attracting users' attention, we'd like to redirect visitors to the page where the original PDF link is published and avoid people opening the PDF directly. In short, we'd like to keep the PDFs indexed, but show users the PDF link within the site. How should we proceed to do that? Thanks, GM
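One option that gets close to this without an outright redirect (which would eventually drop the PDFs from the index) or serving Googlebot something different from users (which risks being treated as cloaking) is a rel="canonical" HTTP header on each PDF, pointing at the HTML page that hosts it, so ranking signals consolidate on the page. A sketch for Apache's .htaccess, assuming mod_headers is enabled and with placeholder file and page names:

<Files "product-manual.pdf">
  Header add Link "<https://www.example.com/manuals/product-manual/>; rel=\"canonical\""
</Files>

If the goal is strictly that searchers never land on the raw PDF, the cleaner route is to noindex the PDFs (for example with an X-Robots-Tag: noindex header) and let the hosting pages rank instead.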
Content Development | gmellak