What is the best way to resolve a duplicate content issue?
-
Hi
I have a client whose site content has been scraped and reused on numerous other sites. This is detrimental to ranking: one term we wish to rank for is nowhere to be found.
My question is this: what's the quickest way to resolve a duplicate content issue when other sites have stolen your content?
I understand that perhaps I should first contact these site owners and 'appeal to their better nature', but this will take time and they may not even comply.
I've also considered rewriting our content, but again, this takes time.
Has anybody experienced this issue before? If so, how did you come to a solution?
Thanks in advance.
-
No worries, Alex.
I mean, contacting the webmasters would technically be simpler, but the chances that you'll get a response, never mind a take-down of your content, are pretty slim. Hence I suggested the rewriting.
It's a pain in the arse and requires you to do more work because of someone else's laziness, which of course isn't right. But hopefully, with the fresh content and the tags in place, you'll be given full credit.
In addition, if any of the content comes in the form of blog posts, or if you'd like to do this site-wide, implementing a rel=author tag and verifying Google authorship would be a further signal to Google that your content is the original. Here are a couple of handy guides to help with the markup, plus a quick example of what it looks like below:
http://searchengineland.com/the-definitive-guide-to-google-authorship-markup-123218
http://www.vervesearch.com/blog/seo/how-to-implement-the-relauthor-tag-a-step-by-step-guide/
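Just to illustrate what those guides cover, the markup boils down to a single link to the author's Google+ profile - the profile ID below is a placeholder, so swap in your own:

<!-- in the <head> of each post, pointing at the author's Google+ profile (placeholder ID) -->
<link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID"/>

or, as a visible byline link within the post itself:

<!-- same placeholder profile ID as above -->
<a href="https://plus.google.com/YOUR_PROFILE_ID?rel=author">Alex</a>

Either way, you'll also need to list the site under 'Contributor to' on the Google+ profile so the link is reciprocal.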
-
Hi Tom
That's a great help.
I just wanted to make sure there wasn't a simpler solution than rewriting the content. I guess that is the easiest route, and it will ensure the canonical tag solution gets implemented too.
Thanks.
-
Hi Alex
I think the best solution here, and the one you have the most control over, is to rewrite the content and then ensure that your new content is seen as the original.
Rewriting the content will take time, but obviously ensures that the content is unique, removing the duplicate content issue.
If I were you, I would then use a rel=canonical tag solution, so that every page (and every new page) has a canonical tag on it.
Among other things, this will tell Google that your site is the originator of this content, and that any other versions of it on your site or across the web are there purely for user experience and therefore should not be ranked over the original.
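To give you an idea of what that looks like, the tag itself is just one line in the <head> of each page, pointing at that page's own preferred URL - the address below is a placeholder for one of your real pages:

<!-- self-referencing canonical in the <head>; the href is this page's own preferred URL -->
<link rel="canonical" href="https://www.example.com/your-page/"/>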
As you will be publishing the content first, it should be crawled first by the search engines as well. To help ensure that it is, I would also share your pages on social media when they go live, as this helps get the pages indexed much more quickly.
This way, the site scraping your content should (in theory) not be able to rank for it - or at the very least will be seen by Google as the copier, while you will be seen as the originator, having been indexed first with the canonical tag in place.
You can read more on canonicals with this handy Moz guide.
Hope this helps.