What is the best way to resolve a duplicate content issue?
-
Hi
I have a client whose site content has been scraped and republished on numerous other sites. This is detrimental to ranking: for one term we wish to rank for, the site is nowhere to be found.
My question is this: what's the quickest way to resolve a duplicate content issue when other sites have stolen your content?
I understand that perhaps I should first contact these site owners and 'appeal to their better nature'. This will take time, and they may not even comply.
I've also considered rewriting our content. Again, this takes time.
Has anybody experienced this issue before? If so how did you come to a solution?
Thanks in advance.
-
No worries, Alex.
I mean, contacting the webmasters would technically be simpler, but the chances that you're going to get a response, never mind a take-down of your content, are going to be pretty slim. Hence I suggested the rewriting.
It's a pain in the arse and requires you to do more work because of someone else's laziness, which of course isn't right. But hopefully, with the fresh content and the tags in place, you'll be given full credit.
In addition, if any of the content comes in the form of blog posts, or if you'd like to do this site-wide, implementing a rel=author tag and verifying Google Authorship would again signal to Google that your content is the original. Here are a couple of handy guides to help with the markup:
http://searchengineland.com/the-definitive-guide-to-google-authorship-markup-123218
http://www.vervesearch.com/blog/seo/how-to-implement-the-relauthor-tag-a-step-by-step-guide/
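For reference, the markup those guides walk through boils down to a link from the article to the author's Google+ profile. A minimal sketch (the profile URL and author name below are placeholders, not a real profile):

```html
<!-- Option 1: a visible byline link on the article page -->
<a href="https://plus.google.com/112233445566778899000?rel=author">Author Name</a>

<!-- Option 2: an invisible link in the document <head> -->
<link rel="author" href="https://plus.google.com/112233445566778899000"/>
```

As I understand it, either form works, but the profile also has to link back to the site for verification to complete — the guides above cover that step.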
-
Hi Tom
That's a great help.
I just wanted to make sure there wasn't a simpler solution besides rewriting the content. I guess that is the easiest option, and it will ensure the canonical tag solution is implemented too.
Thanks.
-
Hi Alex
I think the best solution here, and the one you can control the most, is to rewrite the content and then ensure that your new content is seen as the original.
Rewriting the content will take time, but obviously ensures that the content is unique, removing the duplicate content issue.
If I were you, I would then use a rel=canonical tag solution, so that every page (and new page) has a canonical tag on it.
Among other things, this will tell Google that your site is the originator of the content, and that any other versions of it, on your site or across the web, are there purely for user experience and therefore should not rank over the original.
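To make that concrete, here's a minimal sketch of a self-referencing canonical tag (the example.com URL is a placeholder for your client's actual page URL):

```html
<!-- In the <head> of every page, pointing at that page's own preferred URL -->
<link rel="canonical" href="https://www.example.com/blog/original-article/"/>
```

A side benefit: if a scraper copies the page's HTML wholesale, the tag often comes with it, pointing back at your original.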
As you will be publishing the content first, it should also be crawled first by the search engines. To help ensure that it is, I would share your pages on social media when they go live, as that tends to get pages indexed much more quickly.
This way, the site scraping your content should (in theory) not be able to rank for the content - or at the very least will be seen by Google as the copier of the content, while you will be seen as the originator, due to being indexed first with the canonical tag.
You can read more on canonicals with this handy Moz guide.
Hope this helps.