What is the best way to resolve a duplicate content issue?
-
Hi
I have a client whose site content has been scraped and reused on numerous other sites. This is detrimental to our rankings: one term we wish to rank for is nowhere to be seen.
My question is this: what's the quickest way to resolve a duplicate content issue when other sites have stolen your content?
I understand that perhaps I should first contact these site owners and 'appeal to their better nature', but that will take time and they may not even comply.
I've also considered rewriting our content. Again, this takes time.
Has anybody experienced this issue before? If so, how did you resolve it?
Thanks in advance.
-
No worries Alex
I mean, contacting the webmasters would technically be simpler, but the chances that you'll get a response, never mind a take-down of your content, are pretty slim. Hence my suggestion to rewrite.
It's a pain in the arse and requires you to do more work because of someone else's laziness, which of course isn't right. But hopefully, with the fresh content and the tags in place, you'll be given full credit.
In addition, if any of the content comes in the form of blog posts, or if you'd like to do this site-wide, implementing a rel=author tag and verifying Google authorship would again signal to Google that your content is the original. Here are a couple of handy guides to help with the markup:
http://searchengineland.com/the-definitive-guide-to-google-authorship-markup-123218
http://www.vervesearch.com/blog/seo/how-to-implement-the-relauthor-tag-a-step-by-step-guide/
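For reference, the authorship markup itself is just a link back to the author's Google+ profile, either as a link element in the page's head or as a byline link in the content (the profile URL below is a placeholder - swap in the real one):

```html
<!-- Option 1: in the <head> of each post, pointing to the
     author's Google+ profile (placeholder URL). -->
<link rel="author" href="https://plus.google.com/your-profile-id"/>

<!-- Option 2: an in-content byline link using ?rel=author. -->
<a href="https://plus.google.com/your-profile-id?rel=author">Alex</a>
```

Either form works, but the guides above cover the verification step on the Google+ side, which you need for the markup to actually take effect.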
-
Hi Tom
That's a great help.
I just wanted to make sure there wasn't a simpler solution besides rewriting the content. I guess that is the easiest route, and it will ensure the canonical tag solution is implemented too.
Thanks.
-
Hi Alex
I think the best solution here and the one that you can control the most is to rewrite the content and then ensure that your new content is seen as the originator.
Rewriting the content will take time, but obviously ensures that the content is unique, removing the duplicate content issue.
If I were you, I would then use a rel=canonical tag solution, so that every page (and new page) has a canonical tag on it.
Among other things, this will tell Google that your site is the originator of this content, and that any other versions of it on your site or across the web are being used purely for user experience and therefore should not be ranked over the original.
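As a quick sketch, the self-referencing canonical is a single link element in each page's head (example.com is a placeholder for your domain):

```html
<!-- On https://www.example.com/blue-widgets/ itself - a
     self-referencing canonical telling search engines that
     this URL is the original version of the content. -->
<link rel="canonical" href="https://www.example.com/blue-widgets/"/>
```

Most CMSs can add this automatically across the site, so every existing and new page gets its own canonical without manual edits.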
As you will be publishing the content first, it should also be crawled first by the search engines. To help ensure that it is, I would share your pages on social media when they go live, as this helps get them indexed much quicker.
This way, the site scraping your content should (in theory) not be able to rank for the content - or at the very least will be seen by Google as the copier of the content, while you will be seen as the originator, due to being indexed first with the canonical tag.
You can read more on canonicals with this handy Moz guide.
Hope this helps.