Duplicate Content
-
Let's say a blog (call it blog A) publishes original content. Now say a second blog steals that content via bot and publishes it as its own, and further assume the original blog doesn't notice this for several years.
How much damage could this do to blog A in Google results? Any opinions?
-
Removing any duplicate text is essential, as it could negatively affect your business's organic SEO. Do you have any duplicated text on the site?
-
Thanks for the response Peter re: the original post.
We're convinced at this point that the issue isn't a technical one. We're not sure, however, whether the problem is the duplicate-content site we found stealing some of our articles or, as you mentioned, a quality issue. We're approaching it as a need to regroup around quality for now and monitor results over time. We've identified several areas for improvement in that regard.
This stuff is so frustrating to be honest. I get why Google can't show their cards, but the complete lack of transparency or ability to get some feedback from them makes this a difficult game.
Thanks again for the response, much appreciated.
-
CYNOT: I saw the original question via email (I'll avoid details in the public answer), and unfortunately I'm not seeing any clear signs of technical issues with the original content. This looks more like an aggressive filter than a penalty, but it's really hard to tell if the filter is a sign of quality issues or if Google is treating the wrong site as a duplicate.
-
Unfortunately, a lot of it does depend on the relative authority of the sites. People scrape Moz posts all the time (including some bots, which do it almost immediately), and those copies rank, but the scrapers don't have nearly our link profile or other ranking signals, so we don't worry about it. For a smaller site with a relatively new or weak link profile, though, it is possible for a stronger site to outrank you on your own content.
Google does try to look at cache dates and other signals, but a better-funded site can often get indexed more quickly as well. It's rare for this to do serious damage, but it can happen. As Balachandar said, at that point you may have to resort to DMCA take-down requests and other legal actions. Ultimately, that becomes a cost/benefit trade-off, as legal action is going to take time and money.
There are no technical tricks (markup, etc.) to tell Google that a page is the original source, although there are certainly tactics, like maintaining good XML sitemaps, that can help Google find your new content more quickly. Of course, you also want to be the site that has the stronger link profile, regardless of whether or not someone is copying you.
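As a practical aside, an XML sitemap with an accurate lastmod date is one of the simplest ways to help Google discover new posts quickly, before scrapers do. A minimal single-entry sketch (the domain, path, and date are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per post; example.com is a placeholder -->
  <url>
    <loc>https://example.com/blog/my-new-post/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```

Submitting the sitemap in Search Console and updating it the moment a post goes live gives Google the earliest possible discovery signal for the original.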
-
It can affect your rankings if a second blog steals your content. If the blog that stole your content has a higher DA, your content may be treated as the duplicate. Google's algorithms try to solve this by analyzing which site published the content first (date analysis). A traffic drop can be one indication that a page has been duplicated by another blog. If you have a big website with many blog posts, a DMCA takedown service can take care of all of this for you. If you have any questions, feel free to ask.
Related Questions
-
Possible duplicate content issues on same page with urls to multiple tabs?
Hello everyone! I'm here for the first time, and glad to be part of the Moz community! Jumping right into my question: for one type of page on our website, there are multiple tabs on each page. To give an example, say a page holds information about a place called "Ladakh". The various URLs the page is accessible from can take the form of:
mywanderlust.in/place/ladakh/
mywanderlust.in/place/ladakh/photos/
mywanderlust.in/place/ladakh/places-to-visit/
and so on. To keep the UX smooth when the user switches from one tab to another, we load everything in advance with AJAX, but it remains hidden until the user switches to the required tab. Since the content is actually there in the HTML, does Google count it as duplicate content? I'm afraid this might be the case: when I Google for text that's visible on only one of the tabs, I still see all the tabs in the results. I also see internal links in GSC to a page (mywanderlust.in/questions) that is only supposed to be linked from one tab, but GSC reports internal links to it from all three tab URLs. Moz Pro crawl reports have also flagged duplicate content issues, although surprisingly only on a small fraction of our indexable pages. Is this hurting our SEO? Any suggestions on how we could structure the URLs better to make them optimal for indexing? FWIW, we're using a fully responsive design, and the displayed content is exactly the same on desktop and mobile. Thanks a ton in advance!
Intermediate & Advanced SEO
-
Supplier Videos & Duplicate Content
Hi, We have some supplier videos that product management want to include on product pages. I'm wondering how detrimental this is for SEO and the best way to approach it. Do we simply embed the supplier's YouTube videos, or do we upload them to our own YouTube channel, referencing the original content, and then embed our own videos? Thank you!
-
Base copy on 1 page, then adding a bit more for another page - potential duplicate content. What to do?
Hi all, We're creating a section for a client based on road trips: for example, New York to Toronto. We have a 3-day trip, a 5-day trip, a 7-day trip, and a 10-day trip. The 3-day trip is the base; for the 5-day trip we add another couple of stops, for the 7-day trip a couple more, and the 10-day trip might have two or three times the number of stops of the initial 3-day trip. However, the base content is similar: you start in New York, you finish in Toronto, and you likely go through Niagara on all trips. It's not exact duplicate content, but it is similar content. The options we're considering are:
1) Use canonical tags pointing the 3-, 5-, and 7-day trips to the 10-day trip.
2) Since it's not exactly duplicate content, just go with the content as it is.
We don't want to get hit by any penalty for duplicate content, so we want to work out what you guys think is the best way to go about this. Thanks in advance!
-
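Option 1 in the road-trip question above (a canonical tag) is a single line in the head of each shorter-trip page. A minimal sketch, with example.com and the path as hypothetical placeholders:

```html
<!-- In the <head> of the 3-, 5-, and 7-day trip pages -->
<link rel="canonical" href="https://example.com/road-trips/new-york-to-toronto-10-day/" />
```

One caveat: canonicalizing to the 10-day page tells Google to consolidate signals there and generally drop the shorter-trip pages from results, so it only makes sense if ranking the 10-day page alone is acceptable.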
Product Page on Eccomerce Site ranking very poorly - Unique Product description but duplicate content on other tabs.
Hi All, I have a query regarding the product pages on my eCommerce site. I have unique product descriptions, but some of the other page content in the other tabs (Hire Terms, Delivery, About the Hire Company) is duplicated across ALL my products. Is that okay, or how should I deal with it? See an example URL for one of my products: http://goo.gl/aSFPqP My products currently rank very badly (200+), so any advice would be greatly appreciated. Thanks Peter
-
Robots.txt & Duplicate Content
In reviewing my crawl results I have 5,666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example http://www.careerbags.com/catalogsearch/result/index/?q=Mobile I have 10,074 of those links indexed according to my Moz crawl. Of those, 5,349 are tagged as duplicate content; another 4,725 are not. Here are some additional sample links:
http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
All of these links are just different ways of searching through our product catalog. My question is: should we disallow catalogsearch via the robots file? Are these links doing more harm than good?
-
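For the CatalogSearch question above, a robots.txt rule blocking the internal-search path would look roughly like this (a sketch, assuming all such URLs live under /catalogsearch/):

```txt
# Block crawling of internal search result pages
User-agent: *
Disallow: /catalogsearch/
```

Note that robots.txt stops crawling, not indexing: URLs Google already knows about can linger in the index, so a noindex robots meta tag on those pages (applied before blocking crawling) is often used to clear them out first.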
Duplicate Content on Wordpress b/c of Pagination
On my recent crawl, there were a great many duplicate content issues flagged. The site is http://dailyfantasybaseball.org. The issue: there's only one post per page, so because of WordPress's (or Genesis's) pagination, a page gets created for every post, leaving basically every piece of content I write flagged as a duplicate. I feel like the engines should be smart enough to figure out what's going on, but if not, I will get hammered. What should I do moving forward? Thanks!
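One common pattern for the pagination problem above, sketched here as an assumption rather than a confirmed fix for Genesis, is to leave single-post pages indexable and mark only the paginated archive pages with a robots meta tag:

```html
<!-- Output on paginated archive pages (/page/2/, /page/3/, ...) only -->
<meta name="robots" content="noindex,follow" />
```

The follow directive keeps link equity flowing through the archives while removing the near-duplicate listing pages from the index.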
-
What is the best way to allow content to be used on other sites for syndication without taking the chance of duplicate content filters
Cookstr appears to be syndicating content to shape.com and mensfitness.com: a) they integrate their data into partner sites, skinned with the partner's look, with an attribution link back to their site; b) they link the image back to the image hosted on Cookstr; c) the partner page doesn't have microformats or as much data as their own page, so their own page has better SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance!
Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta
Their syndicated content pages:
http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
-
Duplicate Content On A Subdomain
Hi, We have a client who is currently close to completing a site aimed specifically at the UK market (they're doing this in-house, so we've had no say in how it will work). The site will be almost a duplicate (in terms of content, targeted keywords, etc.) of a section of the main site that sits on the root domain; the main site is targeted toward the US. The only differences will be certain spellings and the currency. If this new UK site sits on a subdomain of the main site, which is a .com, will this cause duplicate content issues? I know there wouldn't be an issue if the new site were on a separate .co.uk domain (according to Matt Cutts), but it looks like the client wants it on a subdomain. Any help/advice would be greatly appreciated.
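The standard mechanism for a same-language, two-region setup like this is hreflang annotation, which tells Google the US and UK versions are regional alternates rather than duplicates. A hedged sketch, with example.com and the paths as hypothetical placeholders:

```html
<!-- Placed on both the US page and the matching UK page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://uk.example.com/widgets/" />
```

The annotations must be reciprocal (each page lists both itself and its alternate), and they work the same whether the UK version lives on a subdomain or a subdirectory.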