Duplicate Content
-
Let's say blog A is publishing original content. Now say a second blog scrapes that content with a bot and publishes it as its own. Further assume the original blog doesn't notice this for several years.
How much damage could this do to blog A in Google's results? Any opinions?
-
Removing any duplicate text is essential, as it could negatively affect your site's organic SEO. Do you have any duplicated text on your own site?
-
Thanks for the response Peter re: the original post.
We're fairly convinced at this point that the issue isn't a technical one. We're not sure, however, whether the problem is the duplicate-content site we found stealing some of our articles or, as you mentioned, a quality issue. We're approaching it as a need to regroup around quality for now and monitor results over time. We've identified several areas for improvement in that regard.
This stuff is so frustrating to be honest. I get why Google can't show their cards, but the complete lack of transparency or ability to get some feedback from them makes this a difficult game.
Thanks again for the response, much appreciated.
-
CYNOT: I saw the original question via email (I'll avoid details in the public answer), and unfortunately I'm not seeing any clear signs of technical issues with the original content. This looks more like an aggressive filter than a penalty, but it's really hard to tell if the filter is a sign of quality issues or if Google is treating the wrong site as a duplicate.
-
Unfortunately, a lot of it does depend on the relative authority of the sites. People scrape Moz posts all the time (some bots do it almost immediately), and those copies rank, but they don't have nearly our link profile or other ranking signals, so we don't worry about it. For a smaller site with a relatively new or weak link profile, though, it is possible for a stronger site to outrank you on your own content.
Google does try to look at cache dates and other signals, but a better-funded site can often get indexed more quickly as well. It's rare for this to do serious damage, but it can happen. As Balachandar said, at that point you may have to resort to DMCA take-down requests and other legal actions. Ultimately, that becomes a cost/benefit trade-off, as legal action is going to take time and money.
There are no technical tricks (markup, etc.) to tell Google that a page is the original source, although there are certainly tactics, like maintaining good XML sitemaps, that can help Google find your new content more quickly. Of course, you also want to be the site that has the stronger link profile, regardless of whether or not someone is copying you.
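To illustrate the sitemap tactic mentioned above: a minimal XML sitemap entry might look like this (a sketch only; the URL and date are placeholders, not from the thread). The lastmod value gives Google an early, dated signal for new posts:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per post; lastmod helps timestamp the original -->
  <url>
    <loc>https://www.example.com/blog/original-article/</loc>
    <lastmod>2016-03-01</lastmod>
  </url>
</urlset>
```

Submitting the sitemap in Search Console, or referencing it from robots.txt, helps Google pick up new URLs soon after publication.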
-
It can affect your rankings if the second blog steals your content. If the blog that stole your content has a higher Domain Authority (DA), your original version may be under-valued. Google's algorithms try to resolve this by analyzing which website published the content first (date analysis). A traffic drop can be one indication that a page has been duplicated by another blog. If you have a big website with many blog posts, a DMCA protection service can handle the take-down process for you. If you have any questions, feel free to ask.
-
Related Questions
-
Can faceted navigation via ajax #parameters cause duplicate content issues?
We are going to implement faceted navigation for an ecommerce site of about 1,000 products. The faceted navigation is implemented via ajax/JavaScript, which adds a large number of #parameters to the URL. The faceted pages canonicalize to the page without any parameters, and we do not want Google to index any of the faceted pages at this point. Will Google include pages with #parameters in its index? Can I somehow tell Google to ignore #parameters and not index them? Could this setup cause any SEO problems for us in terms of crawl bandwidth and/or link equity?
Intermediate & Advanced SEO | lcourse
-
Country Code Top-Level Domains & Duplicate Content
Hi, we're looking to launch in a new market. We currently have a .com.au domain which is geo-targeted to Australia, and we want to launch in New Zealand under a .co.nz domain. If I duplicate the Australian site completely on the new .co.nz domain, would I face duplicate content issues from an SEO standpoint, even though it's on a completely separate country code? Or is it advised to set up hreflang tags across both domains? Cheers.
Intermediate & Advanced SEO | jayoliverwright
-
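The hreflang setup asked about in the .com.au/.co.nz question above could be sketched like this (placeholder domains and paths; each page references both language/region variants, including itself):

```html
<!-- In the <head> of the corresponding page on BOTH sites -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/page/" />
<link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/page/" />
```

The annotations must be reciprocal: if the .co.nz page points at the .com.au page, the .com.au page must point back, or the tags are ignored.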
Duplicate Content That Isn't Duplicated
In Moz, I am receiving multiple messages saying that there is duplicate page content on my website. For example, these pages are being flagged as duplicates: https://www.ohpopsi.com/photo-wallpaper/made-to-measure/pop-art-graffiti/farm-with-barn-and-animals-wall-mural-3824 and https://www.ohpopsi.com/photo-wallpaper/made-to-measure/animals-wildlife/little-elephants-garden-seamless-pattern-wall-mural-3614. As you can see, the two pages are different products, so I can't apply a 301 redirect or canonical tag. What do you suggest?
Intermediate & Advanced SEO | e3creative
-
Pages with Duplicate Page Content (with and without www)
How can we resolve pages that serve duplicate content both with and without the www prefix? Thanks in advance.
Intermediate & Advanced SEO | directiq
-
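The usual fix for the www/non-www question above is a single server-level 301 redirect, so only one hostname is ever indexed. A sketch for an Apache .htaccess (the domain is a placeholder, and mod_rewrite must be enabled):

```apache
RewriteEngine On
# Redirect any non-www request to the www version, preserving the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Pairing this with self-referencing canonical tags on the www pages covers any stray links to the non-www host.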
Microsite as a stand-alone site under one domain and sub-domained under another: duplicate content penalty?
We developed and maintain a microsite (example: www.coolprograms.org) for a non-profit that lives outside their main domain name (www.nonprofit-mainsite.org) and features content related to a particular offering of theirs. They are utilizing a Google Grant to run AdWords campaigns related to awareness. They currently drive traffic from the AdWords campaigns to both the microsite (www.coolprograms.org) and their main site (www.nonprofit-mainsite.org). Google recently announced a change in their policy regarding what domains a Google Grant recipient can send traffic to via AdWords: https://support.google.com/nonprofits/answer/1657899?hl=en. The ads must all resolve to one root domain name (nonprofit-mainsite.org). If we were to subdomain the microsite (example: coolprograms.nonprofit-mainsite.org) and keep serving the same content via the microsite domain (www.coolprograms.org), is there a risk of being penalized for duplicate content? Are there other things we should be considering?
Intermediate & Advanced SEO | marketing-iq
-
Magento Duplicate Content Recovery
Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (with no sudden drop on any Panda/Penguin date lines). After investigating, it appears we neglected to apply "noindex, follow" to all our filter pages, and our total indexed page count rose sevenfold in a matter of weeks. We have since fixed the noindex issue, and the number of pages indexed is now below what we had before the switch to Magento. We've seen some positive results in the last week. Any ideas when/if our rankings will return? Thanks!
Intermediate & Advanced SEO | Jonnygeeuk
-
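For reference, the "noindex, follow" fix described in the Magento question above is normally a robots meta tag emitted on each filter page (a sketch; how it is set in a given Magento theme or config will vary):

```html
<!-- Keeps filter pages out of Google's index while still letting
     crawlers follow the links on them and pass link equity -->
<meta name="robots" content="noindex, follow" />
```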
Dynamic 301s causing duplicate content
Hi, wonder if anyone can help? We have just changed our site, which was hosted on IIS with URLs like example.co.uk/Default.aspx?pagename=About-Us. The new URL is example.co.uk/About-Us/, and the site now runs on Apache. The 301s our developer told us to use were in this format:

RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]

This seemed to work from a 301 point of view; however, it also seems to allow both of the URLs below to serve the same page:

example.co.uk/About-Us/?pagename=About-Us
example.co.uk/About-Us/

Webmaster Tools has now picked up on this and is flagging it as duplicate content. Can anyone help explain why it is doing this? I'm not totally clued up, and our host/developer can't understand it either. Many thanks!
Intermediate & Advanced SEO | GoGroup51
-
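A likely explanation for the question above: by default, mod_rewrite appends the original query string to the redirect target. So Default.aspx?pagename=About-Us 301s to /About-Us/?pagename=About-Us, and since Apache ignores the query string when serving that static path, both URLs render the same page. A sketch of the amended rule (using the question's own domain):

```apache
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
# QSD (query string discard, Apache 2.4+) drops ?pagename=About-Us from the
# redirect target; on older Apache, append "?" to the target URL instead.
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L,QSD]
```

A self-referencing canonical tag on /About-Us/ would also mop up any parameterised variants that are already indexed.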
Multiple cities/regions websites - duplicate content?
We're about to launch a second site for a different, neighbouring city, where we are going to set up a marketing campaign to target sales in that city (we will also have a separate office there). It will be under the same company name but a different domain name, and we're going to do our best to rewrite the text content as much as possible. We want to avoid Google seeing this as a duplicate site in any way, but what about: the business name; the toll-free number (which we would like to keep the same on both sites); the graphics/image files (which we would like to keep the same on both sites); site structure, coding styles, and other "forensic" items; anything I might not be thinking of? How are we best to proceed with this? What about cross-linking the sites?
Intermediate & Advanced SEO | webdesignbarrie