Duplicate Content & www.3quarksdaily.com, why no penalty?
-
Does anyone have a theory as to why this site does not get hit with a DC penalty?
The site is great and the information is good, but I just cannot understand why it does not get hit with a duplicate content penalty, as all of its articles are posted elsewhere.
Any theories would be greatly appreciated!
-
Thank you for taking the time to respond, and with such a well-thought-out answer.
I suppose the original author would not be so bothered about 3 Quarks Daily, as at least they link to the original site and ask readers to visit it for the full article, which is obviously more than The New Dawn Liberia site does.
Do you feel that creating a site like 3 Quarks Daily, as a reader's resource of the best articles on a specific topic from across the web, is a legitimate way to build a website (for personal pleasure, not profit)? And what are your thoughts on the copyright issues?
How would you feel if others re-posted your content in this way?
It is interesting that Google does not penalize duplicate-content websites, and in this specific example it is surprising that those re-posting others' content can rank higher than the original.
(sorry for asking so many questions)
-
Hi Kevin,
Before getting into your question, it is worth clarifying that duplicate content is not, in itself, a cause of a penalty. We talk about it in "penalization" terms because Google tends to filter pages with duplicate content when they are on the same site, and because duplicate content wastes the so-called crawl budget. When it comes to content duplicated across several sites, though, there is no hard rule, even though the Scraper update was meant to bring some order to this kind of situation.
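As a rough illustration of the same-site case, here is a minimal Python sketch (the URLs are hypothetical and the text normalization is deliberately crude) that flags pages on one site whose main text is identical, i.e. the kind of duplication Google tends to filter rather than penalize:

```python
import hashlib
import re
from urllib.request import urlopen

# Hypothetical URLs from a single site; in practice these would come
# from a crawl or the XML sitemap.
urls = [
    "https://www.example.com/article-a",
    "https://www.example.com/article-a?utm_source=newsletter",
    "https://www.example.com/article-b",
]

def fingerprint(html: str) -> str:
    """Strip tags and collapse whitespace, then hash the remaining text."""
    text = re.sub(r"<[^>]+>", " ", html)          # crude tag removal
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

seen = {}  # fingerprint -> first URL seen with that content
for url in urls:
    page = urlopen(url).read().decode("utf-8", errors="ignore")
    digest = fingerprint(page)
    if digest in seen:
        print(f"Duplicate of {seen[digest]}: {url}")
    else:
        seen[digest] = url
```

This is not how Google evaluates duplication internally; it is just a quick way to see how much of a site collapses to the same content.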
In the case of 3quarksdaily.com, you should note that:
- it is a clearly stated content curation website (see http://www.3quarksdaily.com/3quarksdaily/aboutus.html )
- it references the original source correctly, with an attribution link in the author name
The same could be said about the http://www.thenewdawnliberia.com site, an online newspaper, which also published the same article here.
Personally, I don't think this kind of content syndication should be penalized.
But the most important thing to notice is that it is the original source that doesn't rank first (it is 4th) for that same query! If I were its SEO, I would start investigating why.
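If you want to see how a given syndicating page credits the original, a quick sketch along these lines (URLs are placeholders, standard library only) checks for an attribution link back to the source and for a cross-domain rel=canonical, which is one way a syndicating site can hand ranking signals back to the original:

```python
import re
from urllib.request import urlopen

# Placeholder URLs: the syndicated copy and the original article.
syndicated_url = "https://curator.example.com/reposted-article"
original_url = "https://original.example.com/article"

html = urlopen(syndicated_url).read().decode("utf-8", errors="ignore")

# 1. Does the copy link back to the original at all (attribution link)?
has_attribution = original_url in html

# 2. Does it declare a cross-domain canonical pointing at the original?
#    (Naive pattern: assumes rel appears before href in the link tag.)
match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    html,
    re.IGNORECASE,
)
has_canonical = bool(match) and match.group(1).rstrip("/") == original_url.rstrip("/")

print(f"Attribution link to original: {has_attribution}")
print(f"Cross-domain canonical to original: {has_canonical}")
```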
Related Questions
-
Do we have any risk or penalty for double canonicals?
Hi all, We have double canonicals: from page A to page B to page C. Will this be okay for Google, or do we definitely need to make it A to C and B to C? Thanks
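As an illustration, the sketch below (hypothetical URLs) follows a rel=canonical chain so you can see where A to B to C ultimately resolves; pointing every page straight at C removes any ambiguity about which hops Google will honor:

```python
import re
from urllib.request import urlopen

def get_canonical(url):
    """Return the rel=canonical target declared by a page, or None."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

def follow_chain(url, max_hops=5):
    """Follow canonical hops until they stop, loop, or hit max_hops."""
    chain = [url]
    while len(chain) <= max_hops:
        target = get_canonical(chain[-1])
        if target is None or target in chain:
            break
        chain.append(target)
    return chain

# Hypothetical pages: A canonicalises to B, B canonicalises to C.
print(follow_chain("https://www.example.com/page-a"))
# e.g. ['https://www.example.com/page-a',
#       'https://www.example.com/page-b',
#       'https://www.example.com/page-c']
```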
Algorithm Updates | vtmoz
-
How much content is it safe to change?
I have read that it is unsafe to change more than 20% of your site’s content in any update. The rationale is that "Changing too much at once can flag your site within the Google algorithm as having something suspicious going on." Is this true? Has anyone had any direct experience of this, or of anything similar?
Algorithm Updates | GrouchyKids
-
Google Latest Algorithmic Change about Https & Mobile Friendliness
How effective has the latest algorithmic change Google made around mobile friendliness and HTTPS (with a valid SSL certificate) proved for anyone? I see a good change in the ecommerce category for sites used for online shopping. Let me know if anyone observes a major difference.
Algorithm Updates | mozexpone
-
Can I only submit a reconsideration request if I have a penalty?
Hey guys, One of the sites I'm looking after took a hit to its rankings (particularly for one keyword that went from 6/7 to 50+) post-Penguin in May. After cleaning up the link profile somewhat, we started to see some slow and steady progress in positions, and the keyword that dropped to 50+ was moving back up to better than 20. However, a couple of weeks back, the keyword in question took another slide towards 35-40. I therefore wondered whether it would be best to submit a reconsideration request, even though the site did not receive a manual penalty. The website has a DA of 40, which more than matches a lot of the competitor websites ranking on the first page for the keyword in question. At this stage, four-and-a-half months after Penguin, I would have expected the site to have returned to its original ranking, but it hasn't, so a reconsideration request seemed logical. That said, when I came to go through the process in Webmaster Tools, I was unable to find the option! Has it now been removed for sites that don't have manual penalties?
Algorithm Updates | Webrevolve
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, but after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, but with little success. I have read up on this and not many people appear to agree on whether this will work.

Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, to remove all legacy and all association with the spam-ridden .com. My main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first ask. This could then cause duplicate content, given that the content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow meta tag, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content.

So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience of removing a 301 redirect to detach legacy, and how successful that was, would be very helpful! Thank you, Denver
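Purely as a sketch of how you might verify that plan (placeholder URLs, standard library only), the snippet below reports each old .com URL's HTTP status and whether it serves a noindex signal. One detail worth double-checking: a robots.txt Disallow stops crawlers from fetching a page at all, so a meta noindex on a blocked page would never be seen; the removal request in Webmaster Tools is usually the quicker route.

```python
import re
from urllib.request import urlopen
from urllib.error import HTTPError

# Placeholder URLs on the old, spam-affected .com domain.
old_urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post",
]

for url in old_urls:
    try:
        resp = urlopen(url)
        status = resp.status
        body = resp.read().decode("utf-8", errors="ignore")
        robots_header = (resp.headers.get("X-Robots-Tag") or "").lower()
    except HTTPError as err:
        print(f"{url}: HTTP {err.code} (already gone or blocked)")
        continue

    header_noindex = "noindex" in robots_header
    meta_noindex = bool(
        re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', body, re.I)
    )
    print(f"{url}: HTTP {status}, noindex header={header_noindex}, meta noindex={meta_noindex}")
```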
Algorithm Updates | ProdoDigital
-
Do links count in syndicated content?
If I write a press release that goes viral and is syndicated all over, does each of those links to my site in the syndicated copies of the press release count and pass PageRank with Google? Or does Google only count the link in the original press release? I heard that Google counts all the links for a time, then eventually counts only the one link from the original content, discounting all the other links as duplicate content. Any truth to this? Thanks mozzers! Ron10
Algorithm Updates | Ron10
-
Moving content in to tabs
Hi, I'm kind of an SEO noobie, so please bear with me 🙂 On one of the sites I'm working on, I got a request to move large blocks of content, currently just placed on the page, into tabs. This makes sense: we tried it, and it makes navigating through the information much easier for visitors. My question is: will Google consider this as hiding information? It's not loaded dynamically; it's all there when the page is loaded, in the source, but not displayed until the visitor clicks the tab. Will this cause SEO issues? Thank you!
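Since the key point is that the tabbed content is in the initial HTML rather than injected on demand, a quick check like the one below (placeholder URL and phrase) confirms that text from a hidden tab is already present in the raw source a crawler receives:

```python
from urllib.request import urlopen

# Placeholder values: the page with tabs, and a phrase that only appears
# inside one of the non-default (hidden) tabs.
page_url = "https://www.example.com/product-page"
hidden_tab_phrase = "full technical specifications"

raw_html = urlopen(page_url).read().decode("utf-8", errors="ignore")

if hidden_tab_phrase.lower() in raw_html.lower():
    print("Phrase is in the initial HTML: the tab content is served with "
          "the page and only hidden visually.")
else:
    print("Phrase is NOT in the initial HTML: the tab content is probably "
          "injected client-side and may be treated differently.")
```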
Algorithm Updates | eladlachmi
-
Is a slash just as good as buying a country specific domain? .com/de vs .de
I guess this question comes in a few parts:
1. Would Google read a 2-letter country code that comes after the domain name (after the slash) and recognize it as a location (targeting that country)? Or does it just read it as it would a word? e.g. www.marketing.com/de for a microsite for the Germans, www.marketing.com/fr for a microsite for the French. Or would it read the de and fr as words (not locations) in the URL? In which case, would it have worse SEO (as people would tend to search "marketing france", not "marketing fr")?
2. Which is better for SEO and rankings: separate country-specific domains (www.marketing.de and www.marketing.fr), or the use of subfolders in the URL (www.marketing.com/de and www.marketing.com/fr)?
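Whichever structure you choose, hreflang annotations are the standard way to tell Google explicitly which language or country each variant targets, so a /de or /fr subfolder does not have to rely on the path segment alone. A small sketch (the URLs are simply the ones from the question, plus an assumed x-default home page) that prints the link tags each variant's head section would carry:

```python
# URL variants from the question (subfolder approach), plus an assumed
# x-default entry for users who match neither language.
variants = {
    "x-default": "https://www.marketing.com/",
    "de-DE": "https://www.marketing.com/de/",
    "fr-FR": "https://www.marketing.com/fr/",
}

# Every variant's <head> should carry the full set of annotations,
# including a self-referencing one.
for hreflang, href in variants.items():
    print(f'<link rel="alternate" hreflang="{hreflang}" href="{href}" />')
```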
Algorithm Updates | richardstrange