Duplicate Content & www.3quarksdaily.com, why no penalty?
-
Does anyone have a theory as to why this site does not get hit with a DC penalty?
The site is great and the information is good, but I just cannot understand why this site does not get hit with a duplicate content penalty, as all of its articles are posted elsewhere.
Any theories would be greatly appreciated!
-
Thank you for taking the time to respond, and with such a well-thought-out answer.
I suppose the original author would not be so bothered about 3 Quarks Daily, as at least they link to and ask readers to visit the original site for the full article, which is obviously more than The New Dawn Liberia site does.
Do you feel that creating such a site (like 3 Quarks Daily), as a reader's resource of the best articles on a specific topic from across the web, is a legitimate way to build a website (for personal pleasure, not profit)? And what are your thoughts on the copyright issues?
How would you feel if others re-posted your content in this way?
It is interesting that Google does not penalize duplicate content websites, and in this specific example it is surprising that those re-posting others' content can rank higher.
(sorry for asking so many questions)
-
Hi Kevin,
before getting into your question, it's worth clarifying that duplicate content is not, in itself, a cause of a penalty. We talk about it in "penalization" terms because Google tends to filter out pages with duplicate content when they are on the same site, and because duplicate content wastes the so-called crawl budget. But when it comes to content duplicated across several sites, there is no hard rule, even though the Scraper update was meant to bring some order to this kind of situation.
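As a rough illustration of how "duplicate" can be quantified, here is a minimal sketch (my own example, not anything Google has published) that compares two texts by the overlap of their word 3-grams — the kind of shingle-based similarity often used to flag near-duplicate pages:

```python
def shingles(text, k=3):
    """Split text into overlapping word k-grams (shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard overlap of the two texts' shingle sets, in [0, 1]."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Google tends to filter pages with duplicate content within the same site"
copy = "Google tends to filter pages with duplicate content within the same site"
rewrite = "Search engines often drop near identical pages from their results"

print(jaccard_similarity(original, copy))     # identical text scores 1.0
print(jaccard_similarity(original, rewrite))  # a full rewrite shares no shingles
```

A word-for-word repost scores 1.0, while a genuine rewrite scores near 0 — which is roughly why rewriting syndicated copy, rather than reposting it verbatim, keeps pages out of the duplicate filter.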
In the case of 3quarksdaily.com, you have to notice:
- it is a clearly stated content-curation website (see http://www.3quarksdaily.com/3quarksdaily/aboutus.html )
- it correctly credits the original source with an attribution link on the author name
The same could be said about the http://www.thenewdawnliberia.com site, an online newspaper that also published the same article.
Personally, I don't think that this kind of content syndication should be penalized.
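For sites that syndicate deliberately, one commonly recommended safeguard is a cross-domain rel="canonical" on the republished copy, pointing at the original article, so the original gets the ranking credit. A minimal sketch using Python's standard html.parser — the page and URLs here are hypothetical, made up for illustration — of how you might check whether a syndicated page declares one:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical syndicated page pointing back at the original article.
html = """
<html><head>
<link rel="canonical" href="https://original-site.example/article-slug" />
</head><body>Syndicated copy of the article.</body></html>
"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://original-site.example/article-slug
```

If the republished copy carries no canonical (as in the cases discussed above, which rely only on an attribution link), search engines are left to pick the "original" themselves — which is exactly when the copy can end up outranking the source.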
But the most important thing to notice is that it is the original source that doesn't rank first (it is 4th) for that same query! If I were its SEO, I would start investigating why.