Duplicate Content & www.3quarksdaily.com, why no penalty?
-
Does anyone have a theory as to why this site does not get hit with a DC penalty?
The site is great and the information is good, but I just cannot understand why this site does not get hit with a duplicate content penalty, since all of its articles are posted elsewhere.
Any theories would be greatly appreciated!
-
Thank you for taking the time to respond, and with such a well-thought-out answer.
I suppose the original author would not be too bothered about 3 Quarks Daily, since it at least links to the original site and asks readers to visit it for the full article, which is obviously more than The New Dawn Liberia site does.
Do you feel that creating a site like 3 Quarks Daily, as a reader's resource of the best articles on a specific topic from across the web, is a legitimate way to build a website (for personal pleasure, not profit)? And what are your thoughts on the copyright issues?
How would you feel if others re-posted your content in this way?
It is interesting that Google does not penalize duplicate-content websites, and in this specific example it is surprising that those re-posting others' content can rank higher than the original.
(Sorry for asking so many questions.)
-
Hi Kevin,
Before getting into your question, it is worth clarifying that duplicate content is not, by itself, a cause of a penalty. We talk about it in "penalization" terms because Google tends to filter out pages with duplicated content when they are on the same site, and because duplicated content wastes the so-called crawl budget. When it comes to content duplicated across several sites, there is no fixed rule, even though the Scraper update was meant to bring some order to this kind of situation.
In the case of 3quarksdaily.com, you have to notice two things:
- it is a clearly stated content curation website (see http://www.3quarksdaily.com/3quarksdaily/aboutus.html)
- it correctly references the original source, with an attribution link in the author name
The same could be said about http://www.thenewdawnliberia.com, an online newspaper that also published the same article.
Personally, I don't think this kind of content syndication should be penalized.
But the most important thing to notice is that it is the original source that doesn't rank first (it is 4th) for that same query! If I were its SEO, I would start investigating why.
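One practical aside: if the original publisher wants Google to consolidate signals to their version, they can ask syndication partners to add a cross-domain rel=canonical on the copy pointing back at the source. Below is a minimal sketch of how you might check whether a syndicated page declares one; the URLs are hypothetical, and this is an illustration, not a claim about what 3quarksdaily.com actually does.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse

    def canonical_target(page_url):
        # Return the href of the page's rel=canonical link, if it declares one
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        link = soup.find("link", rel="canonical")
        return link["href"] if link and link.has_attr("href") else None

    # Hypothetical syndicated copy of an article
    copy_url = "http://curator.example.com/reposted-article"
    target = canonical_target(copy_url)
    if target and urlparse(target).netloc != urlparse(copy_url).netloc:
        print("Cross-domain canonical points to:", target)
    else:
        print("No cross-domain canonical; Google has to pick a version on its own.")

If the copy carries that tag, Google generally treats the original as the page to rank, which would also address the "original ranks 4th" problem noted above.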
Related Questions
-
Domain Authority Keeps Dropping & FRED
Hi Moz! I've seen a big drop in Domain Authority recently (31 > 22). I need a plan of what to sort out first. Here are the points I know we need to improve:
- Page speed
- Quality content: guides, blogs, videos
- Better UX to improve page engagement
- Backlinks: quality earned links and a stronger presence on social media
This is our site: http://www.key.co.uk/en/key/
I am the only SEO, with a small content team who only really work on adding new products to the site. Our dev team are in France, and we can be restricted by them. But I'm worried, and I need a plan of what to tackle first to help improve this. We also saw keywords drop out in March, I'm assuming after Fred. Some keywords aren't ones I would worry about, but some are; for example, http://www.key.co.uk/en/key/dollies-load-movers-door-skates ranked at position 6 for "dollies" and has now dropped out altogether. Any ideas are welcome - help 🙂
Algorithm Updates | BeckyKey2
-
Does cached duplicate content hurt SEO in Google?
If we have duplicate pages cached in Google that were indexed months back, do they still hurt the original pages? The old cached URLs can still be seen in Google when we search for them.
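One practical first check for a situation like this: see what those old duplicate URLs return now. If they still answer 200, Google has little reason to drop them; a 301 to the original or a 410 lets them fall out of the index over time. A minimal sketch, assuming hypothetical URLs:

    import requests

    # Hypothetical old duplicate URLs that still show up (cached) in Google
    old_urls = [
        "https://example.com/old-duplicate-page",
        "https://example.com/another-old-duplicate",
    ]

    for url in old_urls:
        # allow_redirects=False so we see the first response, not the final hop
        resp = requests.get(url, allow_redirects=False, timeout=10)
        # 200 = still live; 301 to the original or 410 lets it drop out over time
        print(url, resp.status_code, resp.headers.get("Location", ""))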
Algorithm Updates | vtmoz0
-
Condensing content for a website redesign
We're working on a redesign and are wondering if we should condense some of the content (as recommended by an agency), and if so, how that will affect our organic efforts. Currently a few topics have individual pages for each section, such as (1) Overview, (2) Symptoms, and (3) Treatment. For reference, the site has a similar structure to http://www.webmd.com/heart-disease/guide/heart-disease-overview-fact. Our agency has sent us mock-ups which show these topics being condensed into one page, using a script/AJAX to display only the content that is clicked on. If we were to choose this option, we would have to implement redirects, because only one page would exist instead of all three. Can anyone provide insight into whether we should keep the topic structure as is, or take the agency's advice and merge all the topic content? Note: the reason the agency is pushing for the merging option is that they say it helps with page load time. Thank you in advance for any insight!
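If the three topic pages are merged into one, each old URL will need a 301 redirect to the consolidated page, and it is worth verifying those redirects after launch. A minimal sketch of such a check, assuming hypothetical URLs modelled on the WebMD-style structure mentioned above:

    import requests

    # Hypothetical old topic URLs and the consolidated page they should 301 to
    redirects = {
        "https://example.com/heart-disease/overview": "https://example.com/heart-disease",
        "https://example.com/heart-disease/symptoms": "https://example.com/heart-disease",
        "https://example.com/heart-disease/treatment": "https://example.com/heart-disease",
    }

    for old_url, expected in redirects.items():
        # allow_redirects=False so we can inspect the first hop directly
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        ok = resp.status_code == 301 and location == expected
        print(old_url, resp.status_code, location, "OK" if ok else "CHECK")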
Algorithm Updates | ATShock1
-
Duplicate Content?
My client is a manufacturers' representative for highly technical controls. The manufacturers do not sell their products directly, relying on manufacturers' reps to sell and service them. Most, but not all, of them publish their specs on their sites, usually in PDF only. As a service to our customers, and with permission from the manufacturers, we publish the manufacturers' specs on our site in HTML with images and downloadable PDFs; this constitutes our catalogue. The pages are lengthy and technical, and are pretty much the opposite of thin content. The URLs for these technical queries rank well, so Google doesn't seem to mind. Does this constitute duplicate content, and can we be penalized for it?
Algorithm Updates | waynekolenchuk0
-
Content for the sake of the search engines
So we all know the importance of quality content for SEO: providing content for the user as opposed to the search engines. It used to be that copywriting for SEO meant treading the line between readability and keyword density, which is obviously no longer the case. So, my question is this: for a website which doesn't require a great deal of content to be successful and to fulfil the needs of the user, should we still be creating relevant content for the sake of SEO? For example, should I be creating content which is crawlable but may not actually be needed or accessed by the user, to help improve rankings? Food for thought 🙂
Algorithm Updates | underscorelive0
-
How can we start to improve Domain MozRank & MozTrust for our website?
A simple question maybe, but how and where do we start if we want to improve our Domain MozRank and MozTrust, assuming of course that by improving both we will improve our Google rankings and sales?
Algorithm Updates | ewanTHH0
-
How to retain those rankings gained from fresh content...
Something tells me I know the answer to this question already, but I'd always appreciate the advice of fellow professionals. So... fresh content is big now in Google, and I've seen some great examples of this. When launching a new product or unleashing (yes, unleashing) a new blog post, I see our content launch itself into the rankings for some fairly competitive terms. However, after 1-2 weeks these newly claimed rankings begin to fade from the limelight. So the question is: what do I need to do to retain these rankings? We're active on social media, tweeting, liking, sharing and +1ing our content, as well as working to create exciting and relevant content via external sources. So far all this seems to have done is slow the fall from grace. Perhaps this is natural. But I'd love to hear your thoughts, even if it is just "keep up the hard work".
Algorithm Updates | RobertChapman1
-
Could using the same content and just changing the keywords be seen as duplicate content?
I want to offer the same service or product in many different cities, so instead of creating new content for each city, I want to copy the content already created for the product or service in one city, change the name of the city, and create a new URL inside my website for each city. For example, let's say I sell handmade rings in the USA, but I want to target each principal city, so I want a unique URL for each one. For Miami I would have www.mydomain.com/handmade-rings-miami, and for LA the URL would be www.mydomain.com/handmade-rings-la. Can I keep the same content about the handmade rings and just change the keywords and key phrases, or will this count as duplicate content? Example content:

TITLE: Miami Handmade Rings
URL: www.mydomain.com/handmade-rings-miami
Shop now handmade rings in Miami in our online store and get a special discount on Miami purchases over $50, and also get free shipping to a Miami local address... See what our Miami handmade rings clients say about our products...

TITLE: LA Handmade Rings
URL: www.mydomain.com/handmade-rings-la
Shop now handmade rings in LA in our online store and get a special discount on LA purchases over $50, and also get free shipping to an LA local address... See what our LA handmade rings clients say about our products...

There are more than 100 locations in the country where I want to do this, which is why I want to copy, paste, and replace. Thanks in advance, David Orion
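As a quick illustration of why pages built this way tend to be treated as duplicate content: find-and-replace variants are nearly identical to one another. A minimal sketch comparing the two example snippets above with Python's difflib (the text is taken from the example copy in the question):

    from difflib import SequenceMatcher

    # The two templated page bodies from the example above
    miami = ("Shop now handmade rings in Miami in our online store and get a "
             "special discount on Miami purchases over $50, and also get free "
             "shipping to a Miami local address.")
    la = ("Shop now handmade rings in LA in our online store and get a "
          "special discount on LA purchases over $50, and also get free "
          "shipping to an LA local address.")

    # Ratio of matching characters; values near 1.0 mean near-duplicates
    similarity = SequenceMatcher(None, miami, la).ratio()
    print(f"Similarity: {similarity:.2%}")  # well above 90% for these two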
Algorithm Updates | sellonline1230