Duplicate content or not? Using abstracts from external sources you link to
-
I was wondering whether a page (a blog post, for example) that offers links to external web pages, along with abstracts from those pages, would be considered a duplicate-content page and therefore penalized by Google.
For example, I have a page with very little original content (just two or three sentences that summarize or frame the topic) followed by five references to different external sources. Each reference contains a title, which is a link, and a short abstract, which is basically the first few sentences copied from the page it links to.
So, except for a few sentences at the beginning, everything is copied from other pages.
Such a page would be very helpful for people interested in the topic, because the sources it links to have been analyzed beforehand, handpicked, and placed there to enhance the user experience.
But will this format be considered duplicate or near-duplicate content?
-
Are you going to get some sort of penalty for it? No. Duplicate content doesn't work that way unless you're just a low-quality or scraper site. Are you going to rank for a lot of keywords in the quoted text? No, probably not.
If there's value in your curation, you could in theory rank for the theme or topic that you're covering with the external quotations. This is especially true if you're pulling together hard-to-find or obscure quotations, or combining them in an interesting or unique way.
Providing unique content is generally a good way to go in organic search, but there are plenty of aggregation sites succeeding. This was all MetaCritic had before it filled up with user reviews, and it was insanely useful. Don't let anyone tell you that content will get you penalized just because it can be found elsewhere. Do cite your sources, and think about allowing user comments. If you provide something uniquely valuable to the user, there are ways to make even pure duplicate content work in search.
-
Romanbond,
This is thin-content/Panda kind of stuff. If your users find it valuable and outside sources link to your abstract pages, it could pass muster. It's likely, though, that those pages will not build up the authority they need to either rank well themselves or pass along link equity to the pages they link to.
-
Hmmm, I would say borderline. If this were the mainstay of posts to a site, I would be worried. However, if you have lots of other content published on a regular basis that is content-rich and engaging, then I would be less worried.
If the main goal here really is for users rather than SERPs, why not set the page to noindex, follow? (There's no "dofollow" robots directive; following links is the default, and "follow" just makes it explicit.)
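For reference, the "keep it out of the index but let its links count" suggestion above corresponds to a standard robots meta tag in the page's head. A minimal sketch (note that "follow" is the default behavior, so stating it only documents intent):

```html
<head>
  <!-- Exclude this page from search results, but let crawlers
       discover and follow the outbound links it contains -->
  <meta name="robots" content="noindex, follow">
</head>
```

The same directive can also be sent as an `X-Robots-Tag: noindex, follow` HTTP header if you can't edit the page template.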
Couldn't you twist this a little, though? Have a unique intro at the start of the article, then a paragraph of your own thoughts on each topic (adding value and provoking thought), then a link to the source after that. It's what I do on some of my sites, and it works well!
-
It would probably be considered duplicate content. The page would be useful for people who stumble upon your site, but why would Google want to rank that page over the actual sources themselves? So your best bet is to add plenty of your own content to that page, or to rank the rest of your site and link to this useful resource without expecting it to rank on its own.