Does posting a link to the original source avoid duplicate content risk?
-
A site I work with allows registered users to post blog posts (longer articles).
Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source enough to prevent getting dinged for duplicate content?
Thanks!
-
I don't know what Roger says, but I believe that followed links on noindexed pages still pass PageRank, anchor text, and other link benefits. Your instruction is "noindex," but the page will still be crawled.
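For reference, that tag looks like this (a minimal sketch, placed in the page <head>):

<!-- "noindex" keeps the page out of the index; "follow" lets crawlers follow its links and pass value through them -->
<meta name="robots" content="noindex, follow">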
-
Hi EGOL.
If you noindex pages and other sites link to them, do you benefit from that or not?
Do you see any PageRank on those that are old enough to show it?
What does Roger say about those?
-
I publish other people's content. That caused a Panda problem about a year ago, which I was able to recover from by noindexing those pages. Now I noindex, follow any content that I publish that appears on another website.
The articles that I write are published on my own site only.
-
I'm concerned about what's best for my site, and would therefore not post other people's content, so I've never had to deal with this.
I guess if I owned both sites I would prefer to cross-canonical the duped pages to my other site. If I didn't own the other site, I would probably just opt to noindex, follow that page.
-
The last question in the text is:
Can rel="canonical" be used to suggest a canonical URL on a completely different domain?
There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
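To make that concrete, a cross-domain canonical on the republished copy would look something like this (a minimal sketch; the domain and path are placeholders):

<!-- In the <head> of the copy on the republishing site -->
<!-- Points search engines at the writer's original post as the preferred URL -->
<link rel="canonical" href="http://writers-own-blog.example.com/original-post">
-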
Egol,
The Matt Cutts video seems to say you can't canonicalize between two totally different domains. So, we couldn't use a canonical for that.
-
Canonicalling them will give the benefit to the author's original page. It does not have a benefit for you.
If you want them to rel=canonical for you, then it is good to do it for them.
-
If you want to avoid Panda with content on your own site, then you can noindex, follow those pages.
Your visitors will still be able to use them, but the pages will not appear in the search engines.
-
Hey Egol, what is the benefit of canonicalling to them over just meta noindex, following the page?
-
So, you're not saying rel=canonical to their page?
What if we just nofollow pages on our site that an author originally published on their own site? Right now we link to it as originally published on ....
I'm trying to avoid a Panda penalty for non-unique blog posts reposted on our site.
-
I have used rel=canonical to reduce duplicate content risk. More importantly, though, the rel=canonical gives credit to the page where it points.
One problem with guest posting is that to reduce duplicate content risk and transfer credit to your own site, you must have the site owner's cooperation.
Of course, you can get author credit by linking the post to your Google+ profile, if you think that has value.
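For what it's worth, that authorship link was typically added like this (a rough sketch; the profile URL is a placeholder):

<!-- In the guest post's byline, tying the article to the writer's Google+ profile -->
<a rel="author" href="https://plus.google.com/your-profile-id">About the Author</a>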
-
Hi,
Thanks, Egol
So, on a page of ours where someone re-posts their blog post on our site, we'd add a canonical tag on our page to point to their original page? That would be a canonical tag between two different domains. I didn't think that was okay.
And, if we did that, we wouldn't be risking some kind of Panda duplicate content penalty?
Thanks!
-
"Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?"
No. To prevent that you need to use the rel=canonical.
See Matt Cutts video here....
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394