Does posting a source to the original content avoid duplicate content risk?
-
A site I work with allows registered users to post blog posts (longer articles).
Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?
Thanks!
-
I don't know what Roger says, but I believe that followed links on noindex pages will pass PageRank, anchor text, and other link benefits. Your instruction is "noindex", but the page will still be crawled.
-
Hi EGOL.
If you noindex pages and other sites link to them, do you benefit from that or not?
Do you see any PageRank on those that are old enough to show it?
What does Roger say about those?
-
I publish other people's content. That caused a Panda problem about a year ago, which I was able to recover from by noindexing those pages. Now I noindex / follow any content I publish that appears on another website.
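For reference, that noindex / follow setup is just a robots meta tag in the head of each republished page; a minimal sketch:

```html
<!-- Placed in the <head> of each page whose content also appears elsewhere. -->
<!-- "noindex" keeps the page out of search results; "follow" lets crawlers -->
<!-- still follow (and pass value through) the links on the page. -->
<meta name="robots" content="noindex, follow">
```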
The articles that I write are published on my own site only.
-
I'm concerned about what's best for my site and would therefore not post other people's content, so I've never had to deal with this.
I guess if I owned both sites, I would prefer to cross-domain canonical the duped pages to my other site. If I didn't own the other site, I would probably just opt to noindex, follow that page.
-
The last question in the text is:
Can rel="canonical" be used to suggest a canonical URL on a completely different domain?
There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
-
Egol,
The Matt Cutts video seems to say you can't canonicalize between two totally different domains. So, we couldn't use a canonical for that.
-
Canonicalling them will give the benefit to the author's original page; it does not benefit you.
If you want them to rel=canonical for you, then it is good to do it for them.
-
If you want to avoid Panda with content on your own site, then you can noindex, follow those pages.
Your visitors will be able to use them but they will not appear in the search engines.
-
Hey Egol, What is the benefit of canonicalling to them over just meta noindex,following the page?
-
So, you're not saying rel canonical to their page?
What if we just nofollow pages on our site that an author originally published on their own site? Right now we link to it as originally published on ....
I'm trying to avoid a Panda penalty for non-unique blog posts reposted on our site.
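For what it's worth, an attribution link like the one described above, with a nofollow added, might look something like this (the URL is a made-up example):

```html
<!-- Attribution link on the republished copy. rel="nofollow" asks search -->
<!-- engines not to pass link equity through this particular link. -->
<p>Originally published on
  <a href="https://example-author-blog.com/original-post/" rel="nofollow">the author's blog</a>.
</p>
```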
-
I have used rel=canonical to reduce duplicate content risk. However, more important, the rel=canonical gives credit to the page where it points.
One problem with guest posting is that to reduce duplicate content risk and transfer credit to your own site, you must have the site owner's cooperation.
Of course, you can get author credit by linking the post to your Google+ profile - if you think that has value.
-
Hi,
Thanks, Egol
So, on a page of ours where someone re-posts their blog post on our site, we'd add a canonical tag on our page to point to their original page? That would be a canonical tag between two different domains. I didn't think that was okay.
And, if we did that, we wouldn't be risking some kind of Panda duplicate content penalty?
Thanks!
-
"Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?"
No. To prevent that you need to use the rel=canonical.
See Matt Cutts video here....
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
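A cross-domain canonical is a single link element in the head of the duplicate page; a sketch with made-up URLs:

```html
<!-- On the republished copy, pointing at the author's original post on a -->
<!-- different domain. Google treats this as a hint, not a strict directive. -->
<link rel="canonical" href="https://example-author-blog.com/original-post/">
```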