Does posting a link to the original source avoid duplicate content risk?
-
A site I work with allows registered users to post blog posts (longer articles).
Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?
Thanks!
-
I don't know what Roger says, but I believe that followed links on noindex pages will pass PageRank, anchor text and other link benefits. Your instructions are to "noindex" the page, but it will still be crawled.
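For anyone who hasn't used that combination, it is just a robots meta tag in the head of the page; a minimal sketch (the title is invented) would be:

  <!-- Republished guest post: kept out of the index, but its links are still crawled and followed -->
  <head>
    <title>Guest Post Title</title>
    <meta name="robots" content="noindex, follow">
  </head>

The noindex part keeps the page out of the search results, while follow leaves its outbound links eligible to pass PageRank and anchor text, which is the behavior described above.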
-
Hi EGOL.
If you noindex pages and other sites link to them, do you benefit from that or not?
Do you see any PageRank on those that are old enough to show it?
What does Roger say about those?
-
I publish other people's content. That caused a Panda problem about a year ago - which I was able to recover from by noindexing those pages. Now I noindex / follow any content that I publish that appears on another website.
The articles that I write are published on my own site only.
-
I'm concerned about what's best for my site - and would therefore not post other people's content - so I've never had to deal with this.
I guess if I owned both sites I would prefer to cross-canonical the duped pages to my other site. If I didn't own the other site, I would probably just opt to noindex, follow that page, I guess.
-
The last question in the text is...
Can rel="canonical" be used to suggest a canonical URL on a completely different domain?
There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
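As a concrete illustration of what that documentation describes (both URLs below are invented), the cross-domain hint is a single link element in the head of the duplicate page, pointing at the preferred URL on the other domain:

  <!-- On the page at the old / duplicate domain -->
  <head>
    <!-- Suggests the copy on the preferred domain as the version to index -->
    <link rel="canonical" href="http://www.example-preferred-domain.com/same-article">
  </head>
-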
Egol,
The Matt Cutts video seems to say you can't canonicalize between two totally different domains. So, we couldn't use a canonical for that.
-
Canonicalling them will give the benefit to the author's original page. It does not have benefit for you.
If you want them to rel=canonical for you then it is good to do it for them.
-
If you want to avoid panda with content on your own site then you can noindex, follow those pages.
Your visitors will be able to use them but they will not appear in the search engines.
-
Hey Egol, what is the benefit of canonicalling to them over just using meta noindex, follow on the page?
-
So, you're not saying rel canonical to their page?
What if we just no-follow pages on our site that the author originally published on their site? Right now we link to it as originally published on ....
I'm trying to avoid a Panda penalty for non-unique blog posts reposted on our site.
-
I have used rel=canonical to reduce duplicate content risk. More importantly, though, the rel=canonical gives credit to the page where it points.
One problem with guest posting is that to reduce duplicate content risk and transfer credit to your own site, you must have the site owner's cooperation.
Of course, you can get author credit by linking the post to your Google+ profile - if you think that has value.
-
Hi,
Thanks, Egol
So, on a page of ours where someone re-posts their blog post on our site, we'd add a canonical tag on our page to point to their original page? That would be a canonical tag between two different domains. I didn't think that was okay.
And, if we did that, we wouldn't be risking some kind of Panda duplicate content penalty?
Thanks!
-
"Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?"
No. To prevent that you need to use the rel=canonical.
See Matt Cutts video here....
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
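To tie the thread together, here is a hedged sketch (all URLs invented) of how one of those republished posts could be marked up: the visible attribution link is good practice for readers but not sufficient on its own, while the cross-domain rel=canonical in the head is what hands credit to the author's original page.

  <!-- Republished post on your site, e.g. http://www.your-site.example/blog/guest-post -->
  <head>
    <title>Guest Post Title</title>
    <!-- Cross-domain canonical: hints that the author's original URL is the preferred version -->
    <link rel="canonical" href="http://authors-blog.example/original-post">
  </head>
  <body>
    <!-- Visible source link for readers; by itself this does not prevent duplicate content issues -->
    <p>This article was originally published at
      <a href="http://authors-blog.example/original-post">the author's blog</a>.</p>
    <!-- Full republished article follows -->
  </body>

If pointing the canonical at another domain is not an option, the noindex, follow approach discussed earlier in the thread is the usual fallback.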
Related Questions
-
Duplicate Content Product Descriptions - Technical List Supplier Gave Us
Hello, Our supplier gives us a small paragraph and a list of technical features for our product descriptions. My concern is duplicate content. Here's what my current plan is:
1. To write as much unique content (rewriting the paragraph and adding to it) as there are words in the technical description list - half unique content, half duplicate content.
2. To reword the technical descriptions (though this is not always possible).
3. To have a custom H1, title tag and meta description.
My question is: is the list of technical specifications going to create a duplicate content issue? In other words, how much unique content has to be on the page so that the list that is the same across the internet does not hurt us? Or do we need to rewrite every technical list? Thanks.
White Hat / Black Hat SEO | BobGW
-
Competitors with duplicate sites for backlinks
Hello all, In the last few months, my company has seen some keywords we historically rank well for fall off the first page, and there are a couple competitors that have appeared that use backlinks from seemingly the same site. For fairness, our site has slow page load speeds that we are working on changing, as well as not being mobile friendly yet. The sites that are ranking are mobile friendly and load fast, but we have heaps of other words still ranking well, and I'm more curious about this methodology. For example, these two pages:
http://whiteboards.com.au/
http://www.glasswhiteboards.com.au/
In OSE, glasswhiteboards has the majority of links from whiteboards, and the content between the sites is the same. My page has higher domain authority & page authority, but less backlinks. However, if you take away the backlinks from the duplicate site, they are the same. Isn't this type of content supposed to be flagged? My question is about whether this kind of similar site on different domains is a good idea to build links, as all my research shows that it's poor in the long run, but it seems to be working with these guys. Another group of sites that has been killing us uses this same method, with multiple sites that look the same that all link to each other to build up backlinks. These sites do have different content. It seems instead of building different categories within their own site, they have purchased multiple domains that act as their categories. Here's just a few:
http://www.lockablenoticeboards.com.au/
http://www.snapperframes.com/
http://www.snapperdisplay.com.au/
http://www.light-box.com.au/
http://www.a-frame-signs.com.au/
http://www.posterhangers.com.au/
White Hat / Black Hat SEO | JustinBSLW
-
Duplicate content for product pages
Say you have two separate pages, each featuring a different product. They have so many common features that their content is virtually duplicated when you get to the bullets that break it all down. To avoid a penalty, is it advised to paraphrase? It seems to me it would benefit the user to see it all laid out the same, apples to apples. Thanks. I've considered combining the products on one page, but will be examining the data to see if there's a lost benefit to not having separate pages. Ditto for just not indexing the one that I suspect may not have much traction (requesting data to see).
White Hat / Black Hat SEO | SSFCU
-
Guest post linking only to good content
Hello, We're thinking of doing guest posting of the following type:
1. The only link is in the body of the guest post, pointing to our most valuable article.
2. It is not a guest posting site - we approached them to help with content, and they don't advertise guest posting. They sometimes accept guest posts if the content is good.
3. It is a clean site - clean design, clean anchor text profile, etc.
We have 70 linking root domains. We want to use the above tactics to add 30 more links. Is this going to help us going into the future with Google (we're only interested in the long term)? Is 30 too many? Thanks.
White Hat / Black Hat SEO | BobGW
-
Content within a toggle, Juice or No Juice?
Greetings Mozzers, I recently added a significant amount of information within a single page, utilizing toggles to hide the content from the user; to see it, they must click to reveal it. Since technically the code reads "display:none" to start, would that be considered "Black Hat" or "Not There" to crawlers? It isn't displayed in any sort of spammy way. It is more for the UX of the visitor that toggles were utilized. Thoughts and advice are greatly appreciated!
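For anyone picturing the markup being described, a minimal sketch of that kind of toggle (the id and wording are invented) would be something like:

  <!-- The content is present in the HTML source but starts hidden; a click reveals it -->
  <h3 onclick="document.getElementById('extra-info').style.display = 'block'">
    Click to see the full details
  </h3>
  <div id="extra-info" style="display: none">
    <p>The additional information added to the page lives here.</p>
  </div>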
White Hat / Black Hat SEO | MonsterWeb28
-
Switching site content
I have been advised to take a particular path with my domain. To me it seems "black hat", but I'll ask the experts: is it acceptable, when one owns an exact match location domain (e.g. london.com), to run it as a tourist information site, gathering links from Wikipedia, the BBC, local paper/radio/sports websites etc., and then after 6-12 months switch the content to a business site? What could the penalties be? Please advise...
White Hat / Black Hat SEO | klsdnflksdnvl
-
I'm worried my client is asking me to post duplicate content, am I just being paranoid?
Hi SEOMozzers, I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more "text" content. So my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media). My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web. I'm worried Google could penalize us for posting content that already exists. I know that conventionally there are ways around this - you can tell crawlers that this content shouldn't be crawled - but in my case, we are specifically trying to produce crawl-able content. Do you think I should advise my client to hire some bloggers to produce the content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins
-
Shadow Pages for Flash Content
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. So for example, go here:
http://instoresnow.walmart.com/Kraft.aspx#/home
View the page as Googlebot and you'll see an HTML page. It is completely different than the Flash page.
1. Is this ok?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does?
3. Can I put "Pinterest" Pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning?
Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush