Copying Content With Permission
-
Hi, we received an email from a guy who wants to copy and paste our content onto his website. He says he will keep all the links we put there and give us full credit for it. So besides keeping all the links on the page, what is the best way for him to give us credit? A link to the original article? A special meta tag? What?
Thank you
P.S. Our site is much more authoritative than his and we get indexed within 10 minutes of publishing a page, so I'm not worried about him outranking us with our own content.
-
Very controversial...duplicate content...
-
Syndication Source and Original Source are both generally used for the Google News algorithm at this point. For the main SERPs you would use a cross-domain rel="canonical". The problem with all of these is that they require the re-publisher to edit their HTML head on a per-article basis. That is not technologically scalable for many sites, so it could kill the deal. If they are willing to give you the rel canonical tag pointing to your domain, that is best (especially if the story includes links to your site). Otherwise, getting your site indexed first and making sure their links back to your site in the copy are followed should do the trick.
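For reference, the cross-domain canonical on the re-publisher's copy would look something like this (the URL is just a placeholder for your original article):
<!-- Goes in the <head> of the re-publisher's copy, pointing back to the original article (placeholder URL) -->
<link rel="canonical" href="http://www.yoursite.com/original-article/" />
Google treats it as a strong hint to consolidate indexing and ranking signals to the URL it points to.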
Don't let them publish every single story you write though. You want readers to have a reason to come subscribe to your site if they read something on the other site.
-
Thanks Matt, that's great stuff! I always keep track of what gets indexed. And yes, choosing who to share the content with is definitely very important. I would not want a content farm related to our site in any way, especially now.
-
Hi Andres,
As long as you're getting direct followed links back to your original article, then that should be enough. A couple of other things though:
- Even though you're confident you'll be indexed before the other site, I'd still implement some embargo time on when they can publish on their site as a fallback.
- Take a look at the site itself that will be linking to you... is it something you a) want your content associated with, and b) want your link profile associated with?
Some resources you may be interested in:
[1] http://www.seomoz.org/blog/whiteboard-friday-content-technology-licensing
[2] http://googlewebmastercentral.blogspot.com/2006/12/deftly-dealing-with-duplicate-content.html (deals with syndication)
[3] http://www.mattcutts.com/blog/duplicate-content-question/
-
If this happens often you should consider using http://www.tynt.com/ and modify your attribution settings to suit your needs.
-
I have not tested the "syndication-source" or "original-source" tags personally, but I have seen a very good example of syndication credit in use at http://www.privatecloud.com
Almost 95% of the content on this website is a word-for-word duplicate of original articles located on third-party websites. I have been tracking this site for almost 6 months now and have seen several instances of duplicate pages (with credit to the original article) indexed and ranking in Google SERPs.
Based on this example, I would agree that your technique should work fine.
-
Hi Sameer, I am not sure about using a canonical tag since it's not our site and there may be more content on his pages than just ours. He asked permission just to copy and paste, so yes, it's a dupe, and we want it indexed for the backlinks. This is my idea:
http://googlenewsblog.blogspot.com/2010/11/credit-where-credit-is-due.html
syndication-source indicates the preferred URL for a syndicated article. If two versions of an article are exactly the same, or only very slightly modified, we're asking publishers to use syndication-source to point us to the one they would like Google News to use. For example, if Publisher X syndicates stories to Publisher Y, both should put the following metatag on those articles:
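I guess the metatag they mean would be something like this (the URL is just a placeholder, not our actual article):
<!-- Placed in the <head> of both the original and the syndicated copy, per the quote above (placeholder URL) -->
<meta name="syndication-source" content="http://www.yoursite.com/original-article/" />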
let me know what you think.
-
Hey Andrés,
As a general rule, content is considered duplicate only if it is more than a 35-40% copy of the original. If the person wants to copy your content word for word, then here are a few ways you can avoid a duplicate content penalty:
1. Rel canonical - Add a rel canonical tag to the head section of the non-canonical page. This tells Google which version should be indexed (your webpages, in this case).
2. Reduce duplication - Ask the person to modify the content and rewrite it in their own words. DupeCop is a good tool that lets you compare two pieces of content and measure the duplication percentage. (Don't use respun content; always rewrite in your own words.)
3. NoIndex meta robots tag - If they are not willing to change the page content, you can ask them to keep those pages out of Google's index by adding a noindex meta tag.
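For option 3, the meta robots tag would be something like this (a generic example, not taken from any particular page):
<!-- Goes in the <head> of the re-published (duplicate) page -->
<!-- noindex keeps the copy out of Google's index; follow still lets its links back to you pass value -->
<meta name="robots" content="noindex, follow" />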
Best
Sameer
-
So the best way to get credit for the article is just the links? Is there any special tag, something like meta name=syndication-source, or is that not needed?
And yes, you are right, it's manual syndication and he will keep all the links.
Thank you, Gianluca
-
Hi...
what you describe is essentially a form of syndication of your content. A manual one, but still syndication.
I believe that when the guy says he will give you full credit for the content, he means an optimized, full link to the original article.
If so, I would say yes to that guy. If not, ask him to do it.
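Something like this at the top or bottom of his copy would count as full credit (URL and anchor text are just placeholders):
<!-- A followed attribution link on the re-published article, pointing back to the original -->
<p>Originally published at <a href="http://www.yoursite.com/original-article/">Your Site</a></p>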