Copying Content With Permission
-
Hi, we received an email from a guy who wants to copy and paste our content onto his website. He says he will keep all the links we put there and give us full credit for it. So besides keeping all the links on the page, what is the best way for him to give us credit? A link to the original article? A special meta tag? What?
Thank you
PS. Our site is much more authoritative than his and we get indexed within 10 minutes of publishing a page, so I'm not worried about him outranking us with our own content.
-
Very controversial...duplicate content...
-
Syndication Source and Original Source are both generally used for the Google News algo at this point. For the main SERPs you would use a cross-domain rel="canonical". The problem with all of these is that they require the re-publisher to edit their HTML head on a per-article basis. That is not technologically scalable for many sites, so it could kill the deal. If they are willing to give you the rel canonical tag pointing to your domain, that is best (especially if the story includes links to your site). Otherwise, getting your site indexed first and making sure their links to your site in the copy are followable should do the trick.
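For reference, the cross-domain rel="canonical" described above is just a link element in the head of the re-publisher's copy of the article, pointing back at the original URL. A minimal sketch (the domain and path below are placeholders, not real URLs):

```html
<!-- Placed in the <head> of the RE-PUBLISHER's copy of the article -->
<!-- href points to the ORIGINAL article on your domain (placeholder URL) -->
<link rel="canonical" href="http://www.example.com/original-article/" />
```

Because this has to reference a different original URL for every syndicated article, it is exactly the per-article head editing that makes the approach hard to scale.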
Don't let them publish every single story you write though. You want readers to have a reason to come subscribe to your site if they read something on the other site.
-
Thanks Matt, that's great stuff! I always keep track of what gets indexed. And yes, choosing who to share the content with is for sure very important; I would not want a content farm related to our site in any way, especially now.
-
Hi Andres,
As long as you're getting direct followed links back to your original article, then that should be enough. A couple of other things though:
- Even though you're confident you'll be indexed before the other site, I'd still impose an embargo on when they can publish on their site, as a fallback.
- Take a look at the site itself that will be linking to you... is it something you a) want your content associated with, and b) want your link profile associated with?
Some resources you may be interested in:
[1] http://www.seomoz.org/blog/whiteboard-friday-content-technology-licensing
[2] http://googlewebmastercentral.blogspot.com/2006/12/deftly-dealing-with-duplicate-content.html (deals with syndication)
[3] http://www.mattcutts.com/blog/duplicate-content-question/
-
If this happens often, you should consider using http://www.tynt.com/ and modifying your attribution settings to suit your needs.
-
I have not tested the "syndication-source" or "original-source" tags personally, but I have seen a very good case of syndication credit being used at http://www.privatecloud.com
Almost 95% of the content on this website is a word-for-word duplicate of original articles located on third-party websites. I have been tracking this site for almost 6 months now and have seen several instances of duplicate pages (with credit to the original article) indexed and ranking in Google SERPs.
Using this example I would agree that your technique should work fine.
-
Hi Sameer, I am not sure about using a canonical tag, since it's not our site and maybe there will be more content than just ours. He asked permission just to copy and paste, so yes, it's dupe content, and we want it indexed for the backlinks. This is my idea:
http://googlenewsblog.blogspot.com/2010/11/credit-where-credit-is-due.html
syndication-source indicates the preferred URL for a syndicated article. If two versions of an article are exactly the same, or only very slightly modified, we're asking publishers to use syndication-source to point us to the one they would like Google News to use. For example, if Publisher X syndicates stories to Publisher Y, both should put the following metatag on those articles:
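The actual metatag from that Google News blog post didn't survive the copy-paste above. For what it's worth, it takes roughly this shape (the URL below is a placeholder standing in for Publisher X's version of the story, which both publishers would point to):

```html
<!-- Placed on BOTH Publisher X's and Publisher Y's copy of the article -->
<!-- content is the preferred (original) URL -- placeholder shown here -->
<meta name="syndication-source" content="http://www.example.com/original-article/">
```

Note that, as mentioned above, Google has described this tag as a Google News signal rather than one for the main web SERPs.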
let me know what you think.
-
Hey Andrés,
As a general rule, content is considered duplicate only if more than 35-40% of it is copied from the original. If the person wants to copy your website word for word, then here are a few ways you can avoid a duplicate content penalty:
1. Rel canonical - Add a rel canonical tag to the head section of the non-canonical page. This will tell Google which page is the relevant one to index (your webpages in this case).
2. Reduce duplication - Ask the person to modify the content and rewrite it in their own words. DupeCop is a good tool for comparing two pieces of content and measuring the duplication percentage. (Don't use respun content; always rewrite in your own words.)
3. NoIndex meta robots tag - If they are not willing to change the page content, you can ask them to keep those pages out of the index by adding a noindex meta tag.
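A minimal sketch of option 3, assuming the re-publisher is willing to add it to the head of each copied page (the "follow" value is an extra suggestion here, not something from the list above -- it keeps the page out of the index while still letting its links be crawled):

```html
<!-- Placed in the <head> of the re-published (duplicate) page -->
<meta name="robots" content="noindex, follow">
```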
Best
Sameer
-
So the best way to get credit for the article is just the links? Is there any special tag, something like meta name=syndication-source, or is there no need?
And yes, you are right, it's manual syndication and he will keep all the links.
Thank you, Gianluca
-
Hi...
what you describe is essentially a form of syndication of your content. A manual one, but still syndication.
I believe that when the guy says he will give you full credit for the content, he means an optimized full link to the original article.
If so, I would say yes to that guy. If not, ask him to do it.