Letting Others Use Our Content: Risk-Free Attribution Methods
-
Hello Moz!
A massive site that you've all heard of is looking to syndicate some of our original editorial content. This content is our bread and butter, and is one of the primary reasons why people use our site.
Note that this site is not a competitor of ours - we're in different verticals.
If this massive site were to use the content straight up, I'm fairly confident that they'd begin to outrank us for related terms pretty quickly due to their monstrous domain authority.
This is complex because they'd like to use bits and pieces of the content interspersed with their own content, so they can't just implement a cross-domain canonical. It'd also be difficult to load the content in an iframe with noindex,nofollow header tags since their own content (which they want indexed) will be mixed up with ours.
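For readers unfamiliar with the iframe variant being ruled out here, it would look roughly like this (a sketch only; the URLs are invented placeholders, not from this thread):

```html
<!-- On the partner's page: embed the syndicated review in a frame. -->
<iframe src="https://example.com/reviews/widget-123-embed"></iframe>

<!-- In the <head> of the framed document (served from the content owner's
     domain), keep the fragment itself out of the index: -->
<meta name="robots" content="noindex, nofollow">
<!-- The same directive can be sent as an HTTP response header instead:
     X-Robots-Tag: noindex, nofollow -->
```

As the question points out, this approach breaks down when the partner's own indexable copy is interleaved with the framed content.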
They're also not open to including a link back to the product pages where the corresponding reviews live on our site.
Are there other courses of action we could propose that would protect our valuable content?
Is there any evidence that using schema.org (Review and Organization schemas) pointing back to our review page URLs would provide attribution and prevent them from outranking us for associated terms?
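For concreteness, the markup being asked about would look something like this JSON-LD sketch (the product, rating, and organization names are invented placeholders); there's no documented guarantee that Google treats it as an attribution signal:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "url": "https://example.com/products/widget-123#review",
  "itemReviewed": { "@type": "Product", "name": "Widget 123" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
  "author": {
    "@type": "Organization",
    "name": "Original Reviews Co.",
    "url": "https://example.com/"
  }
}
</script>
```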
-
Logan, I found your replies very helpful. We have allowed a site to replicate some of our pages/content on their site, with the rel=canonical tag in place pointing back to us. However, Google has indexed the pages on the partner's site as well. Is this common, or has something gone wrong? The partner temporarily had an "original source" tag pointing to their own page alongside the canonical pointing to us. We caught this issue a few weeks ago and had the original-source tag removed. GSC sees the rel=canonical tag for our site, but I am concerned our site could be getting hurt by duplicate content issues, and the partner site may outrank us since their site is much stronger. Any insight would be greatly appreciated.
-
"Why did this offer come my way?"
When someone asks to use your content, that is what you should be asking yourself.
When someone asks to use my content, my answer is always a fast NO! Even if the Pope is asking, the answer will be NO.
-
This is exactly my concern. Our site is massive in its own industry, but this other site is a top player across many industries - surely we'd be impacted by such an implementation without some steps taken to secure attribution.
Thank you for confirming my suspicions.
-
Google claims that they are good at identifying the originator of the content. I know for a fact that they are overrating their ability on this.
Publish an article first on a weak site, allow it to be crawled and remain for six months. Then, put that same article on a powerful site. The powerful site will generally outrank the other site for the primary keywords of the article or the weak site will go into the supplemental results. Others have given me articles with the request that I publish them. After I published them they regretted that they were on my site.
Take pieces of an article from a strong site and republish them verbatim on a large number of weak sites. The traffic to the article on the strong site will often drop because the weak sites outrank it for long-tail keywords. I have multiple articles that were ranking well for valuable keywords. Then hundreds of mashup sites grabbed pieces of the article and published them verbatim. My article tanked in the SERPs. A couple years later the mashups fell from the SERPs and my article moved back up to the first page.
-
"But, I would not agree with their site being the one to take the damage. YOU will lose a lot of long-tail keyword traffic because now your words are on their site and their site is powerful."
Typically, the first one that's crawled will be considered the originator of the content--then if another site uses that content, it will be the one that is damaged (if that's the case). I was under the impression that your content was indexed first--and that the other site would be using your content. At least that's the way I understood it.
So, if your content hasn't already been indexed then you may lose in this.
-
"This is complex because they'd like to use bits and pieces of the content interspersed with their own content, so they can't just implement a cross-domain canonical. It'd also be difficult to load the content in an iframe with noindex,nofollow header tags since their own content (which they want indexed) will be mixed up with ours."
Be careful. This is walking past the alligator ambush. I agree with Eric about the rel=canonical. But, I would not agree with their site being the one to take the damage. YOU will lose a lot of long-tail keyword traffic because now your words are on their site and their site is powerful.
"They're also not open to linking back to our content."
If these guys walked into my office with their proposal, they might not make it to the exit alive.
My only offer would be for them to buy me out completely. That deal would require massive severances for my employees and a great price for me.
-
You're in the driver's seat here. _You_ have the content _they_ want. If you lay down your requirements and they don't want to play, then don't give them permission to use your content. It's really that simple. You're gaining nothing here with their rules, and they gain a lot. You should both be winning in this situation.
-
Thank you for chiming in Eric!
Their pages already rank extraordinarily well. #1 for almost every related term they have products for, across the board.
They're also not open to linking back to our content.
-
In an ideal situation, the canonical tag is preferred. Since you mentioned that it's not the full content and you can't implement it, your options may be limited. We haven't seen any evidence that schema.org markup pointing back to your review page URLs would prevent them from outranking you--and it's not likely to. If they include links, though, you'd get some link juice passed on.
Most likely, though, if that content is already indexed on your site then it's going to be seen as duplicate content on their site--and would only really hurt their site, in that those pages may not rank.
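For anyone following along, the cross-domain canonical referred to above is a single tag in the head of the partner's copy (the URL below is an invented placeholder). Note that Google treats rel=canonical as a hint rather than a directive, and it presumes the partner page duplicates the original more or less wholesale:

```html
<!-- Placed in the <head> of the partner's duplicate page, pointing back at
     the original on the content owner's domain. -->
<link rel="canonical" href="https://example.com/products/widget-123">
```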
Related Questions
-
Content very similar on different websites
Hello, I am in the travel industry and I am currently building the same website on two different domain names, one for the US and one for the UK (same website design). They will both feature the same content (itinerary, activities) on the page, with two exceptions: I will use different hotels for my UK clientele than for my US clientele, and I will use the word "holiday" on the UK page and the word "vacation" on the US page. Can the fact that I use the same "itineraries" and the same text on 95% of the page hurt my ranking in one country or the other?
-
Internal Duplicate Content Question...
We are looking for an internal duplicate content checker that is capable of crawling a site that has over 300,000 pages. We have looked over Moz's duplicate content tool and it seems like it is somewhat limited in how deep it crawls. Are there any suggestions on the best "internal" duplicate content checker that crawls deep in a site?
-
ROI on Policing Scraped Content
Over the years, tons of original content from my website (written by me) has been scraped by 200-300 external sites. I've been using Copyscape to identify the offenders. It is EXTREMELY time consuming to identify the site owners, prepare an email with supporting evidence (screen shots), and follow up 2, 3, 15 times until they remove the scraped content. Filing DMCA takedowns is a final option for sites hosted in the US, but quite a few of the offenders are in China, India, Nigeria, and other places not subject to the DMCA. Sometimes, when a site owner takes down scraped content, it reappears a few months or years later. It's exasperating. My site already performs well in the SERPs - I'm not aware of a third-party site's scraped content outperforming my site for any search phrase. Given my circumstances, how much effort do you think I should continue to put into policing scraped content?
-
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (each for a specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com/ = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (Option 1), but I'm wondering if setting the canonical tag to the root domain would pass "some link juice" to the root domain and be more beneficial. Thanks!
-
How to best handle expired content?
Similar to the eBay situation with "expired" content, what is the best way to approach this? Here are a few examples. With an e-commerce site, what's the best way to handle a seasonal category page like "Christmas" after it's no longer valid? 404? 301? Leave it as-is and date it by year? Another example: if I have an RSS feed of videos from a big provider, say Vevo, what happens when Vevo tells me to "expire" a video that's no longer available? Thank you!
-
How do I Syndicate Content for SEO Benefit?
Right now, I am working on an e-commerce website. I have found the same content on that e-commerce website as on the manufacturer's website. You can visit the following pages to learn more about it: http://www.vistastores.com/casablanca-sectional-sofa-with-ottoman-ci-1236-moc.html http://www.abbyson.com/room/contemporary/casablanca-detail http://www.vistastores.com/contemporary-coffee-table-in-american-white-oak-with-black-lacquer-element-ft55cfa.html http://www.furnitech.com/ft55cfa.html I don't want to go with robots.txt, Meta Robots NOINDEX, or canonical tags, because there are 5,000+ products on the website with duplicate content. So, I am thinking of adding a source URL on each product page with a dofollow attribute. Do you think that would help save my website from a duplicate content penalty? Or how else do I syndicate content for SEO benefit?
-
Http and https duplicate content?
Hello, This is a quick one or two. 🙂 Does a page accessible on both http and https count as duplicate content? What about external links pointing to the http or https version of a page on my website? Regards, Cornel
-
Can Google read the content of an iframe and use it for PageRank?
Beginner's question: when a website has its content inside iframes, will Google read it and consider it for PageRank?