What is the best way to allow content to be syndicated on other sites without risking duplicate content filters?
-
Cookstr appears to be syndicating content to shape.com and mensfitness.com.
a) They integrate their data into partner sites with attribution back to their own site, skinned in the partner's look and feel.
b) They link the recipe image back to the image hosted on Cookstr.
c) The partner pages do not include microformats or as much structured data as their own page does, so their own page has the stronger SEO (a rough sketch of such markup is shown below).
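For reference, the kind of recipe structured data Cookstr keeps on its own page typically looks something like this. This is a generic hRecipe-style sketch, not Cookstr's actual markup; the class names are standard hRecipe properties, and the content values are invented for illustration:

<!-- Illustrative hRecipe-style markup; values are invented for this example -->
<div class="hrecipe">
  <h2 class="fn">Sauteed Escarole with Pancetta</h2>
  <img class="photo" src="escarole-pancetta.jpg" alt="Sauteed escarole with pancetta">
  <span class="yield">Serves 4</span>
  <ul>
    <li class="ingredient">1 head escarole, chopped</li>
    <li class="ingredient">2 oz pancetta, diced</li>
  </ul>
  <div class="instructions">Saute the pancetta, add the escarole, and cook until wilted.</div>
</div>

Leaving this markup off the syndicated copies means the original page carries richer data for the engines.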
Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance!
Their original content page:
http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta
Their syndicated content pages:
http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
-
Thanks for taking the time, excellent response.
What we are hoping to do is provide an SEO-lite version of our content for syndication that has the ability to rank but not outrank us.
The partners we are speaking with only want to do business with us if these pages can be indexed in Google and have a chance of ranking. Cookstr seems to be accomplishing this with stronger SEO on their own page and weaker syndicated content on the partner sites, ensuring that Cookstr comes up first in searches.
-
So you want to create duplicate content on the web?
If you're syndicating content, the reason you're doing it is to provide this content to the readership of those other sites. If someone searches and discovers this content in the search engines, you want the attribution of that content to go to the original source: you. Why would you want some other website to be discovered as a result of content you created and syndicated to them? If you want the credit to go to them, and the engines to bring visitors to their site for content you created, that's when you should write a guest post or column and put it on their site.

But if you're syndicating content and want it to show up on other sites, I think it makes sense to have the canonical tag pointing to your site, indicating to the search engines that you are the authentic source of that piece of content. The other site's readers will still enjoy the content on that site; you'll just be credited for it in the search engines.

To me, this is a win-win situation where everyone benefits. The other site gets free content to give to its readership, and you get attribution in the engines plus the additional exposure to the other site's visitors.
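As a minimal sketch of what that looks like in practice (using the URLs from this thread; the exact markup the partners would add is an assumption), the partner page's head would carry a cross-domain canonical pointing at the Cookstr original:

<!-- On the partner's copy, e.g. the Shape recipe page -->
<link rel="canonical" href="http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta">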
-
But that would result in the partner site not being indexed, I believe. We want the partner pages to rank, or at least be indexed and have the possibility of ranking. If we were to add the canonical pointing to our site, we might as well just noindex their pages, I think.
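For comparison, the noindex route I'm referring to would be something like this on the partner's copy; just a sketch of the standard robots meta tag, not something anyone in this thread is actually doing:

<!-- In the head of the partner's copy of the recipe -->
<meta name="robots" content="noindex, follow">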
-
One thing I would add: see if you can get the sites taking the content to implement a cross-domain canonical tag, meaning they point to your original page in the canonical tag on their copy of the content. This indicates to the search engines that they are attributing the content to you, and it should serve as the clearest signal to the engines of whom to attribute the content to. It's great that there is attribution in the text itself, but the best method of giving the correct attribution is the canonical tag.
Hope this helps.
-
Related Questions
-
Possible duplicate content issue
Hi, here is a rather detailed overview of our problem; any feedback or suggestions are most welcome.
We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in the search engines for the past few years.
Here is where we have the problem. Our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush, and confirmed it by looking at our organic landing page traffic in Google Analytics and the Search Analytics report in Search Console.
Here are a few of the assumptions we've made and things we've checked:
- Crawl or technical issues: nothing serious found
- Bad backlinks: no new spammy backlinks
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to the US (we suspect this to be the issue, but more below)
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not impact things such as the title, H1, URL or body content; we replaced some call-to-action blocks from a custom one to one built into the framework (a div)
- Manual or algorithmic penalties: nothing reported by Search Console
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place.
Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this). During this process, an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the duplicate content page on the UK site, thereby dropping it off the index.
What further contributes to this theory is that a search on Google.co.uk returns the US version of the page. With country targeting on (i.e. only return pages from the UK), the UK version of the page is not returned. Also, a site: query on Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe that it's a duplicate content filter issue due to incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page.
What we have done to counter this is as follows:
- Turned on geotargeting for the US site
- Ensured that the title of the UK page says UK and not US
- Edited both pages to trigger a new last-modified date and so the two pages share fewer similarities
- Recreated a sitemap and resubmitted it to Google
- Re-crawled and requested a re-index of the whole site
- Fixed a few of the smaller issues
If our theory is right and our actions do help, I believe it's now a waiting game for Google to re-crawl and reindex. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of our senior management are very nervous given the impact of losing this page and are keen to get a second opinion on the matter.
Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
-
Search console, duplicate content and Moz
Hi, working on a site that has duplicate content in the following manner:
http://domain.com/content
http://www.domain.com/content
Question: would telling Search Console to treat one of them as the primary site also stop Moz from seeing this as duplicate content? Thanks in advance. Best, Paul.
-
Duplicate Content For Product Alternative listing
Hi, I have a tricky one here. Cloudswave is a directory of products, and we are launching new pages called "Alternatives to Product X". Such a page displays 10 products that are an alternative to Product X (Page A). Now let's say you want the alternatives to a similar product within the same industry, Product Y (Page B): you will have 10 product alternatives, but this page will be almost identical to Page A, as the products are similar and in the same industry. Maybe one or two products will differ between the two listings. Even though the SEO tags are different, aren't those two pages considered duplicate content? What are your suggestions to avoid this problem? Thank you, guys.
-
Consolidating two different domains to point at same site, duplicate content penalty?
I have two websites that are extremely similar and want to consolidate them into one website by pointing both domain names at one site. Is this going to cause any duplicate content penalties by having two different domain names pointing at the same site? Both domains get traffic, so I don't want to just discontinue one of them.
-
Help with Best Content Posting Approach - WordPress site
I have a Word document that I would like to add to my WordPress site as a page. The document has a large, detailed flow chart of a complex legal process (about 20+ boxes in the flow chart). I do not want to add it as an image because I want search engines to read/index the information in the flow chart. Any suggestions for posting this detailed flow chart on a WP page in the most SEO-friendly manner? Thanks.
-
Is all duplication of HTML title content bad?
In light of Hummingbird, and given that HTML titles are the main selling point in SERPs, is my approach to keyword-rich HTML titles bad? Where possible I try to include the top key phrase describing the page and then a second top key phrase describing what the company/site as a whole is or does. For instance, an estate agent's site could use HTML titles such as:
Buy Commercial Property in Birmingham | Commercial Estate Agents
Birmingham Commercial Property Tips | Commercial Estate Agents
In order to preserve valuable characters I have also been omitting brand names other than on the home page. Is this also poor form?
-
Frequent FAQs vs duplicate content
It would be helpful for our visitors if we were to include an expandable list of FAQs on most pages. Each section would have its own list of FAQs specific to that section, but all the pages in that section would have the same text. It occurred to me that Google might view this as a duplicate content issue. Each page does have a lot of unique text, but underneath we would have lots of text repeated throughout the site. Should I be concerned? I guess I could always load these by AJAX after page load if it might penalize us.
-
Duplicate Content Question
My client's website is for an organization that is part of a larger organization, which has its own website. We were given permission to use content from the larger organization's site on my client's redesigned site. The SEs will deem this duplicate content, right? I can "re-write" the content for the new site, but it will still be closely based on the original content from the larger organization's site, due to the scientific/medical nature of the subject material. Is there a way around this dilemma so I do not get penalized? Thanks!