Am I Syndicating Content Correctly?
-
My question is about how to syndicate content correctly. Our site has professionally written content aimed at our readers, not search engines. As a result, other related websites are looking to syndicate our content. I have read the Google duplicate content guidelines (https://support.google.com/webmasters/answer/66359?hl=en), canonical recommendations (https://support.google.com/webmasters/answer/139066?hl=en&ref_topic=2371375), and noindex recommendations (https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag) offered by Google, but am still a little confused about how to proceed. The pros, in our opinion, are as follows:
#1 We can gain exposure to a new audience as well as help grow our brand.
#2 We figure it's also a good way to build up credible links and help our rankings in Google.
Our initial reaction is to have them use a "canonical link" to attribute the content back to us, but also implement a "noindex, follow" tag to help avoid duplicate content issues. Are we doing this correctly, or are we potentially at risk of violating some sort of Google quality guideline? Thanks!
-
No, you will not receive any increase in your PageRank as a result.
Having said that, if the other website did NOT include the canonical link, there is a chance the link juice for the page would either be split equally between your site and theirs or, worst case, all be given to their site (if Google thinks they are the originator)! So indirectly, ensuring that they add the canonical tag will result in your page ranking better.
Hope that makes sense!
Steve
-
Thanks for taking the time to answer my questions. I do have a follow-up, though... With the "canonical" and "noindex, follow" tags in place, will any link juice be transferred?
For example:
Original article is published on www.mysite.com/original-article
Content is syndicated on www.theresite.com/syndicated-content with the following tags in place:
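(The tag markup did not survive the forum formatting; based on the description in the thread, the syndicating page's `<head>` would presumably carry something like the following sketch, using the example URLs above:)

```html
<!-- In the <head> of www.theresite.com/syndicated-content -->
<!-- Canonical link attributing the content back to the original article -->
<link rel="canonical" href="http://www.mysite.com/original-article" />
<!-- Keep the syndicated copy out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```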
What I am getting confused about is this: since the syndicated content is not getting indexed, do any link attributes get passed through to my original article? In other words, does the canonical link pass any link juice even though the noindex tag is in place?
-
**However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article.**
Yes, but you gotta be really careful. If you fill syndicated content with anchor text links you will have a Penguin problem.
**Wondering if this was written before Penguin.** If I were the boss at Google, we would have a bar of soap used to wash the mouths of Googlers who talk about link building.
-
**Our initial reaction is to have them use a "canonical link" to assign the content back to us, but also implement a "no index, follow" tag to help avoid duplicate content issues. **
This is the way to go. But you must require them to use the canonical and the noindex. You gotta say, "These are our conditions for your use of our content." If they are good guys, they should have no problem with it. Stick to your guns on this.
My bet is that some will simply rewrite your content.
-
Hi,
I would stipulate that anyone wishing to re-use your content does so on the condition that they include a canonical link back to your original article... Even if only a few people do this, Google will soon realise that you are the author of the original article and credit you with the associated PageRank.
You should never look to create content solely for search engines (so you're doing the right thing). Website content should always be about your users, but if you do this correctly you will also benefit from the traffic the search engines generate!
Hope this helps.
Steve
-
Hi Brad,
Google's official version below:
- Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
You can refer to it at this link.
Cheers,