Syndicated content outperforming our hard work!
-
Our company (FindMyAccident) is an accident news site. Our goal is to roll our reporting out to all 50 states; currently, we operate full-time in 7 states.
To date, the largest expenditure is our writing staff. We hire professional
journalists who work with police departments and other sources to develop written
content and video for our site. Our visitors also contribute stories and/or
tips that add to the content on our domain. In short, our content/media is 100% original.

A site that often appears alongside us in the SERPs in the markets where we work full-time is accidentin.com. They syndicate accident news and offer little original content. (They also allow users to submit their own accident stories, and those entries index quickly and are sometimes viewed by hundreds of people in the same day. What's perplexing is that these entries are isolated incidents with little to no media value, yet they do extremely well.)
(I don't put much stock in Quantcast figures, but accidentin does use Quantcast's measurement pixel, and the figures indicate that they receive up to 80k visitors a day in some instances.)
I understand that it's common to see news sites syndicate from the AP, etc., and traffic accident news is not going to have a lot of competition (in most instances), but the real shocker is that accidentin will sometimes appear as the first or second result, above the original sources.
The question: does anyone have a guess as to what is making it perform so well?

While looking at their model, I'm wondering if we're silly not to syndicate news in the states where we don't have actual staff. It would seem we could attract more traffic by setting up syndication in our vacant states.

OR

Is our competitor's site bound to fade away?
Thanks, gang, hope all of you have a great 2013!
Wayne
-
Basically, Google treats syndicated content and duplicate content differently. If the competitor you're talking about is following the best practices for syndicated content, and if Google sees their website or page as more prominent (because of more relevant/related content on that domain, SEO optimization, popularity, etc.) and more relevant to the keywords searched for than the original creator or the other syndication partners, then Google will show the content on that syndication partner's page (in this case, your competitor's) rather than on the original creator's page.

And no, as long as they are following the best practices for syndicated content, they won't have any problem. But it could happen that in the future some other syndication partner is given more prominence if their page leverages the content better, or even that the original creator is given more prominence if they do a good job of optimizing their syndicated content strategy.
As far as syndicated content goes, Google says this:
“If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer.”
So, in a nutshell: there are no penalties for properly syndicated content; Google simply decides which page to display based on its prominence and best practices. But if they are not following the best practices for content syndication, then Google will start to see their pages as duplicates, and then it's a different story.
BTW, here is a post that should be of help to you; it talks about how the original creators of content can leverage it:
http://www.smashingmagazine.com/2012/06/28/content-creators-benefit-from-new-seo/
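For what it's worth, the best-known mechanism here (just an illustration; I'm not certain it's the one the post above covers) is for the syndication partner to add a cross-domain rel="canonical" pointing back at the original article, or at minimum a visible link back to the source. A minimal sketch of what that might look like in the head of the partner's copy, with a made-up FindMyAccident URL:

<head>
  <!-- Cross-domain canonical: tells Google the original story lives on
       findmyaccident.com, so indexing/ranking signals consolidate there.
       The URL below is a hypothetical example. -->
  <link rel="canonical" href="http://www.findmyaccident.com/georgia/atlanta-i-85-crash/" />
</head>

If a partner won't add the canonical, even a plain attribution link ("Originally reported by FindMyAccident") gives Google a signal about who the source is.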
-
"The question: does anyone have a guess as to what is making it perform so well?"
Your hard work.
Stop allowing them to use your content and they should not appear in your SERPs.
-
"The question: does anyone have a guess as to what is making it perform so well?"
You have a stronger link profile, but I think they are winning the SERPs because they post "Recent" links on their homepage that point to news and user submissions. This lets crawlers discover the latest submissions more quickly, gets their homepage crawled more often, and helps new pages rank quicker/better because of the Query Deserves Freshness (QDF) factor.
I recommend you try doing the same thing and see if that helps you.
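To illustrate the idea (a rough sketch with invented stories and URLs, not accidentin's actual markup), a server-rendered "Recent" module on the homepage hands crawlers fresh internal links on every fetch:

<!-- Hypothetical "Recent Accidents" module for the homepage;
     plain HTML anchors so crawlers discover new stories on each visit -->
<div id="recent-accidents">
  <h2>Recent Accidents</h2>
  <ul>
    <li><a href="/florida/tampa-i-275-crash/">Two injured in I-275 crash in Tampa</a></li>
    <li><a href="/ohio/columbus-high-street-hit-and-run/">Hit-and-run reported on High Street in Columbus</a></li>
    <li><a href="/texas/houston-i-10-pileup/">Multi-car pileup closes I-10 in Houston</a></li>
  </ul>
</div>

Pair that with an XML sitemap or RSS feed that updates on publish and new entries should get picked up even faster.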
--
I also found only 5 instances of your articles being sourced - https://www.google.com/search?q=site:accidentin.com+intext%3Afindmyaccident.com
What kinds of keywords are they outranking you for? Do you have an RSS feed, or how are they scraping your content?
--
In general, scraper sites are not supposed to do well and will probably lose value over time, but I've seen several examples where they are performing really well.
Cheers & Good Luck,
Oleg