Syndicated content outranks my original article
-
I have a small site and write original blog content for my small audience.
There is a much larger, highly relevant site that is willing to accept guest blogs and they don't require original content. It is one of the largest sites within my niche and many potential customers of mine are there.
When I create a new article I first post it to my blog, and then share it on Google+, Twitter, Facebook, and LinkedIn.
I wait a day. By this time G has seen the links that point to my article and has indexed it.
Then I post a copy of the article on the much larger site. I have a rel=author tag within the article but the larger site adds "nofollow" to that tag. I have tried putting a link rel=canonical tag in the article but the larger site strips that tag out.
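For reference, the markup in question would look roughly like this (the URLs are placeholders, not the actual sites involved). The cross-domain canonical only helps if it survives in the republished page's `<head>`, which is exactly what the larger site prevents:

```html
<!-- In the <head> of the republished copy on the larger site -->
<!-- (hypothetical URL; the larger site strips this tag out) -->
<link rel="canonical" href="https://myblog.example/original-article/">

<!-- Author attribution inside the article body; the larger site adds nofollow -->
<a rel="author nofollow" href="https://plus.google.com/+AuthorProfile">Author Name</a>
```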
So G sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not will my blog get labeled as a scraper?
Second: when I Google the exact blog title, the copy of my article on the larger site shows up as the #1 search result, but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was filtered out as a duplicate).
There are benefits for my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having G possibly label me a scraper)? I do rank for other phrases in G, so I know my site hasn't had a wholesale penalty of some kind.
-
Thanks, Tommy. That confirms what I thought. I wouldn't mind so much if the bigger site didn't nofollow my author tag but since they do then I'm getting little benefit from them other than exposure to their audience. And that is worth something, to be sure.
Maybe I'll post on their site for a day or two and then delete the post (I have that ability), so that I get some exposure there but after a couple of days the only copy of the article will be on my site.
-
Thanks, Egol. For my next few postings I will keep them on my own site and see what kind of rankings and traffic they get for a month or so. Then compare that traffic to the traffic I've seen from articles I've posted on the larger site.
Appreciate the input. I do want to build equity for my own site, but it's a trade off with getting more exposure/customers on the bigger site. I am in this for the long haul, though, so I suppose tons of unique content on my own site will be valuable in the future.
-
Hi Mike,
I also had a similar experience and debated for a while before finally settling on a solution.
If you are posting exactly the same content on your blog and on another blog, I believe that already creates a duplicate content situation, even if you posted on your blog first. Here's how duplicate content works: if someone searches for your article title (as you did), Google will pull up the pages that best match the search. If G sees that the bigger site has the exact same article as your blog, they will show the bigger site in the results because 1) it probably has more backlinks, 2) it probably has more authority, and 3) its domain is probably older than your blog.
One way to solve this would be the canonical tag, but that won't work in your case because they strip it out.
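As a quick sanity check (a minimal sketch, not something from the thread itself), you can verify whether a canonical tag actually survived in a republished page by parsing its HTML with Python's standard library. The URLs here are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def find_canonical(html_text):
    """Return the canonical URL declared in html_text, or None if absent."""
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonical

# Example: a page where the canonical tag survived
page = ('<html><head>'
        '<link rel="canonical" href="https://myblog.example/post/">'
        '</head><body>...</body></html>')
print(find_canonical(page))  # -> https://myblog.example/post/

# In practice you would fetch the republished article (e.g. with
# urllib.request) and run find_canonical() on the response body to see
# whether the larger site stripped the tag.
```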
Here is where you will have to debate and decide what works better for you.
-
Don't repost on the bigger site, so that your article can actually be found via search engines instead of the bigger site's copy. However, with this approach you will lose those article views from the bigger site, and you will lose the opportunity of reaching viewers who will never visit your site.
-
Continue to post on the bigger site, so that you will get more views on your articles, reach people you might not be able to reach by posting only on your blog, grow your audience, and get your name out. However, with this approach your website won't appear in search results, since the bigger site obviously has more authority than your site, and your site might get penalized for duplicate content. Alternatively, you could stop posting the article on your blog and post only on the bigger site to avoid duplication.
You will have to decide which scenario benefits you more.
OR
- Post on your website, but also create NEW and UNIQUE articles for the bigger site to increase views and, hopefully, traffic.
To answer your questions:
-
Yes, there's no rich snippet, probably because the author tag is nofollowed.
-
My explanation of how duplicate content works above probably answers this question.
Hope this helps.
-
I have experience republishing lots of content from universities and government agencies.
Their content on my site often outranks the same content on their site. It does not matter who publishes first: in most cases the content I republish had been on those other sites for weeks or months - sometimes years - before I republished it. What matters is which domain Google favors for that topic.
I get lots of links and traffic using their content.
As you get more and more duplicate content out there on other websites you increase your risk of getting hit with a Panda problem. For that reason, I have cut back on the amount of republishing that I do.
I never give my articles to other websites for republishing. In my opinion that feeds your competitors and creates new ones.
The only way that I would give one of my articles to another site is if that site has ENORMOUS traffic in comparison to mine and my goal is to "get the word out" about something. If you are republishing on other sites because you think you will get a link or a bit of temporary traffic, I believe that is a mistake and you would be better off building unique equity for your own site.