Syndicated content outranks my original article
-
I have a small site and write original blog content for my small audience.
There is a much larger, highly relevant site that accepts guest posts and doesn't require original content. It is one of the largest sites in my niche, and many of my potential customers are there.
When I create a new article I post it to my blog first, then share it on Google+, Twitter, Facebook, and LinkedIn.
I wait a day. By that time Google has seen the links pointing to my article and has indexed it.
Then I post a copy of the article on the much larger site. I include a rel=author link within the article, but the larger site adds "nofollow" to it. I have also tried putting a rel=canonical link in the article, but the larger site strips that tag out.
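For reference, this is roughly what I'm trying to include in the syndicated copy (the URLs below are placeholders, not my real addresses):
<!-- Cross-domain canonical pointing back to the original post on my blog; the larger site strips this out. -->
<link rel="canonical" href="https://myblog.example.com/original-article/" />
<!-- Author link in the article body; after the larger site's filtering it ends up with nofollow added, something like: -->
<a href="https://myblog.example.com/about/" rel="author nofollow">My Name</a>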
So Google sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not, will my blog get labeled as a scraper?
Second: when I Google the exact blog title, the copy of my article on the larger site shows up as the #1 search result, but (1) there is no rich snippet with my author credentials (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was filtered out as a duplicate).
There are benefits to having my article on the larger site, since many of my potential customers are there and the article does include a link back to my site (though that link is nofollow). But I'm wondering (1) whether I can fix things so my original article shows up in the search results, and (2) whether I'm hurting myself with this strategy (having Google possibly label me a scraper). I do rank for other phrases in Google, so I know my site hasn't had a wholesale penalty of some kind.
-
Thanks, Tommy. That confirms what I thought. I wouldn't mind so much if the bigger site didn't nofollow my author tag, but since they do, I'm getting little benefit from them other than exposure to their audience. And that is worth something, to be sure.
Maybe I'll post on their site for a day or two and then delete the post there (I have that ability), so I get some exposure to their audience but, after a couple of days, the only copy of the article is the one on my site.
-
Thanks, Egol. For my next few postings I will keep them on my own site and see what kind of rankings and traffic they get for a month or so, then compare that traffic to what I've seen from the articles I've posted on the larger site.
Appreciate the input. I do want to build equity for my own site, but it's a trade-off against getting more exposure and customers from the bigger site. I am in this for the long haul, though, so I suppose tons of unique content on my own site will be valuable in the future.
-
Hi Mike,
I had a similar experience and debated it for a while before settling on a solution.
If you are posting exactly the same content on your blog and on another blog, I believe that already creates duplicate content, even if you posted on your blog first. Here is how duplicate content works: when someone searches for your article title (like you did), Google pulls up the pages that best match the query. If Google sees that the bigger site has the exact same article as your blog, it will show the bigger site in the results because 1) it probably has more backlinks, 2) it probably has more authority, and 3) its domain is probably older than your blog's.
One way to solve this is a canonical tag, but that won't work here because they remove it.
Here is where you will have to debate and decide what works better for you.
- Stop reposting on the bigger site, so that your article can actually be found via search engines rather than through the bigger site's copy. With this approach, though, you lose the article views from the bigger site and the chance to reach readers who will never visit your site.
- Continue to post on the bigger site, so you get more views on your articles, reach people you might never reach by posting only on your blog, grow your audience, and get your name out. With this approach, however, your own site won't appear in the search results, since the bigger site obviously has more authority, and your site might get penalized for duplicate content. (You could also stop posting the article on your own blog and publish only on the bigger site, to avoid the duplication entirely.)
You will have to decide which scenario benefits you more.
OR
- Post on your own website, but also create NEW and UNIQUE articles for the bigger site to increase views and, hopefully, traffic.
To answer your questions:
1. Yes, there's probably no rich snippet because the author tag is nofollowed.
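For reference, standard authorship markup looks roughly like this (the profile URL below is just a placeholder):
<!-- Authorship link in the article byline, pointing to the author's Google+ profile. -->
<a rel="author" href="https://plus.google.com/000000000000000000000">By Mike</a>
<!-- Or, equivalently, a link element in the page head: -->
<link rel="author" href="https://plus.google.com/000000000000000000000" />
Presumably Google won't honor the authorship connection when the bigger site nofollows or alters that link, which would explain the missing rich snippet.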
2. My explanation above of how duplicate content works probably answers this one: the copy on the bigger site won out, and your original was filtered from the results.
Hope this helps.
-
I have experience republishing lots of content from universities and government agencies.
Their content on my site often outranks the same content on their own sites. It does not matter who publishes first: in most cases the content I republish had been on those other sites for weeks, months - sometimes years - before I republished it. What matters is which domain Google favors for that topic.
I get lots of links and traffic using their content.
As you get more and more duplicate content out there on other websites, you increase your risk of getting hit with a Panda problem. For that reason, I have cut back on the amount of republishing that I do.
I never give my articles to other websites for republishing. In my opinion that feeds your competitors and creates new ones.
The only way that I would give one of my articles to another site is if that site has ENORMOUS traffic in comparison to mine and my goal is to "get the word out" about something. If you are republishing on other sites because you think you will get a link or a bit of temporary traffic, I believe that is a mistake and you would be better off building unique equity for your own site.