Duplicate Content - That Old Chestnut!!!
-
Hi Guys,
Hope all is well. I have a question, if I may. I have several articles which we have written, and I want to find the best way to post them, but I have a number of concerns. I am hoping to use the content to increase the kudos of our site by providing quality content and, hopefully, receiving decent backlinks.
1. In terms of duplicate content, should I post each article in only one place, or in several places? Also, where would you say the top 5 or 10 places would be? These are articles on XML, social media, and backlinks.
2. Can I post an article on another blog or article directory and also post it on my website's blog, or is this a bad idea?
A million thanks for any guidance.
Kind Regards,
C
-
Hi Craig,
Search this in Google: "keyword" "guest bloggers wanted" OR "guest blogger wanted"
Then analyse the authority of each site, and choose the best 10 to submit your articles to.
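If you have several topics to cover, you can generate those search queries programmatically instead of retyping the operators each time. A minimal sketch, assuming the keyword list is just an example taken from the question:

```python
# Build Google search queries for finding guest posting opportunities.
keywords = ["XML", "social media", "backlinks"]  # example topics from the question

def guest_post_query(keyword: str) -> str:
    """Combine a topic keyword with the guest-blogger search operators."""
    return f'"{keyword}" "guest bloggers wanted" OR "guest blogger wanted"'

for kw in keywords:
    print(guest_post_query(kw))
```

Paste each generated line into Google and review the sites that come back.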
You could also join the guest posting community at myblogguest.com
Good luck!
Greg
-
Thanks Greg,
If you could, where should I look for the best guest post sites? I have heard that some can be good and some can be bad. Now, whilst our views and writing are fairly good, we are no Technoratiiums... Your thoughts on this would be greatly appreciated.
Thanks,
C
-
Do both.
Google wants to see fresh content on your site, but you also need contextual backlinks from other sites.
Do 10 guest posts, and publish 10 on your website.
Greg
-
Hi Alan,
Thanks for that, makes a lot of sense. So, all that said and done, would I be better putting 15-20 articles per month only on our own website's blog, or should I also post some different articles as guest posts?
Nae Easy.
Thanks,
Craig
-
Google seems to have a number of ways to treat duplicate content of the type you are suggesting.
You are almost describing syndicated content. This is closely related to a press release, which is issued through one press release distributor and then published on multiple other sites.
A little investigation shows that Google handles such duplicate content pages differently, depending on which way the wind is blowing today.
If the site that posts the secondary copies is a powerful site, or a site that Google likes, then there seems to be no problem with duplicate content. To check this out, find an Associated Press story, then search for it at Google and you will find it on dozens, or even hundreds, of newspaper and radio station websites. Most of them will have exactly the same headline and exactly the same content.
If a site that posts the secondary copies is one Google doesn't like (for example, one covered by Panda, Penguin, or a manual penalty), the content will not be found in search results unless you go to the end of the results and redo the search with the duplicates shown.
If the original site is weak, or covered by Panda or Penguin, then the secondary copies may display in search results while your original may not. It is also possible that your index pages containing a snippet of the original may display in the results, but not the complete original, if other, more powerful sites are displaying it.
So, if my observations are valid, it means you should keep the original to yourself and not syndicate it. If you want to get value from other sites, then write a different story for them (a guest post, as DiscoverAfrica suggests) and get a link back to your site within the body of the story.
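If you do decide to syndicate anyway, one mitigation worth knowing about is asking the syndicating site to point a cross-domain canonical link at your original, which Google treats as a strong hint about which copy to index. A minimal sketch, with a placeholder URL:

```html
<!-- In the <head> of the syndicated copy on the partner site,
     pointing back to the original article on your own domain -->
<link rel="canonical" href="https://www.example.com/original-article" />
```

Whether the partner site will agree to add this is another matter; many won't, which is another argument for writing a fresh guest post instead.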
-
Hi Craig,
I would suggest looking for guest posting opportunities with webmasters in your niche (or a similar niche) rather than publishing on article/blog directories.
1.) Only ever publish each article in one place; don't try to mass-submit the same article to many websites.
2.) You can publish the article on your website and then syndicate it on other websites, but this isn't the best idea either. If you decide to guest post, webmasters usually check whether the content is original; if it isn't, they won't accept it. Even if they do accept it, the link pointing to the original article on your website is merely a "reference" rather than an endorsement from one site to the next.
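The originality check mentioned above is easy to approximate yourself before submitting. A minimal sketch using the standard library, assuming you just want a rough similarity score between a draft and an already-published article (real plagiarism checkers are far more sophisticated):

```python
from difflib import SequenceMatcher

def similarity(draft: str, published: str) -> float:
    """Return a 0..1 ratio of how similar two article texts are."""
    return SequenceMatcher(None, draft.lower(), published.lower()).ratio()

# Example texts (made up for illustration)
original = "Duplicate content can dilute the value of your backlinks."
submission = "Duplicate content can dilute the value of your backlinks."
rewrite = "Writing a fresh article for each site avoids duplicate content."

print(similarity(submission, original))  # identical text scores 1.0
print(similarity(rewrite, original))     # a genuine rewrite scores much lower
```

If your "new" guest post scores close to 1.0 against something already published, rewrite it before pitching it.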
Hope this makes sense?