Blog Duplicate Content
-
Hi, I have a blog, and like most blogs I have various search options (subject matter, author, archive, etc.) which produce the same content via different URLs.
Should I implement the rel-canonical tag AND the meta robots tag (noindex, follow) on every page of duplicate blog content, or simply choose one or the other? What's best practice?
Thanks Mozzers! Luke
-
When you say "duplicate blog content", do you mean that the paths are creating duplicate URLs for the posts themselves, or that the tags, categories, etc. have overlapping search results?
If it's the latter (more common), I'd generally NOINDEX those pages. They aren't really duplicates, and canonical isn't appropriate in most of those cases. Agreed with @Boomajoom, though, that it does depend on how integral those paths are for search. Some people use tags as major category navigation and build links to them; for others, tags are just secondary navigation.
If your paths are all creating different URLs for the individual blog posts, definitely use rel-canonical on those. That's a problem, and they are true duplicates. I'd even advise seeing whether you can avoid creating those extra URLs at all; it's just messy long-term, and the perceived usability benefits are very, very small in my experience. It's almost never worth having multiple URLs for the same final page.
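For reference, here's a minimal sketch of the two tags being discussed, each placed in the <head> of the relevant page (the URL is a hypothetical placeholder):

    <!-- On an overlapping tag/category/archive page: keep it out of the
         index, but let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">

    <!-- On a duplicate URL for an individual post: point engines at the
         one preferred URL -->
    <link rel="canonical" href="http://www.example.com/blog/sample-post/">

As the answers here suggest, a given page generally gets one tag or the other, not both.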
-
It depends. If you're getting traffic and links to your tag pages, then it wouldn't be worth it to de-index them. The general rule of thumb I've followed is: if I'm attracting traffic to tags or categories, I leave them in the index while adding canonical tags to the individual post pages. But other people have other views, and a lot of it comes down to personal preference and how important those other pages are to you.
Related Questions
-
Blog with all copied content, should it be rewritten?
Hi, I am auditing a blog whose goal is to get approved on ad networks, but the whole blog is content copied from different sources, so no ad network will approve them. Surprisingly (at least to me), the blog ranks really well for a few keywords (#1 positions and rich snippets), has a few hundred natural backlinks, a high DA, and has never been penalized (they have always used canonical tags pointing to the original content). Traffic is a few thousand sessions a month, roughly 85% of it from organic search; overall, Google likes the site enough to rank it highly. So now the owner wants to monetize it. I suggested that the best approach was to rewrite their most-visited articles and delete the rest, with 301 redirects to the posts that stay. But I haven't worked on a similar project before and can't find precise information online, so I'm hoping someone here has had a similar experience. A few of my questions:
1. If they rewrite most of the pages and delete the rest so there is no repeated/copied content, would ad networks (e.g. AdSense) approve them?
2. Assuming the new articles are at least as good as the current ones but with original content, is there a risk of losing DA, since it will pretty much look like a new site once they are done?
3. They have thousands of articles, but only about 200 get most of the visits, and those are the ones that would be rewritten. Is it fine to redirect the deleted posts to the ones that remain?
Thanks for any suggestions and/or tips on this 🙂
-
Lots of duplicate content and still traffic is increasing... how does it work?
Hello Mozzers, I have a dilemma with a client's site I am working on that is making me question my SEO knowledge, or at least the way Google treats duplicate content. Let me explain. The situation is this: organic traffic has been increasing constantly since last September, in every section of the site (home page, categories and product pages), even though:
- they have tons of duplicate content, with the same content living on both old and new URLs (the two URL versions are in two different languages, even though the actual content on the page is in the same language in both)
- indexation is left entirely to Google's discretion (no robots file, no sitemap, no meta robots in the code, no use of canonical, no redirects applied to any of the old URLs, etc.)
- a lot (really, a lot) of URLs with query parameters (which create still more duplicate content) are linked from the inner pages of the site (and indexed in some cases)
- they have Analytics but don't use Webmaster Tools
Now... they expect me to help them increase their traffic even further. I'll start with "regular" on-page optimization, as their titles, meta descriptions and headers are not optimized at all for the page content, but after that I was thinking of fixing the indexation and content-duplication issues. I am worried I might "break the toy", though, as things are going well for them. Should I be confident that fixing these issues will lead to even better results, or do you think it is better for me to focus on other kinds of improvements? Thanks for your help!
-
Magento products and eBay - duplicate content risk?
Hi, We are selling about 1,000 sticker products in our online store and would like to expand a large part of our product lineup to eBay as well. There are pretty good modules for this, as I've heard. I'm just wondering whether there will be duplicate content problems if I sync the products between Magento and eBay, so that they get uploaded to eBay with identical titles, descriptions and images? What's the workaround in this case? Thanks!
-
Duplicate Content Question
We are getting ready to release an integration with another product for our app. We would like to add a landing page specifically for this integration. We would also like it to be very similar to our current home page. However, if we do this and use a lot of the same content, will this hurt our SEO due to duplicate content?
-
3rd Party hosted whitepapers — bad idea? Duplicate content?
It is common in the B2B world to have 3rd parties host your whitepapers for added exposure. Is this a bad practice from an SEO point of view? Is the expectation that the 3rd parties use rel=canonical tags? I doubt most of them do...
-
Moving some content to a new domain - best practices to avoid duplicate content?
Hi, We are setting up a new domain to focus on a specific product and want to use some of the content from the original domain on the new site, removing it from the original. The content is appropriate for the new domain and will be irrelevant to the original domain, and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice here to avoid duplicate content and a potential Panda penalty?
-
Duplicate Content - Panda Question
Question: Will duplicate informational content at the bottom of indexed pages trip the Panda update? Total page ratio: 1/50 of total pages will have duplicate content at the bottom of the page. For example, on 20 pages, in 50 different instances, there would be common information at the bottom of a page (out of a total of 1,000 pages). Basically I just wanted to add informational data to help clients get a broader perspective when making a decision, alongside the "specific and unique" information at the top of the page. Content ratio per page: what percentage of duplicate content is allowed per page before you are dinged or penalized? Thank you, Utah Tiger
-
Help With Preferred Domain Settings, 301 and Duplicate Content
I've seen some good threads on this topic in the Q&A archives, but feel it deserves a fresh perspective, as many of those discussions are almost 4 years old. My Webmaster Tools preferred domain setting is currently non-www. I didn't set the preferred domain this way; it was like this when I first started using WM Tools. However, I have built the majority of my links with the www, which I've always viewed as part of the web address. When I put my site into an SEOmoz campaign it recognized the www version as a subdomain, which I thought was strange, but now I realize it's due to the www vs. non-www preferred domain distinction. A look at site:mysite.com shows that Google is indexing both the www and non-www versions of the site. My site appears healthy in terms of traffic, but my sense is that a few technical SEO items are holding me back from a breakthrough. QUESTION to the SEOmoz community: What the hell should I do? Change the preferred domain setting? 301 redirect from the non-www domain to the www domain? Google suggests this: "Once you've set your preferred domain, you may want to use a 301 redirect to redirect traffic from your non-preferred domain, so that other search engines and visitors know which version you prefer." Any insight would be greatly appreciated.
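For what it's worth, here is a minimal sketch of the non-www-to-www 301 that Google describes, assuming an Apache server with mod_rewrite enabled and example.com standing in for the real domain:

    # Hypothetical .htaccess rules: 301-redirect the non-www host to the www host
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The key detail is the 301 status code, which tells engines the move is permanent; whichever version you redirect to should also match the preferred domain set in Webmaster Tools.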