Best to Fix Duplicate Content Issues on Blog If URLs Are Set to "No-Index"?
-
Greetings Moz Community:
I purchased a SEMrush subscription recently and used it to run a site audit.
The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly.
My developer claims that since these blog URLs are set to "no-index," these issues do not need to be corrected. My instinct would be to avoid any risk of potential duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "no-index," they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would flag these as errors if in fact they are not errors.
So my question is, do we need to do anything with the error pages if they are already set to "no-index"? Incidentally the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit.
Thanks, Alan
-
Thanks for cleaning that up, Dennis. That is great advice.
-
I sometimes encounter that with my clients. The basic thing to do is just to add a canonical since the pages are already noindexed, especially for themes that utilize certain pages within a page. It sounds crazy, but some themes actually do this, so you can't remove the duplicate page; noindexing it and then adding a canonical is good enough.
But since you mentioned these are just tags, simply noindexing them is fine. (I'm assuming these are just basic WordPress tags.)
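If you're not running an SEO plugin that already handles it, the noindex on tag archives can be added from the theme. A minimal sketch, assuming you can edit functions.php (the function name below is just a placeholder):

// Hypothetical functions.php snippet: noindex WordPress tag archives but keep "follow"
// so the links on those pages are still crawled and pass equity.
add_action( 'wp_head', 'example_noindex_tag_archives' );
function example_noindex_tag_archives() {
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
}

Keeping "follow" matters here given the point above about these pages still passing PageRank.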
As for your pagination question, use a canonical that links to a URL where all the posts are shown. That's the basic rule for that situation, and it's covered in Google's guidelines on pagination.
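A minimal sketch of that pagination canonical, assuming a WordPress setup where a static page is set as the posts page and it genuinely shows (or links to) all the posts. If there is no true "view all" page, Google's guidance would rather you not point page 2+ at page 1. The function name is a placeholder.

// Hypothetical functions.php snippet: on paginated blog archives (/blog/page/2, /page/3, ...)
// output a canonical pointing at the main blog page, assumed here to be the "view all" URL.
// Assumes a static page is set as the posts page under Settings > Reading.
add_action( 'wp_head', 'example_paged_blog_canonical' );
function example_paged_blog_canonical() {
    if ( is_home() && is_paged() ) {
        $blog_url = get_permalink( get_option( 'page_for_posts' ) );
        echo '<link rel="canonical" href="' . esc_url( $blog_url ) . '">' . "\n";
    }
}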
-
Hi Reserve:
Thanks for your response.
Google is able to view this content because of links that go to and from it? So I am not protected by the no-index tag?
I am very unfamiliar with the strange tags generated by WordPress. Do you think that tags such as the following can be removed without any detrimental effect? If the URLs for these tags are removed, should redirects be added?
http://www.nyc-officespace-leader.com/blog/tag/boutique-space
http://www.nyc-officespace-leader.com/blog/tag/meatpacking-district
http://www.nyc-officespace-leader.com/blog/tag/restaurant-space
http://www.nyc-officespace-leader.com/blog/tag/retail-space
http://www.nyc-officespace-leader.com/blog/tag/store-space
http://www.nyc-officespace-leader.com/blog/tag/the-plaza-district
http://www.nyc-officespace-leader.com/blog/tag/times-square
http://www.nyc-officespace-leader.com/blog/tag/chelsea
http://www.nyc-officespace-leader.com/blog/tag/upper-east-side
http://www.nyc-officespace-leader.com/blog/tag/upper-west-side
Also, should canonical tags be added to blog URLs even if they are set to no-index? For example:
http://www.nyc-officespace-leader.com/blog/page/2
http://www.nyc-officespace-leader.com/blog/page/3
http://www.nyc-officespace-leader.com/blog/page/4
Thanks, Alan
-
I would remove them, to be safe. Google sees them regardless of the "no-index", and I think that the cleaner you can get your data, the better off you will be in the long run. While there may be no harm at this time, things always change. I know one thing for sure, and that is that you don't want a duplicate content issue.
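On the redirect part of the question: if the tag pages are removed, their old URLs are better 301-redirected than left to 404. A minimal sketch, assuming WordPress and the /blog/tag/ structure from the URLs listed above, with /blog/ as the target (an SEO or redirection plugin can do the same job without code; the function name is a placeholder):

// Hypothetical functions.php snippet: permanently redirect any /blog/tag/... URL to the
// blog index once the tag archives are retired. Matches the path, so it works whether or
// not the tag term itself still exists.
add_action( 'template_redirect', 'example_redirect_tag_archives' );
function example_redirect_tag_archives() {
    if ( preg_match( '#^/blog/tag/#', $_SERVER['REQUEST_URI'] ) ) {
        wp_redirect( home_url( '/blog/' ), 301 );
        exit;
    }
}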
-
Related Questions
-
Print Button Creating Duplicate PDF URLs set to NoIndex, OK for SEO?
Our real estate website has 400 listings. We have added a button that allows the visitor to print listing pages in the form of a PDF. The PDF exists as a URL ending in ?print=17076. This print URL is set to noindex and follow. So our site has 400 additional URLs. Is this a negative for SEO? Or neutral? I have read that using CSS it is possible to set up printing without creating all these extra URLs. Is this method better from an SEO perspective? Thanks, Alan
-
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey Guys,
Need your suggestions.
I have got a website that has a duplicate content issue.
A subdomain called facebook.asherstrategies .com comes from nowhere and is getting indexed.
Website link: asherstrategies .com
Subdomain link: facebook.asherstrategies .com This subdomain is actually a mirror of the website, and I have no idea how it was created.
I am trying to resolve the issue but could not find the clue.
-
Dealing with close content - duplicate issue for closed products
Hello, I'm dealing with some issues. Moz analysis is telling me that I have duplicates on some of my product pages. My issue is that it concerns very similar IT products from the same range; just the name and PDF are different. Do you think I should use a canonical URL? Or would it be better to rewrite about 80 descriptions (though the descriptions would be almost the same)? Best regards.
-
Duplicate content clarity required
Hi, I have access to a massive resource of journals, and we have been given the all-clear to use the abstracts on our site and link back to the journals. These will be really useful links for our visitors. E.g. http://www.springerlink.com/content/59210832213382K2 Simply put, if we copy the abstract and then link back to the journal source, will this be treated as duplicate content and damage the site, or is the link to the source enough for search engines to realise that we aren't trying anything untoward? Would it help if we added an introduction, so in effect we are sort of following the content curation model? We are thinking of linking back internally to a relevant page using a keyword too. Will this approach give any benefit to our site at all, or will the content be ignored due to it being duplicate and thus render the internal links useless? Thanks, Jason
-
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and these sites can also be shown at a different URL. Example: sub.agencysite.com/store sub.brandsite.com/store The problem comes up often: when we move a site to the brand's URL instead of hosting it on our URL, we end up with duplicate content. For god knows what reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask.) I am also left with not being able to change the robots.txt file for our site. They say if we allowed people to go in and change this stuff it would be too messy, and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers.) Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and then making the canonical tags on the old point to the new. My only concern here is how Google will actually view this setup. On one hand I am saying
"Hey, Googs, this is just a temp thing." and on the other I am saying "Hey, Googs, give all the weight to this page, got it? Graci!" So with my limited abilities, can anybody provide me a best case scenario?0 -
Pagination: rel="next" rel="prev" in the <head>?
With Google releasing that instructional on proper pagination, I finally hunkered down and put in a site change request. I wanted the rel="next" and rel="prev" implemented… and it took two weeks for the guy to get it done. Brutal and painful. When I looked at the source, it turned out he put them in the body above the pagination links… which is not what I wanted. I wanted them in the <head>. Before I respond to get it properly implemented, I want a few opinions - is it okay to have the rel="next" in the body? Or is it pretty much mandatory to put it in the <head>? (Normally, if I had full control over this site, I would just do it myself in 2 minutes… unfortunately I don't have that luxury with this site.)
-
How to keep the link juice in e-commerce for an "out of stock" product URL?
I am running an e-commerce business where I sell fashion jewelry. We usually have 500 products to offer, and for some of them we have only one in stock. What happens is that many of our backlinks point directly to a specific product, and when a product is sold out and no longer in stock, the URL becomes inactive and we lose the link juice. What is the best practice or tool to 301-redirect many URLs at the same time, without changing one URL at a time? Do you have any other suggestions on how to manage an out-of-stock product but still maintain the link juice from the backlink? Thanks!
-
Duplicate Content from Article Directories
I have a small client with a website: PR2, 268 links from 21 root domains, MozTrust 5.5, MozRank 4.5. However, whenever I check in Google for the number of link: results, Google always gives the response "none." My client has a blog with many articles on it. However, they have submitted their blog articles every time to article directories as well, plainly and simply creating duplicate content. Is this the reason why their link: count is coming up as none? Is there something to correct the situation?