Should I noindex my blog's tag, category, and author pages?
-
Hi there,
Is it a good idea to noindex tag, category, and author pages on blogs?
The tag pages sometimes have duplicate content, and the category and author pages aren't really optimized for any search term.
Just curious what others think.
Thanks!
-
Noindex tags and your author page (if you only have one author).
For categories, leave them alone; just add a bit of unique content to each, much like how e-commerce sites do it.
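For anyone who wants to confirm what their setup actually serves, here is a minimal sketch, assuming Python with the `requests` library, that checks whether each archive URL carries a noindex directive in either the X-Robots-Tag header or the robots meta tag. The URLs are placeholders.

```python
import re
import requests

# Placeholder archive URLs -- swap in your own tag/category/author pages.
urls = [
    "https://example.com/tag/seo/",
    "https://example.com/category/news/",
    "https://example.com/author/admin/",
]

# Assumes the meta tag is written with name before content, as most themes/plugins do.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', re.I)

for url in urls:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")   # directive via HTTP header
    match = META_ROBOTS.search(resp.text)           # directive via meta tag
    meta = match.group(1) if match else ""
    directives = f"{header} {meta}".lower()
    verdict = "noindex" if ("noindex" in directives or "none" in directives) else "indexable"
    print(f"{url}: {verdict} (header={header!r}, meta={meta!r})")
```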
-
Thanks everyone so much for your fast and helpful feedback!
-
Hi there! In addition to the responses above, I'd also recommend checking out the Yoast plugin. On a brand new blog, I recommend noindexing the category pages until a good amount of content is built up in each category, including unique content on each category landing page. You may find this post by Dan Shure helpful: http://moz.com/blog/setup-wordpress-for-seo-success. Best of luck!
Christy
-
I believe this varies from case to case. On my own blog I have disabled tags but not categories, because I find a few of the category URLs are very user-friendly.
Before you actually implement anything, ask yourself whether this will hurt the user experience. If the answer is no, then go ahead and do it!
Hope this helps!
-
I would. Personally, I have disabled my tag pages and author page, since I use no tags and have only one author. As for the categories, I have them noindexed.
-
Yes. In fact, John Mueller at Google has said this is a wise thing to do.
First, ask yourself: is there really any benefit to your users in visiting any of those pages?
Duplicate content is not a major issue and can largely be ignored, but many blogs go over the top with tags and the like, making Google's life harder when it tries to decide which content to show in the results.
I personally went as far as removing the front end of WordPress completely and coding a very quick and simple front end myself, so the site is still managed through the WordPress interface like a content management system.
It works a treat, and the response in the Google SERPs was great. It also reduces the number of pages Google needs to crawl over and over to look for changes, which sees Google returning more frequently to the key pages looking for updates.
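To see how that plays out on a given site, one rough way to measure where Googlebot spends its crawls is to bucket its requests from the access log by URL pattern. This is only a sketch: it assumes Python, a standard Apache/Nginx combined-format access log at a placeholder path, and WordPress-style archive URLs.

```python
import re
from collections import Counter

# Matches the request path in a standard combined-format log line.
REQUEST = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

buckets = Counter()
with open("access.log") as log:          # placeholder path to your server log
    for line in log:
        if "Googlebot" not in line:      # only count Google's crawler
            continue
        m = REQUEST.search(line)
        if not m:
            continue
        path = m.group(1)
        if "/tag/" in path:
            buckets["tag archives"] += 1
        elif "/category/" in path:
            buckets["category archives"] += 1
        elif "/author/" in path:
            buckets["author archives"] += 1
        else:
            buckets["posts and other pages"] += 1

for section, hits in buckets.most_common():
    print(f"{section}: {hits} Googlebot hits")
```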
Related Questions
-
If I change tags and categories in WordPress blog posts, will it negatively affect SEO and cause 404s?
Hi, I have belatedly come to the conclusion that I have been using tags and categories incorrectly when blogging in WordPress. The result is that Google seems to prefer to show my archive and tag pages in search results rather than the posts themselves. Not good UX. As the site is only a few months old, am I best to learn my lesson and simply tag and categorize correctly moving forward, or should I go back into these posts and clean them up, categorizing and tagging correctly? If I do this, will it cause 404s and hurt my SEO? Thanks!
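On the 404 part of the question, one way to find out is to list the archive URLs that existed before the clean-up and check what they return afterwards. A minimal sketch, assuming Python with the `requests` library; the URLs are placeholders.

```python
import requests

# Placeholder URLs for tag/category archives that existed before re-categorizing.
old_urls = [
    "https://example.com/category/old-category/",
    "https://example.com/tag/old-tag/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 307, 308):
        print(f"{url} -> redirects to {resp.headers.get('Location')}")
    elif resp.status_code == 404:
        print(f"{url} -> 404 (a candidate for a 301 to the new archive or a relevant post)")
    else:
        print(f"{url} -> {resp.status_code}")
```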
Technical SEO | johnyfiveisalive2
-
Blog Page Titles - Page 1, Page 2 etc.
Hi all, I have a couple of crawl errors coming up in Moz that I am trying to fix. They are duplicate page title issues in my blog area. For example, we have a URL of www.ourwebsite.com/blog/page/1, and as we have quite a few blog posts they spill onto another page, for example www.ourwebsite.com/blog/page/2. Both of these URLs have the same heading, title, meta description, etc. I was just wondering whether this is an actual SEO problem and if there is a way to fix it. I am using WordPress, for reference, but I can't see anywhere to access the settings of these pages. Thanks
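A quick way to confirm the duplication is to fetch a handful of the paginated URLs and compare their title tags. A minimal sketch, assuming Python with `requests`; the URL pattern comes from the question and the page range is arbitrary.

```python
import re
import requests

TITLE = re.compile(r"<title>(.*?)</title>", re.I | re.S)

titles = {}
for page in range(1, 6):  # check the first five paginated pages
    url = f"https://www.ourwebsite.com/blog/page/{page}"
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        break  # ran out of pages
    m = TITLE.search(resp.text)
    titles[url] = m.group(1).strip() if m else "(no title found)"

# Group URLs that share the same title so duplicates stand out.
by_title = {}
for url, title in titles.items():
    by_title.setdefault(title, []).append(url)
for title, urls in by_title.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} on: {', '.join(urls)}")
```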
Technical SEO | O2C0
-
Soft 404s on a 301 Redirect... Why?
So we launched a site about a month ago. Our old site had an extensive library of health content that went away with the relaunch. We redirected this entire section of the site to the new education materials, but we've yet to see this reflected in the index or in GWT. In fact, we're getting close to 500 soft 404s in GWT. Our development team confirmed for me that the 301 redirects are configured correctly. Is it just a waiting game at this point, or is there something I might be missing? Any help is appreciated. Thanks!
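One pattern worth ruling out is many old URLs all redirecting to a single, only loosely related page, since that often gets reported as a soft 404 even when the 301s themselves are technically correct. A minimal sketch, assuming Python with `requests`; the old health-content URLs are placeholders.

```python
import requests

# Placeholder URLs from the retired health-content library.
old_urls = [
    "https://example.com/health-library/article-1",
    "https://example.com/health-library/article-2",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # Status codes for every hop, e.g. 301 -> 200.
    chain = [r.status_code for r in resp.history] + [resp.status_code]
    print(url)
    print(f"  chain:     {' -> '.join(str(code) for code in chain)}")
    print(f"  lands on:  {resp.url}")
    # If hundreds of old URLs all land on the same thin page, that is a
    # likely source of the soft-404 reports.
    print(f"  page size: {len(resp.text)} characters")
```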
Technical SEO | MJTrevens0
-
Should I disavow links from pages that don't exist anymore?
Hi. I'm doing a backlink audit on two sites, one with 48k and the other with 2M backlinks. Both are very old sites and both have tons of backlinks from old pages and websites that don't exist anymore, but these backlinks still exist in the Majestic Historic index. I cleaned up the obviously useless links and passed the rest through Screaming Frog to check whether those old pages/sites even exist. There are tons of linking pages that return 0, 301, 302, 307, 404, etc. Should I consider all of these pages bad backlinks and add them to the disavow file? Just to clarify, I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page returning an error when pinged, e.g. originpage.com/page-gone sends a link to mysite.com/product1; Screaming Frog pings originpage.com/page-gone and gets an error status. Do I add originpage.com/page-gone to the disavow file or not? Hope I'm making sense 🙂
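Whatever you decide about the disavow file, it can help to separate linking pages that are genuinely gone from ones that merely redirect somewhere else. A minimal sketch, assuming Python with `requests` and a placeholder text file of origin URLs, one per line.

```python
import requests

with open("origin_urls.txt") as f:                 # placeholder export of linking pages
    origin_urls = [line.strip() for line in f if line.strip()]

buckets = {"gone (no response / 4xx / 5xx)": [], "redirected (3xx)": [], "live (2xx)": []}

for url in origin_urls:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        code = resp.status_code
    except requests.RequestException:
        code = 0                                   # DNS failure, timeout, etc.
    if code == 0 or code >= 400:
        buckets["gone (no response / 4xx / 5xx)"].append(url)
    elif 300 <= code < 400:
        buckets["redirected (3xx)"].append(url)
    else:
        buckets["live (2xx)"].append(url)

for label, urls in buckets.items():
    print(f"{label}: {len(urls)} pages")
```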
Technical SEO | IgorMateski0
-
How to create a sitemap for a large site (e-commerce type) that has 1,000s if not 100,000s of pages?
I know this is kind of a newbie question, but I am having an amazing amount of trouble creating a sitemap for our site, Bestride.com. We just did a complete redesign (look and feel, functionality, the works), and now I am trying to create a sitemap. Most of the generators I have used "break" after reaching a certain number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
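Part of the reason generators break is that the sitemap protocol caps each file at 50,000 URLs, so large sites normally ship several sitemap files plus a sitemap index pointing to them. Here is a minimal sketch of that split, assuming Python and that the URL list can be pulled from your own database or a crawl export; the file names and base URL are placeholders.

```python
import math
from xml.sax.saxutils import escape

MAX_URLS = 50000  # per-file limit in the sitemap protocol

def write_sitemaps(urls, base="https://www.bestride.com"):
    """Split a large URL list into numbered sitemap files plus a sitemap index."""
    chunks = math.ceil(len(urls) / MAX_URLS)
    for i in range(chunks):
        chunk = urls[i * MAX_URLS:(i + 1) * MAX_URLS]
        with open(f"sitemap-{i + 1}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    with open("sitemap_index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(chunks):
            f.write(f"  <sitemap><loc>{base}/sitemap-{i + 1}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# The URL list would come from the product database or a crawl export, e.g.:
# write_sitemaps(["https://www.bestride.com/cars/listing-1", "..."])
```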
Technical SEO | BestRide0
-
Noindex vs. page removal - Panda recovery
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages looks different from Google Panda's perspective. Does noindexing pages have less value when removing poor-quality content than physically removing them, i.e. either 301ing or 404ing the page being removed and removing the links to it from the site? I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper in the site, but I also presume this doesn't have any direct impact on the Panda algorithm? Thanks very much in advance for your thoughts, and for corrections to my assumptions 🙂
Technical SEO | agencycentral0
-
Why are pages still showing in SERPs, despite being NOINDEXed for months?
We have thousands of pages we've been trying to have de-indexed from Google for months now. They've all got a robots meta tag with content="none", but they simply will not go away in the SERPs. Here is just one example: http://bitly.com/VutCFi If you search this URL in Google, you will see that it is indexed, yet it's had the tag for many months. This is just one example of thousands of pages that will not get de-indexed. Am I missing something here? Does it have to do with using content="none" instead of content="noindex, follow"? Any help is very much appreciated.
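One thing worth ruling out: if robots.txt blocks those URLs, Googlebot cannot re-crawl them to see the noindex (or none) directive at all, so they can linger in the index. A minimal sketch, assuming Python with `requests`; the domain and URLs are placeholders, and the meta-tag regex assumes name comes before content.

```python
import re
from urllib import robotparser
import requests

urls = [
    "https://example.com/some-page/",      # placeholder pages that should drop out
    "https://example.com/another-page/",
]

META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', re.I)

# Load the site's robots.txt so we can test whether Googlebot may fetch each URL.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

for url in urls:
    resp = requests.get(url, timeout=10)
    m = META_ROBOTS.search(resp.text)
    meta = m.group(1) if m else "(no robots meta tag found)"
    blocked = not rp.can_fetch("Googlebot", url)
    print(f"{url}: meta robots = {meta}; blocked by robots.txt = {blocked}")
```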
Technical SEO | MadeLoud0
-
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic traffic was between 6:1 and 10:1. Is requesting a crawl-delay from Googlebot a viable option? Our goal would only be to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
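Whichever way you go on crawl settings, the spider-to-organic ratio can be measured directly from the server logs. A rough sketch, assuming Python and a standard combined log format; the log path is a placeholder, and counting referrals from google.com is only a crude proxy for organic visits.

```python
import re

QUOTED = re.compile(r'"([^"]*)"')   # quoted fields: request, referrer, user-agent

googlebot_hits = 0
organic_hits = 0

with open("access.log") as log:      # placeholder path to the server log
    for line in log:
        fields = QUOTED.findall(line)
        if len(fields) < 3:
            continue
        referrer, user_agent = fields[-2], fields[-1]
        if "Googlebot" in user_agent:
            googlebot_hits += 1
        elif "google." in referrer:  # crude proxy for an organic visit
            organic_hits += 1

if organic_hits:
    print(f"Googlebot hits: {googlebot_hits}, organic hits: {organic_hits}, "
          f"ratio ~ {googlebot_hits / organic_hits:.0f}:1")
else:
    print(f"Googlebot hits: {googlebot_hits}; no organic hits found in this log")
```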
Technical SEO | lzhao0