Should I noindex my blog's tag, category, and author pages?
-
Hi there,
Is it a good idea to noindex tag, category, and author pages on blogs?
The tag pages sometimes have duplicate content, and the category and author pages aren't really optimized for any search term.
Just curious what others think.
Thanks!
-
Noindex the tag pages and your author page (if you only have one author).
For categories, leave them indexed; just add a bit of unique content to each, much like ecommerce sites do with their category pages.
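For reference, noindexing an archive page usually just means emitting a robots meta tag in its `<head>` (this is what plugins like Yoast output when you noindex a taxonomy):

```html
<!-- On tag/author archive pages you want kept out of the index.
     "follow" lets crawlers still follow the links on the page,
     so link equity keeps flowing to the posts listed there. -->
<meta name="robots" content="noindex, follow" />
```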
-
Thanks everyone so much for your fast and helpful feedback!
-
Hi there! In addition to the responses above, I'd also recommend checking out the Yoast plugin. On a brand new blog, I recommend noindexing the category pages until a good amount of content is built up in each category, including unique content on each category landing page. You may find this post by Dan Shure helpful: http://moz.com/blog/setup-wordpress-for-seo-success. Best of luck!
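If you do noindex your archives with a plugin, it's worth spot-checking that the setting actually took effect. Here's a minimal sketch in Python (the function names are mine, not from any plugin or API) that checks a page's HTML for a noindex robots meta tag:

```python
from html.parser import HTMLParser
import urllib.request


class RobotsMetaParser(HTMLParser):
    """Collect the content of every <meta name="robots"> tag in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())


def is_noindexed(html):
    """True if any robots meta tag on the page contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)


def check_url(url):
    """Fetch a URL and report whether its HTML carries a noindex meta tag."""
    with urllib.request.urlopen(url) as response:
        return is_noindexed(response.read().decode("utf-8", errors="replace"))
```

For example, `check_url("https://yourblog.com/tag/seo/")` should return True once the tag archives are noindexed. Note this only checks the meta tag in the HTML, not an `X-Robots-Tag` HTTP header.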
Christy
-
I believe this varies case by case. On my own blog I have disabled tags but kept categories, because a few of my category URLs are genuinely user-friendly.
Before you actually implement anything, ask yourself whether it will hurt the user experience. If the answer is no, go ahead and do it!
Hope this helps!
-
I would. Personally, I have disabled my tag and author pages, since I use no tags and have only one author. As for categories, I have them noindexed.
-
Yes; John Mueller at Google has said this is a wise thing to do.
First, ask yourself: is there really any benefit to your users visiting those pages?
Duplicate content is not a major issue and can largely be ignored, but many blogs go over the top with tags and the like, making it harder for Google to decide which content to show in the results.
I personally went as far as removing the front end of WordPress completely and coding a very quick, simple front end myself, so the site is still managed through the WordPress interface like a content management system.
It works a treat, and the response in the Google SERPs was great. It also reduces the number of pages Google needs to crawl over and over to look for changes, which means Google returns more frequently to the key pages looking for updates.
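If you run your own front end like this, you can also keep these archives out of the index at the server level rather than via a plugin. A hedged sketch of an Apache .htaccess rule (assumes mod_setenvif and mod_headers are enabled, and that your tag/author URLs use the default WordPress `/tag/` and `/author/` prefixes; adjust the pattern to your own permalink structure):

```apache
# Send an X-Robots-Tag response header for tag and author archive URLs,
# which applies noindex without touching the page's HTML at all
<IfModule mod_headers.c>
    SetEnvIf Request_URI "^/(tag|author)/" noindex_archive
    Header set X-Robots-Tag "noindex, follow" env=noindex_archive
</IfModule>
```

The header approach has the same effect on Google as the meta tag, and works for non-HTML responses too.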