Should I set blog category/tag pages as "noindex"? If so, how do I prevent "meta noindex" Moz crawl errors for those pages?
-
From what I can tell, SEO experts recommend setting blog category and tag pages (e.g. "http://site.com/blog/tag/some-product") as "noindex, follow" in order to keep the overall quality of indexable pages high. However, I just received a slew of critical crawl warnings from Moz for having these pages set to "noindex." Should the pages be indexed? If not, why am I receiving critical crawl warnings from Moz, and how do I prevent them?
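For reference, the "noindex, follow" setting described above is usually implemented as a meta robots tag in the page's <head>, along these lines (a generic illustration, not code from any particular site):

<meta name="robots" content="noindex, follow">

The noindex part keeps the page itself out of the index, while follow still lets crawlers follow (and pass signals through) the links on it.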
-
In the situation outlined by the OP, these pages are noindexed, so there's no value in cluttering up crawl reports with them. Block rogerbot from non-critical parts of your site; if you do want to be alerted of issues on those pages, then don't block it.
-
Thanks. I'm not concerned about the crawl depth of the search engine bots; nothing in your fix would affect that. I'm curious about the decrease in the Moz crawl depth of the site, since we use that to spot issues with the site.
One of the clients I implemented the fix on went from 4.6K crawled pages to 3.4K, and the fix was expected to remove about 1.2K pages.
The other client went from 5K to 3.7K, and the fix was expected to remove about 1.3K pages.
TL;DR - Good news, everybody: the robots.txt fix didn't reduce the crawl depth of the Moz crawler beyond the pages it was intended to exclude!
-
I agree; unfortunately, Moz doesn't have an internal disallow feature that lets you tell them where rogerbot can and can't go. I haven't come across any issues with this approach, and crawl depth by search engine bots will not be affected, since the rogerbot user-agent is specified.
-
Thanks for the solution! We have been coming across a similar issue with some of our sites, and although I'm not a big fan of this type of workaround, I don't see any other options and we want to focus on the real issues. You don't want to ignore the rule entirely, in case other pages that should be indexed are marked noindex by mistake.
Logan, are you still getting the same crawl depth after making this type of fix? Have any other issues arisen from this approach?
Let us know
-
Hi Nichole,
You're correct in noindexing these pages; they offer little to no value from an SEO perspective. Moz is always going to alert you to noindex tags when it finds them, since it's such a critical issue if that tag shows up in unexpected places. If you want to remove these issues from your crawl report, add the following directives to your robots.txt file; this will prevent Moz from crawling these URLs and therefore from reporting on them:
User-agent: rogerbot
Disallow: /tag/
Disallow: /category/

Edit - do not prevent all user-agents from crawling these URLs, as that will prevent search engines from seeing your noindex tag; they can't obey what they aren't permitted to see. If you want, once all tag & category pages have been removed from the index, you can update your robots.txt to remove the rogerbot directive and add the disallows for tag & category to the * user-agent, as sketched below.
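As a rough sketch of that later stage, once the tag and category pages have dropped out of the index, the robots.txt could look something like this (the /tag/ and /category/ paths are carried over from the example above; adjust them to your own URL structure):

User-agent: *
Disallow: /tag/
Disallow: /category/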