Why does SEOmoz Pro include noindex pages?
-
I'm new to SEOmoz. Been digesting the crawl data and have a tonne of action items that we'll be executing on fairly soon. Love it!
One thing I noticed is that some of the crawl warnings include pages that expressly have the ROBOTS meta tag with the "noindex" value. For example, many of my noindexed pages don't include meta descriptions. Is it safe to ignore warnings of this nature for these pages?
-
Yup, helps! Thanks.
-
Hi Randy,
Basically, the crawler is not configured to exclude pages that carry the "noindex" tag. Since those pages are still visible to users, they can still suffer from missing information in places like Titles, so leaving them in the crawl report is probably a reasonable choice.
However, there is a feature request in the works at SEOmoz that would add the option to switch off pages you know can be ignored because they are noindexed. That would give us the best of both worlds: we can see all the deficiencies of those pages IF we wish to, or switch them off so they are eliminated from reports when we want to ignore them.
For now, yes, you can ignore them.
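If you want to do that filtering yourself in the meantime, here is a rough sketch in Python. The CSV file names and the "URL" column are assumptions, as is the use of the requests and BeautifulSoup libraries; adjust them to match your actual crawl export.

```python
import csv

import requests
from bs4 import BeautifulSoup


def is_noindexed(url):
    """Return True if the page carries a noindex directive, either in
    the X-Robots-Tag response header or in a robots meta tag."""
    resp = requests.get(url, timeout=10)
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    soup = BeautifulSoup(resp.text, "html.parser")
    # The meta name may be written as "robots" or "ROBOTS", so match it
    # case-insensitively.
    meta = soup.find("meta", attrs={"name": lambda v: v and v.lower() == "robots"})
    return bool(meta and "noindex" in meta.get("content", "").lower())


# File and column names below are hypothetical; match them to your
# actual crawl-warnings export.
with open("crawl_warnings.csv", newline="") as src, \
        open("indexable_warnings.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if not is_noindexed(row["URL"]):
            writer.writerow(row)
```

It simply drops any row whose page answers with a noindex directive, so what is left are the warnings you actually need to act on.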
Hope that helps,
Sha
Related Questions
-
"Noindex, follow" for thin pages?
Hey there Mozzers, I have a question regarding thin pages. Unfortunately, we have thin pages that are almost empty, to be honest. I'm thinking of asking the dev team to apply "noindex, follow" to these pages. What do you think? Has anyone faced this situation before? I'd appreciate your input!
Technical SEO | Europarl_SEO_Team
-
Removing indexed pages
Hi all, this is my first post so be kind 🙂 - I have a one-page WordPress site that has the Yoast plugin installed. Unfortunately, when I first submitted the site's XML sitemap to Google Search Console, I didn't check the Yoast settings and it submitted some example files from a theme demo I was using. These got indexed, which is a pain, so now I am trying to remove them. Originally I did a bunch of 301s, but that didn't remove them from the index (at least not after about a month), so now I have set up 410s. These also seem not to be working, and I am wondering if it is because I re-submitted the sitemap with only the index page on it (as it is just a single-page site). Could that have now stopped Google from recrawling the original pages to actually see the 410s? Thanks in advance for any suggestions.
Technical SEO | Jettynz
-
Linking Pages - 404s
Hello, I have noticed that we have recently managed to accrue a large number of 404s that are listed as Page Title/URL of Linking Page in Moz (e.g. http://www.onexamination.com/international/), but I do not know which site they are coming from. Is there an easy way to find out, or shall we just create redirects for them all? Thanks in advance for your help. Rose
Technical SEO | bmjcai
-
Google showing https:// page in search results but directing to http:// page
We're a bit confused as to why Google shows a secure https:// URL in the results for some of our pages, including our homepage. But when you click through, it isn't taking you to the https:// page, just the normal unsecured page. This isn't happening for all of our results; most of our deeper content results are not showing as https://. I thought this might have something to do with Google conducting searches behind secure pages now, but this problem doesn't seem to affect other sites or our competitors. Any ideas as to why this is happening and how we get around it?
Technical SEO | amiraicaew
-
Why are my Duplicated Pages not being updated?
I've recently changed a bunch of duplicated pages on our site. The number of duplicated pages did drop slightly; however, some of the pages that I've already fixed are still flagged as unfixed according to Moz. Whenever I check the back end of each of these pages, I see that they've already been changed and none of them are the same as far as the Meta Title tag is concerned. Can anyone provide any suggestions on what I should do to get a more accurate result? Is there a process that I'm missing?
Technical SEO | ckroaster
-
Client error 404 pages!
I have a number of 404 pages coming up which are left over in Google from the client's previous site. How do I get them out of Google, please?
Technical SEO | PeterC-B
-
Secondary Pages Indexed over Primary Page
I have 4 pages for a single product. Each of the pages links to the main page for that product. Google is indexing the secondary pages above my preferred landing page. How do I fix this?
Technical SEO | Bucky
-
SEOmoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages?
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings, SEOmoz indicates the number of pages with duplicate content, yet when I go to view them the subsequent page says no pages were found... Any ideas are greatly welcome! Thanks Marty K.
Technical SEO | MartinKlausmeier