Why does SEOmoz Pro include noindex pages?
-
I'm new to SEOmoz. Been digesting the crawl data and have a tonne of action items that we'll be executing on fairly soon. Love it!
One thing I noticed is that some of the crawl warnings include pages that expressly have the ROBOTS meta tag with the "noindex" value. Example: many of my noindex pages don't include meta descriptions. Given that, is it safe to ignore warnings of this nature for these pages?
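For context, these pages carry a robots meta tag along these lines (a generic illustration, not my actual markup):

    <head>
      <!-- keeps the page out of the search index -->
      <meta name="robots" content="noindex">
      <title>Example page kept out of the index</title>
      <!-- no meta description element here, which is what triggers the crawl warning -->
    </head>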
-
Yup, helps! Thanks.
-
Hi Randy,
Basically, the crawler is not configured to exclude pages with the "noindex" tag. Since removing them would mean you could have pages that are visible to users but missing information such as titles, leaving them in the crawl data is probably a reasonable choice.
However, there is a feature request in the works at SEOmoz that would provide the option to hide pages you know can be ignored because they are "noindexed". That would give us the best of both worlds: see all of a page's deficiencies if we wish to, or switch those pages off so they are eliminated from reports if we wish to ignore them.
For now, yes, you can ignore them.
Hope that helps,
Sha
Related Questions
-
Include or exclude noindex urls in sitemap?
We just added noindex tags to our pages with thin content. Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations. (A minimal sitemap sketch follows below.)
Technical SEO | vcj0
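For anyone weighing up the question above: a sitemap is generally meant to list only the URLs you want indexed, so the usual advice is to leave noindexed thin-content pages out. A minimal sketch with placeholder URLs (not the poster's actual site) might look like:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- indexable pages only; noindexed thin-content URLs are deliberately omitted -->
      <url>
        <loc>http://www.example.com/substantial-page/</loc>
      </url>
    </urlset>
-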
Issue: Duplicate Page Content > Wordpress Comments Page
Hello Moz Community, I've created a campaign in Moz and received hundreds of errors regarding "Duplicate Page Content". After some review, I've found that 99% of the errors in the "Duplicate Page Content" report are occurring because Wordpress creates a new comment page (with the original post detail) if a comment is made on a blog post. The post comment can be displayed on the original blog post, but is also viewable on a second URL created by Wordpress: http://www.Example.com/example-post http://www.Example.com/example-post/comment-page-1 Has anyone else experienced this issue in Wordpress or this same type of report in Moz? Thanks for your help! (A canonical-tag sketch follows below.)
Technical SEO | DomainUltra0
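Two common fixes for the situation above, offered as sketches rather than a definitive answer: turn off "Break comments into pages" under Settings > Discussion in Wordpress, or add a canonical link on the comment-page URLs pointing back to the main post, roughly:

    <!-- placed in the <head> of /example-post/comment-page-1 (placeholder URL from the question) -->
    <link rel="canonical" href="http://www.example.com/example-post/">
-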
How to Delete a Page on the Web?
Google reports, and I have confirmed, that the following old page is still live on the Web. http://www.audiobooksonline.com/The_Great_American_Baseball_Box_Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html This page hasn't been in our site's directory for some time and is no longer needed by us. What is the best way to fix this Google-reported crawl error? (One possible fix is sketched below.)
Technical SEO | lbohen0
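One common approach to the question above, assuming an Apache server with .htaccess available (an assumption, not something stated in the question): have the retired URL return a 410 Gone so search engines drop it, for example:

    # return 410 Gone for the retired page so it falls out of the index
    Redirect gone /The_Great_American_Baseball_Box_Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html
-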
Wordpress html page
Hi, We are designing a new agency site which will contain more than 100 pages. Which URL format is best: example.com/about/ or example.com/about.html? example.com/service/ or example.com/service.html?
Technical SEO | srinathk0
-
Determining When to Break a Page Into Multiple Pages?
Suppose you have a page on your site that is a couple thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing this, like being more focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although it would concentrate the link juice if it were all on one page. Suppose you have a lot of comments. Is it better to move comments to a second page at a certain point? Sometimes the comments are not super focused on the topic of the page compared to the main text. (A pagination markup sketch follows below.)
Technical SEO | ProjectLabs1
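If a long article is split, one way to tie the pieces together for search engines is pagination link elements in the head of each part. A rough sketch with placeholder URLs (note that search-engine support for these hints has varied over time):

    <!-- on page 2 of a split article -->
    <link rel="prev" href="http://www.example.com/long-article/">
    <link rel="next" href="http://www.example.com/long-article/page/3/">
-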
Redirecting over-optimised pages
Hi One of my clients' websites was affected by Penguin, and due to no 'bad link' messages and nothing really obvious from the backlink profile, I put it down to over-optimisation on the site. I noticed a lot of spammy pages and duplicate content, and submitted recommendations to have these fixed. They dragged their heels for a while and eventually put in plans for a new site (which was happening anyway), but it's taken quite a while and is only just going live in a couple of weeks. My question is, should I redirect the URLs of the previously over-optimised pages? Obviously the new pages are nice and clean, and from what I can tell there are no bad links pointing to the URLs, so is this an acceptable practice? Will Google notice this and remove the penalty? Thanks (A redirect sketch follows below.)
Technical SEO | Coolpink0
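If the old over-optimised URLs are being replaced, the standard move is a permanent (301) redirect from each old URL to its closest new equivalent. A sketch assuming an Apache .htaccess setup and hypothetical paths:

    # 301 each retired URL to its closest replacement on the new site
    Redirect 301 /old-over-optimised-page /new-clean-page
-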
Index page 404 error
Crawl results show there is a 404 error page, index.htmk, under my root: http://mydomain.com/index.htmk. I have checked my index page on the server and my index page is index.HTML instead of index.HTMK. Please help me to fix it. (A quick-fix sketch follows below.)
Technical SEO | semer0
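For the question above, the likely culprit is an internal link or configuration entry that points to index.htmk rather than index.html, so the first step is to find and correct that reference. As a stop-gap, the mistyped URL can also be redirected, sketched here assuming an Apache .htaccess setup:

    # send the mistyped URL to the real index page
    Redirect 301 /index.htmk /index.html
-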
Pages plummeting in ranking
Hi all, I have a question, which I hope you can answer for me. I have a site www.betxpert.com (a Danish betting site) and we have tried to do some SEO to improve conversions. One of the steps we have taken was to link to all of our bookmaker reviews in our menu (a mega menu). All of our bookmakers have an image and text link in the menu. The menu is shown on every page of the site. Since we made this change we have been plummeting down the SERPs. For the search "betsafe", this page http://www.betxpert.com/bookmakere/betsafe is no longer in the top 50. We also added the "stars" so that the Google result will show our overall review for the bookmaker, in order to stand out in the SERPs. Can anyone explain to me what the problem might be? Over-extensive internal linking, or something else?
Technical SEO | rasmusbang0