Submitted URL marked 'noindex'
-
Search Console is reporting this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used a meta robots tag on these pages, nor have these pages been disallowed in robots.txt.
Previously this issue affected some 20+ pages. I tried to get them reindexed by submitting the URLs again. Now the count has risen to 100+.
There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt.
Can anyone suggest a solution here?
-
Then we DO need to see an example to work out why it's firing.
-
Those pages are allowed everywhere.
-
No, we haven't used a meta noindex tag in our HTML code. We don't even have noindex in X-Robots-Tag.
-
Forget robots.txt; it has nothing to do with pages being marked noindex. Either a meta noindex tag is being used somewhere in your code (the HTML), or it is being sent through your HTTP headers via X-Robots-Tag. If you share an example URL we can work out which of those it is, which at least will narrow it down a little for you!
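A quick way to check both mechanisms is to inspect the page's response headers and its HTML together. A minimal sketch (the regex is deliberately simplified and assumes the meta tag lists `name` before `content`; the sample page is hypothetical):

```python
import re

def noindex_sources(html, headers):
    """Report which noindex mechanism fires for a page: the
    X-Robots-Tag HTTP header, a meta robots tag in the HTML, or neither."""
    found = []
    # Header check: X-Robots-Tag may carry "noindex" (case-insensitive).
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        found.append("x-robots-tag")
    # HTML check: <meta name="robots" content="... noindex ...">
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        found.append("meta-robots")
    return found

page = '<head><meta name="robots" content="noindex,follow"></head>'
print(noindex_sources(page, {}))  # ['meta-robots']
```

You can grab the live headers and HTML for a real URL with any HTTP client (e.g. `curl -sI` for headers, `curl -s` for the body) and feed them in.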
-
If you have not set the robots tag to noindex in Yoast and you haven't hardcoded it somewhere in your <head> section, there is still the WordPress option that tells search engines not to crawl/index pages: the "Discourage search engines from indexing this site" checkbox under Settings > Reading.
Without more details we can only guess...
Related Questions
-
Adding a Query String to a Static URL: is that good or bad?
I just went through this huge process to shorten my URL structure and remove all dynamic strings. Now my analytics team wants to add query strings to track clicks from the homepage. Is appending a query string to the end of each URL going to destroy my clean URL structure?
Reporting & Analytics | rpaiva0
-
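The usual answer here is that tracking parameters don't have to hurt a clean structure, as long as each parameterized URL points back to the parameter-free page via rel=canonical. A hedged sketch of that normalization (the parameter names are hypothetical examples):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters to strip; adjust to whatever
# your analytics team actually appends.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "ref"}

def canonical_url(url):
    """Drop known tracking parameters so the canonical form of the
    URL matches the clean static structure."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("https://example.com/page?utm_source=homepage&id=3"))
# https://example.com/page?id=3
```

Emit the result in the page's rel=canonical tag and search engines should consolidate the tracked variants onto the clean URL.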
URL Parameters
Hi there, I have a Magento sort-by feature which has indexed loads of pages in Google with URLs that have /shopby/ in them. Over 8k pages have been indexed like this. I cannot edit the robots meta within the page, but I have now disallowed the URLs in robots.txt; I guess this will prevent new ones being indexed but not deindex the current ones? So I looked into URL parameters: I added 'shopby' as a parameter in Webmaster Tools and told Google not to crawl any URLs with it in them. Will this deindex the pages already indexed? The only other way seems to be manually removing 8k URLs, which I do not want to do. Any advice much appreciated. Obviously I do not want these URLs indexed, as they are weak/duplicate sort-by search pages; I fear the Panda update would not be kind to them long term.
Reporting & Analytics | tdigital0
-
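Worth noting for the question above: a robots.txt Disallow only blocks crawling; it does not remove URLs that are already indexed, which is exactly why blocked pages keep showing in results with a "blocked by robots.txt" description. A small sketch with Python's standard robotparser, using a hypothetical robots.txt mirroring the rule described:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the Disallow rule in the question.
rules = """User-agent: *
Disallow: /shopby/
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# The rule stops crawlers from fetching /shopby/ URLs...
print(rp.can_fetch("*", "http://example.com/shopby/price-asc"))  # False
print(rp.can_fetch("*", "http://example.com/category/shoes"))    # True
# ...but already-indexed URLs can remain indexed: since Google can no
# longer crawl them, it can never see a noindex tag on them. Deindexing
# generally needs the URLs crawlable *with* noindex, or the URL removal tool.
```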
Google Analytics Organic Search Keywords Suddenly Displaying Full URLs
In my Google Analytics, the top keywords for Organic Search are suddenly displaying full URLs. For example, the third and fourth keywords are now http://www.domain.com/highly-specific-URL. This all started recently, around the same day, July 12th. I've checked back, and we've made no internal changes to the site around that time that could affect this. Any thoughts on this? Thanks! P.S. It might be related to rich snippets, but I cannot tell at this point.
Reporting & Analytics | 10SL0
-
Can't seem to rank for keyword "home care grand rapids" - need some advice
I am trying to rank for "home care grand rapids" and am having a really difficult time. My site: http://healthcareassociates.net has better backlinks, keywords and other seo markers than my competitors but I still can't seem to rank. The keyword and associated keywords (home care grand rapids michigan, home health care grand rapids, etc.) are only 31-33% difficulty and my site/page rank is better than the leading sites. What gives? Todd
Reporting & Analytics | t1kuslik0
-
Get a list of robots.txt blocked URL and tell Google to crawl and index it.
Some of my key pages got blocked by the robots.txt file, and I have made the required changes in robots.txt, but how can I get the list of blocked URLs? My Webmaster Tools page (Health > Blocked URLs) shows only a number, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results? One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt". I need an urgent recommendation, as I do not want to see any further drop in my traffic.
Reporting & Analytics | csfarnsworth0
-
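Since Webmaster Tools only shows the count, one option is to rebuild the blocked list yourself: run your known URLs (e.g. from your sitemap) against your robots.txt rules. A sketch with hypothetical rules and URLs:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return every URL the given robots.txt forbids for the agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]

# Hypothetical rules and sitemap URLs, for illustration only.
rules = "User-agent: *\nDisallow: /private/"
pages = ["http://example.com/", "http://example.com/private/report"]
print(blocked_urls(rules, pages))  # ['http://example.com/private/report']
```

Run it once against the old robots.txt to recover the list, and again against the corrected file to confirm the pages are crawlable again.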
404 errors on page urls that don't even exist
I am getting a lot of errors on pages with URLs that aren't even legit. For example: /videos/support/index.asp. No such path exists on the site. I have /videos and /support off the root, but nowhere on the site is there any reference or file at /videos/support/index.asp, so I get a lot of 404 duplicate page errors. This is just one example of several. How do I stop this?
Reporting & Analytics | GKLWL0
-
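Phantom paths like /videos/support/index.asp are usually created by relative hrefs: a link written without a leading slash resolves against the current directory rather than the site root. A quick demonstration with hypothetical URLs:

```python
from urllib.parse import urljoin

# A relative href (no leading slash) resolves against the current
# directory, so a link meant to reach /support/ from a page under
# /videos/ produces a path that never existed.
page = "http://example.com/videos/"
bad_href = "support/index.asp"   # should be "/support/index.asp" or absolute
print(urljoin(page, bad_href))   # http://example.com/videos/support/index.asp
```

Fixing the offending hrefs to be root-relative (or absolute) stops new phantom 404s from being discovered.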
What impact will Google's 10/18/2011 announcement of 'Making Search More Secure' have on the ability to track specific keyword queries via Analytics?
The full announcement is here: http://googleblog.blogspot.com/2011/10/making-search-more-secure.html My concern is that the ability for Google Analytics to parse information on specific keyword queries will be diminished. The article hints that Google Webmaster Tools will be exempt from the problem, and I've never relied on Webmaster tools as a go-to for tying specific keyword queries to Goal Tracking (form submissions and sales). The community's thoughts on this one are appreciated. 🙂
Reporting & Analytics | MKR_Agency0
-
Phantom urls causing 404
I have a very strange problem. When I run SEOmoz diagnostics on my site, it reveals URLs that I never created. It seems to combine two slugs into a new URL. For example, I created the pages http://www.naplesrealestatestars.com/abaco-bay-condos-naples/ and http://www.naplesrealestatestars.com/beachwalk-naples-florida/, and now the URL http://www.naplesrealestatestars.com/abaco-bay-condos-naples/beachwalk-naples-florida/ exists in addition to the two I created. There are over 100 of these phantom URLs, and they all show a 404 error when clicked on or crawled by SEOmoz. Anybody know how to correct this?
Reporting & Analytics | DanBoyle760
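Combined-slug phantoms like this are the classic symptom of relative links: on a trailing-slash URL, an href without a leading slash resolves underneath the current page. One way to hunt them down is to scan your pages for such hrefs; a sketch using the standard-library HTML parser (the page and link are taken from the question above, the scanner itself is a hypothetical helper):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class RelativeLinkFinder(HTMLParser):
    """Flag hrefs that resolve relative to the current page; on a
    trailing-slash URL they produce combined phantom paths."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.suspect = []  # (href as written, URL it actually resolves to)

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if not href or href.startswith(("#", "mailto:")):
            return
        # No host and no leading slash: resolves under the current page.
        if not urlsplit(href).netloc and not href.startswith("/"):
            self.suspect.append((href, urljoin(self.page_url, href)))

finder = RelativeLinkFinder(
    "http://www.naplesrealestatestars.com/abaco-bay-condos-naples/")
finder.feed('<a href="beachwalk-naples-florida/">Beachwalk</a>')
print(finder.suspect)
# [('beachwalk-naples-florida/',
#   'http://www.naplesrealestatestars.com/abaco-bay-condos-naples/beachwalk-naples-florida/')]
```

Any hrefs this flags should be rewritten as root-relative ("/beachwalk-naples-florida/") or absolute URLs; adding 301 redirects for the already-crawled phantom URLs cleans up the existing 404s.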