Submitted URL marked 'noindex'
-
Search Console is reporting this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used a meta robots tag on these pages, nor have these pages been disallowed in robots.txt.
Previously this issue affected some 20+ pages. I tried to reindex them by submitting the URLs again; now the count has risen to 100+.
There is also a "Submitted URL blocked by robots.txt" issue for pages that are NOT disallowed in robots.txt.
Can anyone please suggest a solution here?
-
Then we DO need to see an example to work out why it's firing.
-
Those pages are allowed everywhere.
-
No, we haven't used the meta noindex tag in our HTML code. We don't even have noindex in the X-Robots-Tag header.
-
Forget robots.txt; it has nothing to do with pages being marked noindex. Either the meta noindex tag is being used somewhere in your code (the HTML), or it is being sent through your HTTP response via the X-Robots-Tag header. If you share a URL example we can work out which of those it is, which at least will narrow it down a little for you!
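If you want to check both possibilities yourself before sharing a URL, here is a minimal sketch (standard library only; `noindex_sources` is a hypothetical helper name, and the sample HTML/headers are made up) that inspects a page's HTML and its response headers for a noindex directive:

```python
import re

def noindex_sources(html: str, headers: dict) -> list:
    """Return every place a noindex directive was found for a page."""
    sources = []
    # 1. Meta robots tag in the HTML
    for m in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in m.group(0).lower():
            sources.append("meta robots tag")
    # 2. X-Robots-Tag HTTP response header
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        sources.append("X-Robots-Tag header")
    return sources

# Example: the HTML is clean, but the HTTP header blocks indexing
html = "<html><head><title>Example</title></head><body>ok</body></html>"
headers = {"X-Robots-Tag": "noindex, nofollow"}
print(noindex_sources(html, headers))  # ['X-Robots-Tag header']
```

In practice you would fetch the real HTML and headers with your HTTP client of choice and pass them in; if this returns an empty list for an affected page, the directive is coming from somewhere else entirely.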
-
If you have not set the robots tag to noindex in Yoast and you haven't hardcoded it somewhere in your <head>, there is still the WordPress option to discourage search engines from crawling/indexing the site: a checkbox under Settings > Reading.
Without more details we can only guess...
Related Questions
-
How do I create a segment that shows all pages using a certain keyword, and nothing that doesn't have that keyword?
There must be an easy answer to this, but I can't seem to find it. All I want to do is create a segment in Google Analytics that shows all pages and search strings with "orthopaedics" in the title, with pageviews, uniques, etc. If I simply navigate to "All Pages" in Google Analytics and then click Advanced Filters and do an Include Page Contains "orthopaedics", it works just fine (see attached screenshot). But when I try to recreate this as a segment, it pulls in all the other pages users visited before arriving on the orthopaedics page I want to include, which I don't want. I can manually exclude each URL I don't want, but this is tedious, and I feel there must be a simpler method I'm just missing. At the end of the day, I'm trying to create a list of every page and dynamically created query string that includes the word "orthopaedics", to say: doctor X, your orthopaedics section generated X views, and here's a list of the pages.
Reporting & Analytics | Patrick_at_Nebraska_Medicine
I get a 'Temporarily unreachable' error message when I 'Fetch as Google'. Any ideas please?
I wanted to fetch this page and got this error from Google: Temporarily unreachable. I've never had this issue before. I checked another page and it came back as 'Complete', so no problems there. Any ideas? Thank you in advance.
Reporting & Analytics | MissThumann
Weird URL Structure in GA
Hey everyone, Thanks in advance for any insight on this. I've been researching it quite a bit on Google and haven't found anything yet. In Analytics, under our pages report, we're getting a lot of pages that look like this: www.execucar.com/https://www.execucar.com or www.execucar.com/https://www.execucar.com/locations/orlando-car-service Any thoughts on how to fix this? These pages don't exist...I'm at such a loss.
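One common cause of GA paths like `www.execucar.com/https://www.execucar.com/...` is link markup where a full URL ends up being resolved as a relative path (for example a leading slash before the protocol, or a mangled protocol). A minimal sketch, using only the standard library and a hypothetical `suspicious_hrefs` helper, to scan page HTML for such links:

```python
import re

def suspicious_hrefs(html: str) -> list:
    """Flag hrefs that contain a full URL but would still resolve as a
    relative path, producing GA paths like /https://www.example.com/..."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, re.I)
    # An href is suspicious if it mentions http(s) but does not START with
    # a valid absolute protocol, so the browser treats it as relative.
    return [h for h in hrefs if "http" in h and not re.match(r"https?://", h)]

# The first link resolves relative to the current page, so GA records
# the path "/https://www.execucar.com/locations/orlando-car-service"
html = (
    '<a href="/https://www.execucar.com/locations/orlando-car-service">bad</a>'
    '<a href="https://www.execucar.com/">good</a>'
)
print(suspicious_hrefs(html))
# ['/https://www.execucar.com/locations/orlando-car-service']
```

Running something like this over your templates or a crawl export should point at the template producing the broken links; the other usual suspect is a misconfigured GA filter prepending the hostname.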
Reporting & Analytics | SuperShuttle
We have a client that wants to apply UTM URL tagging to track local organic traffic in Google Analytics. Is there any benefit in doing this?
One of our clients requested that we apply UTM URL tagging to better track organic traffic in Google Analytics. We found this to be an odd request because we are most familiar with UTM tracking for special campaigns (referral tracking, PPC, email tracking, etc.). Is there any benefit to applying UTM tags to URLs to analyze local organic traffic in Google Analytics? Are there any resources out there about this? Thanks!
Reporting & Analytics | RosemaryB
Google is not indexing all URLs
My website has company and event profiles from 200 countries, so it has lots of URLs. Earlier, in August 2014, Google used to crawl 90% of the URLs we submitted. Things went wrong when we shifted from http to https: we lost traffic. We are regaining it slowly, but the main concern is that Google still has not indexed all submitted URLs; it has crawled merely 8% of all URLs submitted. The site address is businessvibes.com. Any help would be appreciated.
Reporting & Analytics | irteam
What's the best way to figure out which keywords are the highest converting?
We have a client using Google Analytics. They currently have 3 goals set up to track when website visitors fill out 3 forms: Form A, Form B, Form C. I can easily figure out what traffic sources have driven the highest number of conversions on each form (Search for Form A, for instance, or Referrals for Form B), but of course, when I try to drill down on search terms that have driven conversions to each form, I get stuck in "not provided" territory. I'd like to know what people are searching for when they ultimately fill out each form. This will answer questions like: are people familiar with us already when they convert, or did they randomly find our website when searching for something we sell? It seems like there must be a way, using Google Webmaster Tools, Analytics, or another third-party app, to answer the question: what keyword searches are responsible for the highest number of conversions? Especially on a website that has traffic of 10,000+/month and a healthy dose of search traffic. Right? Where am I missing this information?
Reporting & Analytics | timfrick
How to detect where Google gets indexed URLs
Google somehow indexes links that create duplicate content. We don't understand how these links are created, so we would like to detect where Google's robots find them. We tried Moz Crawl Diagnostics, but it shows 0 as the Internal Link Count for these kinds of links. We tried to find a trace in Google Analytics (Site Content > All Pages) from the visitor side; there wasn't one. We tried to find some information in Webmaster Tools under Internal Links and HTML Improvements, but didn't find any trace there either. We also tried some search commands; is there maybe a good one for searching URLs in page source code, e.g. with https://search.nerdydata.com?
Reporting & Analytics | raido
Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
Some of my key pages got blocked by the robots.txt file. I have made the required changes in robots.txt, but how can I get the list of blocked URLs? My Webmaster Tools page (Health > Blocked URLs) shows only a number, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into searches? One other interesting point is that the blocked pages are still showing up in searches: the title appears fine, but the description shows "blocked by robots.txt". I need an urgent recommendation, as I do not want to see any further drop in my traffic.
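One way to audit this yourself is to replay your robots.txt rules against your own URL list with Python's standard-library `urllib.robotparser`. The robots.txt content and URLs below are placeholders for illustration, not your actual site:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt rules and URL list; substitute your own site's values.
robots_txt = """User-agent: *
Disallow: /private/
"""
urls = [
    "https://example.com/private/report.html",
    "https://example.com/public/page.html",
]

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Keep only the URLs Googlebot is NOT allowed to fetch under these rules
blocked = [u for u in urls if not rp.can_fetch("Googlebot", u)]
print(blocked)  # ['https://example.com/private/report.html']
```

Feeding in your sitemap URLs (or a crawl export) gives you the blocked list that Webmaster Tools only summarizes as a count; once the pages are unblocked, resubmitting the sitemap lets Google recrawl them.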
Reporting & Analytics | csfarnsworth