Submitted URL marked 'noindex'
-
Search Console is reporting this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used any meta robots tag on these pages, nor have these pages been disallowed in robots.txt.
Previously this issue affected around 20 pages. I tried to get them reindexed by submitting the URLs again. Now the count has risen to 100+.
There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt.
Can anyone please suggest a solution here?
-
Then we DO need to see an example to work out why it's firing
-
Those pages are allowed everywhere.
-
No, we haven't used a meta noindex tag in our HTML code. We don't even have noindex in the X-Robots-Tag header.
-
Forget robots.txt; it has nothing to do with pages being marked noindex. Either the meta noindex tag is being used somewhere in your code (the HTML), or it is being sent through your HTTP response via the X-Robots-Tag header. If you share an example URL we can work out which of those it is, which will at least narrow it down a little for you!
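To check both of those places yourself, something like this works (a rough sketch, not a full crawler; the sample markup below is hypothetical, and you'd fetch the live page's HTML and headers first with curl or requests):

```python
import re

def noindex_sources(html, headers):
    """Return which sources set a noindex directive for a page.

    html: the page's HTML as a string.
    headers: dict of HTTP response headers with lowercased keys.
    """
    sources = []
    # 1. The X-Robots-Tag HTTP header, e.g. "X-Robots-Tag: noindex"
    if "noindex" in headers.get("x-robots-tag", "").lower():
        sources.append("x-robots-tag header")
    # 2. A <meta name="robots" content="noindex..."> tag in the HTML
    meta_contents = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if any("noindex" in c.lower() for c in meta_contents):
        sources.append("meta robots tag")
    return sources

# Example: a page with only a meta noindex tag
print(noindex_sources('<meta name="robots" content="noindex,follow">', {}))
```

If both come back empty for an affected URL, the noindex Google saw may have been temporary (e.g. a staging setting or a plugin update), in which case a fresh "Validate Fix" in Search Console is the next step.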
-
If you have not set the robots tag to noindex in Yoast and you haven't hardcoded it somewhere in your head, there is still the WordPress option that tells search engines not to index the site: the "Discourage search engines from indexing this site" checkbox under Settings > Reading.
Without more details we can only guess...
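For the separate "Submitted URL blocked by robots.txt" report, you can also sanity-check your rules with Python's built-in robotparser (the rules below are hypothetical placeholders; paste in the actual lines from your robots.txt):

```python
from urllib import robotparser

# Parse robots.txt content directly (no network fetch needed)
# and test whether a crawler may fetch specific URLs.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A path matching the Disallow rule is blocked...
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
# ...while other paths are allowed.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

If a URL that Search Console flags comes back allowed here, the report may be stale or Google may have fetched robots.txt at a moment it was unreachable, which Google treats as a blanket disallow.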