Search Console validation taking a long time?
-
Hello! I did something dumb back in the beginning of September. I updated Yoast and somehow noindexed a whole set of custom taxonomy pages on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing.
Thank you! ^_^
-
You're welcome.
We as a community are here to help. If your issue is now fixed, you could mark this question as answered.
Best of luck.
GR -
Cool! Thanks Gaston! I'm glad I asked about this! ^_^
-
What sometimes happens is that when some URLs are marked as noindex, Googlebot reduces its crawl frequency for them, interpreting that you really don't want those pages indexed and that they hold little value.
What you just did was tell Googlebot to crawl that specific page and "force" it to analyze and render it. So Googlebot now understands that the noindex is no longer set and that the page should be indexed.
I'd wait a few days so that Googlebot naturally crawls your whole site again and eventually indexes every page that deserves to be indexed. If that doesn't happen in about two weeks, there is a tool in the old Search Console where you can tell Googlebot to crawl a single page and its links, under Crawl -> Fetch as Google. Request a URL to be fetched; after a few minutes a _Request indexing_ button will appear, and there you'll have the option to "Crawl this URL and its direct links". This image might come in handy: https://imgur.com/a/y5DbUVw
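If you'd rather not check the remaining URLs by hand one at a time, something like the script below can report which pages still serve a noindex, either in an X-Robots-Tag response header or in the robots meta tag. This is only a rough sketch using the Python standard library, and the URLs are placeholders for the affected taxonomy pages.

```python
# Rough sketch: report which pages still serve a noindex directive, either in
# an X-Robots-Tag response header or in a <meta name="robots"> tag.
# Standard library only; the URLs below are placeholders for the real
# taxonomy pages affected by the Yoast change.
import re
import urllib.request

URLS = [
    "https://example.com/project-type/residential/",
    "https://example.com/project-type/commercial/",
]

# Crude pattern: assumes the name attribute comes before content, which is how
# the tag is usually rendered.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

for url in URLS:
    req = urllib.request.Request(url, headers={"User-Agent": "noindex-check/1.0"})
    with urllib.request.urlopen(req) as resp:
        header = resp.headers.get("X-Robots-Tag") or ""
        html = resp.read().decode("utf-8", errors="replace")

    match = META_ROBOTS.search(html)
    meta = match.group(1) if match else ""
    still_noindexed = "noindex" in f"{header} {meta}".lower()
    print(f"{url}")
    print(f"  X-Robots-Tag: {header or '(none)'}")
    print(f"  meta robots:  {meta or '(none)'}")
    print(f"  noindex: {still_noindexed}")
```

Any URL that still reports noindex isn't actually serving the fix on the live page yet, whatever the settings screen says.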
I'm glad it helped before, and I hope this last tip helps you even more.
Best of luck.
GR -
Whoooooooaaaaahhhhhh! That fixed it! What's the deal!? lol. Why is this method instantaneous while the other method Google pointed me to is taking months?... Do I have to do this with each individual URL?
-
...or maybe that's what it found the last time it was crawled? I clicked the "Request indexing" button... we'll see what happens.
-
Hmmm, it says:
Indexing allowed? No: 'noindex' detected in 'robots' meta tag... but I have the settings in Yoast set up to allow indexing... do you think maybe changes in Yoast settings aren't applying retroactively?
-
Sorry to hear that.
It's possible that Googlebot still hasn't noticed that you've changed the noindex tag.
Would you mind checking what the _Inspect URL_ tool reports?
To find it, go to the new version of Search Console and enter one of the URLs that should be indexed in the textbox.
Then click the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif There you might get a hint of what is going on.
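For what it's worth, Google has since added a URL Inspection API to the Search Console API, so the same live check can be scripted for a batch of URLs rather than run one page at a time in the UI. The sketch below is only an illustration; it assumes a service account that has been added as a user on the property, and the key file path and URLs are placeholders.

```python
# Sketch only: querying the URL Inspection API (added later to the Search
# Console API) to get the same "Indexing allowed?" verdict as the UI.
# Assumes a service account JSON key whose account has been granted access to
# the Search Console property; file path and URLs below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://example.com/project-type/residential/",  # page to check
    "siteUrl": "https://example.com/",  # the Search Console property
}
result = service.urlInspection().index().inspect(body=body).execute()

status = result["inspectionResult"]["indexStatusResult"]
# indexingState reports values such as INDEXING_ALLOWED or BLOCKED_BY_META_TAG,
# which corresponds to the "Indexing allowed?" line shown in the UI.
print(status.get("verdict"), status.get("indexingState"))
```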
-
They're in Google Search Console, but I have tried searching for a couple of them and they don't appear to be indexed :-(. I tried the method you suggested and that didn't bring anything up either.
-
Hi Angela,
Those 5 out of 64 URLs... is that a report in Search Console, or do only 5 URLs appear when searching in Google?
Search Console usually takes a little longer to update its reports on index status. Have you tried a site: search? Also try the _inurl:_ operator.
For example: site:domain.com inurl:/category-noindexed/
Hope it helps.
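To make those checks quicker, the same site:/inurl: queries can be generated for each noindexed taxonomy path; a tiny sketch with placeholder values:

```python
# Tiny helper: build site:/inurl: queries (and clickable search links) for a
# few taxonomy paths. The domain and paths below are placeholders.
from urllib.parse import quote_plus

DOMAIN = "example.com"
PATHS = ["/project-type/", "/portfolio-category/"]

for path in PATHS:
    query = f"site:{DOMAIN} inurl:{path}"
    print(query)
    print("  https://www.google.com/search?q=" + quote_plus(query))
```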
Best of luck.
GR
Related Questions
-
Schema Markup Validator vs. Rich Results Test
I am working on a schema markup project. When I test the schema code in the Schema Markup Validator, everything looks fine, no errors detected. However, when I test it in the Rich Results Test, a few errors come back. What is the difference between these two tests? Should I trust one over the other?
Intermediate & Advanced SEO | Collegis_Education
-
Negative SEO & How long does it take for Google to disavow
Following on from a previous problem of 2 of our main pages completely dropping from the index, we have discovered that 150+ spam and porn domains have been directed at our pages (sometime in the last 3-4 months; we don't have an exact date). Does anyone have experience of how long it may take Google to take notice of a new disavow list? Any estimates would be very helpful in determining our next course of action.
Intermediate & Advanced SEO | Vuly
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019 and in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report, and many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is last crawled, and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | rickyporco
-
Keyword difficulty and time to rank
Hello, Is there a correlation between the keyword difficulty and the time it takes to rank? In other words, let's say I try to rank for the keyword "seo" and it is going to take 2 years to rank 1st, whereas if I go for "best seo tools in 2018" it takes just 2 weeks? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
How to Submit My new Website in All Search Engines
Hello Everyone, Can anybody help suggest good software, or any other way, to easily submit my website to all search engines? Any expert help would be appreciated. Thanks in advance.
Intermediate & Advanced SEO | falguniinnovative
-
Is having a .uk.com domain a hindrance for long-term SEO?
I know there has been some mention on Moz Q&A for .uk.com, but not for at least 3 years. So I wanted to see if any Mozzers out there knew if having a .uk.com domain would hinder our SEO long-term? Our company is finally now taking SEO seriously and we're planning some great stuff for the year ahead, but I have a feeling that our .uk.com domain may prevent us from out-ranking some of the bigger companies out there. Does anyone have any thoughts about this out there? Thanks 🙂
Intermediate & Advanced SEO | JamesPearce
-
Soft Hyphenation: Influence on Search Engines
Does anyone have experience with soft hyphenation and its effects on rankings? We are planning to use it in our company blog to improve the layout. Currently, every word above 4 syllables will be soft hyphenated. This seems to render okay in all browsers, but it might be a problem with IE9... In HTML5, the &shy; soft hyphen seems to be replaced with the <wbr> tag (http://www.w3schools.com/html5/tag_wbr.asp) and I don't find anything else about soft hyphenation in the specs. Any experiences or opinions about this? Do you think it affects rankings if there are a lot of soft hyphens in the text? Does it still make sense to use &shy;, or would you switch to <wbr> already?
Intermediate & Advanced SEO | zeepartner
-
Best way to block a search engine from crawling a link?
If we have one page on our site that is only linked to by one other page, what is the best way to block crawler access to that page? I know we could set the link to "nofollow" and that would prevent the crawler from passing any authority, and we can set the page to "noindex" to prevent it from appearing in search results, but what is the best way to prevent the crawler from accessing that one link?
Intermediate & Advanced SEO | nicole.healthline