Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Search Console validation taking a long time?
-
Hello! I did something dumb back in the beginning of September. I updated Yoast and somehow noindexed a whole set of custom taxonomy pages on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing.
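One way to double-check the fix across all 64 URLs, rather than waiting on Search Console's report, is to fetch each taxonomy page and inspect its robots meta tag. A minimal sketch, assuming Python with the third-party requests and beautifulsoup4 packages, and a hypothetical list of the affected URLs:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical list of the taxonomy archive URLs that were noindexed.
urls = [
    "https://example.com/custom-taxonomy/term-one/",
    "https://example.com/custom-taxonomy/term-two/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    # Look for <meta name="robots" content="..."> in the returned HTML.
    meta = soup.find("meta", attrs={"name": "robots"})
    content = meta.get("content", "").lower() if meta else ""
    status = "STILL NOINDEXED" if "noindex" in content else "ok"
    print(f"{status:<16} {url} (robots: {content or 'none'})")
```

If any URL still reports noindex, the problem is likely on the site side (a caching layer or a Yoast setting that didn't take) rather than Google's validation queue.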
Thank you! ^_^
-
You're welcome.
We as a community are here to help. If your issue is now fixed, you could mark this question as answered.
Best of luck.
GR -
Cool! Thanks Gaston! I'm glad I asked about this! ^_^
-
What sometimes happens is that when URLs are marked noindex, Googlebot reduces its crawl frequency for them, interpreting the tag as a signal that you really don't want those pages indexed and don't consider them valuable.
What you just did is tell Googlebot to crawl that specific page and "force" it to analyze and render it. Googlebot now understands that the noindex is no longer set and that the page should be indexed.
I'd wait a few days so that Googlebot naturally crawls your whole site again and eventually indexes every page that deserves to be indexed. If that doesn't happen in about two weeks, there is a tool in the old Search Console where you can tell Googlebot to crawl a single page and its links: under Crawl -> Fetch as Google. Request a URL to be fetched; after a few minutes a _Request indexing_ button will appear, with the option to "Crawl this URL and its direct links". This image might come in handy: https://imgur.com/a/y5DbUVw
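While you wait, one way to confirm whether Googlebot has actually recrawled the fixed pages is to check your server access logs for its hits. A rough sketch, assuming Python, an access log at a hypothetical path, and a hypothetical URL prefix for the affected taxonomy pages:

```python
import re

LOG_PATH = "/var/log/nginx/access.log"   # hypothetical path; adjust for your server
URL_PREFIX = "/custom-taxonomy/"         # hypothetical prefix for the fixed pages

# Match lines whose user-agent string claims to be Googlebot.
googlebot = re.compile(r"Googlebot", re.IGNORECASE)

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if googlebot.search(line) and URL_PREFIX in line:
            print(line.rstrip())
```

Note that a user-agent string can be spoofed; a reverse-DNS lookup on the requesting IP is the reliable way to confirm genuine Googlebot traffic.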
I'm glad the earlier suggestion helped and hope this one helps you even more.
Best of luck.
GR -
Whoooooooaaaaahhhhhh! That fixed it! What's the deal!? lol. Why is this method instantaneous while the other method Google pointed me to is taking months? ...Do I have to do this with each individual URL?
-
...or maybe that's what it found the last time it was crawled? I clicked the "Request indexing" button... we'll see what happens.
-
Hmmm, it says:
Indexing allowed? No: 'noindex' detected in 'robots' meta tag... but I have the settings in Yoast set up to allow indexing. Do you think maybe changes in Yoast settings aren't applying retroactively?
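Worth ruling out here: a noindex can also be delivered as an X-Robots-Tag HTTP response header, which won't show up in the page source, and a cache or CDN can keep serving a stale header or page after the Yoast setting changes. A quick sketch of the header check, assuming Python with the requests package and a hypothetical URL:

```python
import requests

url = "https://example.com/custom-taxonomy/term-one/"  # hypothetical URL
resp = requests.get(url, timeout=10)

# A noindex can live in the response headers, not just the HTML.
header = resp.headers.get("X-Robots-Tag", "")
print("X-Robots-Tag:", header or "(not set)")
if "noindex" in header.lower():
    print("noindex is being sent at the HTTP level - check server/CDN config.")
```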
-
Sorry to hear that.
It's possible that Googlebot still hasn't noticed that you've removed the noindex tag.
Would you mind checking what the _Inspect URL_ tool reports?
To find it, go to the new version of Search Console and enter one of the URLs that should be indexed in the textbox.
Then click the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif
There you might get a hint of what is going on.
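For checking many URLs at once, the same verdict the Inspect URL tool shows can also be pulled programmatically through Google's URL Inspection API, a newer addition to the Search Console API. A hedged sketch, assuming the google-api-python-client and google-auth packages, a hypothetical service-account key that has been granted access to the verified property, and a hypothetical site URL:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Hypothetical service-account key; the account must be added as a user
# on the Search Console property for the call to succeed.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://example.com/custom-taxonomy/term-one/",  # hypothetical
    "siteUrl": "https://example.com/",  # must match the verified property
}
result = service.urlInspection().index().inspect(body=body).execute()

# indexingState reports values such as INDEXING_ALLOWED or BLOCKED_BY_META_TAG.
print(result["inspectionResult"]["indexStatusResult"].get("indexingState"))
```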
-
They're in Google Search Console, but I have tried searching for a couple of them and they don't appear to be indexed :-(. I tried the method you suggested and that didn't bring up anything either.
-
Hi Angela,
Those 5 out of 64 URLs... is that a report in Search Console, or do only 5 URLs appear when searching in Google?
Search Console usually takes a little longer to update its reports on index status. Have you tried a site: search? Also try the _inurl:_ operator.
For example: site:domain.com inurl:/category-noindexed/
Hope it helps.
Best of luck.
GR