Search console validation taking a long time?
-
Hello! I did something dumb back in the beginning of September. I updated Yoast and somehow noindexed a whole set of custom taxonomies on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing.
Thank you! ^_^
-
You're welcome.
We as a community are here to help. If your issue is now fixed, you could mark this question as answered.
Best of luck.
GR -
Cool! Thanks Gaston! I'm glad I asked about this! ^_^
-
What sometimes happens is that when some URLs are marked as noindex, Googlebot reduces its crawl frequency for them, interpreting that you really don't want those pages indexed and so they have little value.
What you just did is tell Googlebot to crawl that specific page and "force" it to analyze and render it. So Googlebot now understands that the noindex is no longer set and that the page should be indexed.
I'd wait a few days so that Googlebot naturally crawls your whole site again and eventually indexes every page that deserves to be indexed. If that doesn't happen within about two weeks, there is a tool in the old Search Console where you can tell Googlebot to crawl a single page and its links. That is under Crawl -> Fetch as Google. Request a URL to be fetched; after a few minutes a _Request indexing_ button will appear, and there you'll have the option to "Crawl this URL and its direct links". This image might come in handy: https://imgur.com/a/y5DbUVw
I'm glad it helped previously and hope this last tip helps you even more.
Best of luck.
GR -
Whoooooooaaaaahhhhhh! That fixed it! What's the deal!? lol. Why is this method instantaneous while the other method Google pointed me to is taking months? ...Do I have to do this with each individual URL?
-
....or maybe that's what it found the last time it was crawled? I clicked the "request indexing" button.....we'll see what happens.
-
hmmm. it says:
Indexing allowed? No: 'noindex' detected in 'robots' meta tag... but I have the settings in Yoast set up to allow indexing... do you think maybe changes in Yoast settings aren't applying retroactively?
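A quick way to sanity-check what the live page is actually serving (independent of any cached report) is a rough script like the one below, which tests an HTML response for the two common noindex signals: the robots meta tag and the X-Robots-Tag header. The sample pages here are made up for illustration, not the real URLs from this thread:

```python
import re

def has_noindex(html, headers=None):
    """Rough check for the two common noindex signals:
    the robots meta tag and the X-Robots-Tag HTTP header."""
    headers = headers or {}
    # Header form: X-Robots-Tag: noindex
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta tag form: <meta name="robots" content="noindex,follow">
    meta = re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return bool(meta and "noindex" in meta.group(1).lower())

# Hypothetical sample pages, not the actual site in question
blocked = '<head><meta name="robots" content="noindex,follow"></head>'
open_page = '<head><meta name="robots" content="index,follow"></head>'
print(has_noindex(blocked))    # True
print(has_noindex(open_page))  # False
```

If a check like this still finds the tag after the Yoast settings change, a caching layer serving stale HTML is a common culprit.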
-
Sorry to hear that.
It's possible that Googlebot still hasn't found out that you've changed the noindex tag.
Would you mind checking what the _Inspect URL_ tool reports?
To find that, go to the new version of Search Console and enter one of the URLs that should be indexed in the textbox.
Then click on the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif There you might get a hint of what is going on.
-
They're in google search console, but I have tried searching for a couple of them and they don't appear to be indexed :-(. I tried the method you suggested and that didn't bring up anything either.
-
Hi Angela,
Those 5 out of 64 URLs... is that a report in Search Console, or do only 5 URLs appear when searching in Google?
Search Console usually takes a little longer to update its reports on index status. Have you tried a site: search? Also try the _inurl:_ operator.
For example: site:domain.com inurl:/category-noindexed/
Hope it helps.
Best of luck.
GR
Related Questions
-
Cannibalization vs long tail keyword dilemma
Hi all. I have a dilemma that I'm trying to work out a solution to and could use some input. We offer a Foreign Qualification (FQ) service for businesses, and thus "foreign qualification" is a strong keyword for which we currently hold great ranking position for our service page. FQ is different in each state, so we have a series of blog posts focusing on the requirements for each state. "Alabama foreign qualification" is one of many long tail keywords (50 states x various phrasings) we're targeting here. The problem is that it's impossible to write 50 blog posts that are not very similar content, since the process is similar, just not identical, in each state. I'm worried about duplicate content penalties here. I'm thinking that I'd want to create a landing page that serves as a hub for each of these blog posts, perhaps with a reference table for the 50 states too, and set the blog post canonicals to this landing page (thereby pushing all state-focused long tail KWs there). However, I don't want to take away ranking strength of the aforementioned service page for the primary keyword. If I do this, and also link the new landing page to the service page using "foreign qualification" as the anchor text, am I more likely to add or take away from the strength of the service page? Thanks for any and all insight!
Intermediate & Advanced SEO | | mkupfer1 -
Hostage Taking by My Wordpress Developer
Since 2013 a WordPress developer has coded my real estate website. Their hourly rate is $24, but the programmers take too long to perform tasks and the service has become prohibitively expensive. Examples of unreasonable time estimates below:
Intermediate & Advanced SEO | | Kingalan1
1. Change theme settings so posts/pages do not display a date. -> 7 hrs
2. Google search results are displaying the breadcrumb on the top of each page rather than the URL. Please correct so this does not display. -> 3 hrs
3. Install SSL certificate to www.metro-manhattan.com domain -> 8 hrs
The above does not include 5-6 hours for testing. I am considering changing vendors. Potential programmers have asked how the site was developed and to what extent it is customized. It turns out several plugins were built from scratch. My question is whether a new developer will be able to pick up a custom-coded site, or whether, without understanding how the site was built, any change will break it. My concern is that the current developer has made themselves indispensable and created a situation where there is no alternative to using them, so they can charge any price they want. Any thoughts? Also below are the questions I asked my developer about how the site was built, and their answers:
1. Was everything coded using a child theme? -- No, it is a custom theme.
2. Did you use any ready-made theme, or just plugins? -- We used the theme and we've used plugins: third-party plugins and plugins built from scratch.
3. Can WordPress and every one of the plugins be updated? -- WordPress can be updated; core files were never modified. If something starts to work wrong after an update, it is due to some radical WordPress change or similar. Can't be updated: FireStorm Professional Real Estate Plugin. Created at xxx: Form Submissions Report, Miscellaneous Hooks and Filters, NYC Check memory usage, NYC SEO listings, NYC Slider, Sitemap Updater.
4. Were any of the plugins customized and if so, which ones? -- Yes, this plugin: "FireStorm Professional Real Estate Plugin". -
"Null" appearing as top keyword in "Content Keywords" under Google index in Google Search Console
Hi, "Null" is appearing as the top keyword in Google Search Console > Google Index > Content Keywords for our site http://goo.gl/cKaQ4K. We do not use "null" as a keyword on the site, and we are not able to find out why Google is treating "null" as a keyword for it. Is anyone facing such an issue? Thanks & Regards
Intermediate & Advanced SEO | | vivekrathore0 -
How to setup multiple pages in Google Search?
How do you set up multiple pages (sitelinks) in Google Search? I have seen sites that are arranged in Google like:
Intermediate & Advanced SEO | | Hall.Michael
Website in Google
About us. Contact us.
Services. Etc. Kindly review the screenshot. Can this be achieved with the Yoast plugin? X9vMMTw.png0 -
Long urls created by filters (not with query parameters)
A website adds subfolders to a category URL for each filter that's selected. In a crawl of the website some of these URLs reach over 400 characters. For example, if I select shoe size 5, 5.5 and 6, white and blue colour, price $70-$100, heel and platform styles, the URL will be as follows: www.example.com/shoes/womens/filters/shoe-size--5--5.5--6/color--white--blue/price--70-100/style--heel--platform There is a canonical that points to www.example.com/shoes/womens/ so it isn't a duplicate content issue. But these URLs still get crawled. How would you handle this? It's not a great system so I'm tempted to tell them to start over with best practice recommendations, but maybe I should just tell them to block the "/filters/" folder from crawlers? For some products however, filtered content would be worth having in search indexes (e.g. colour).
Intermediate & Advanced SEO | | Alex-Harford0 -
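If the team does decide to keep the filter combinations out of crawlers' reach, blocking the folder at the crawl level is one option. A robots.txt sketch like the one below (paths hypothetical, modelled on the URL pattern in the question) would disallow everything under /filters/ while carving out colour-only filter pages; Google honours the longest matching rule, so the more specific Allow wins for URLs it matches:

```text
User-agent: *
# Block every generated filter combination...
Disallow: /shoes/womens/filters/
# ...but allow colour-first filter pages, which may deserve indexing
Allow: /shoes/womens/filters/color--
```

One caveat: robots.txt stops crawling, not indexing, so URLs already in the index can linger, and Googlebot can no longer see the canonical tag on the pages it is blocked from fetching.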
Rankings and search traffic fell off a cliff
Hi Moz community, One of my clients has a beast of a website built in ASP.NET (which causes me problems cos I don't have much experience in that) It is a job-site that aggregates job opportunities from other job-sites and provides a job matching service by email etc. They used to have great presence on Google naturally for thousands of job searches. Since Penguin and Penguin 2.0 (I think) their traffic has fallen off a cliff. I have been doing some "off-page" experimentation, seeing if we can fix a lot of issues by re-sculpting their backlink profile (seeing as it was after penguin). but what I have found is that some pages respond to this off page work but some just do not at all, despite how we approach it, such as disavowing previous links building fresh new top quality content links with natural anchor text etc.... Which has lead me to the conclusion that the wider issue is on-page and potentially site structure. Unfortunately as it is ASP.NET I am not so comfortable diagnosing the issues. I think also some issues will be related to dupe content etc.... but I would LOVE to get some input from my learned Moz colleagues. The website is http://www.allthetopbananas.com/ - any tips on how to recover from this dramatic loss of traffic would be massively appreciated. Kind regards
Intermediate & Advanced SEO | | websearchseo0 -
Page indexed but not showing up at all in search results
I am currently working on the SEO for a roofing company. I have developed geo-targeted pages for both commercial and residential roofing (as well as attic insulation and gutters) and have hundreds of 1st-page placements for the geo-targeted keywords. What is baffling me is that they are performing EXTREMELY poorly in the bigger cities, to the point of not even showing up in the first 5 pages. I also target a page specifically for roof repair in Phoenix and it is not coming up AT ALL. This is not typically the result I get when directly targeting keywords. I'm working on implementing keyword variations as well as adding about 10 or so information pages (@ 700 words) regarding different roofing systems, which I plan to cross-link on the site, etc. I'm just wondering if there is a simple answer as to why the pages I want to be showing up the most are performing so poorly, and what I would need to do to improve their rankings.
Intermediate & Advanced SEO | | dogstarweb0 -
Block search engines from URLs created by internal search engine?
Hey guys, I've got a question for you all that I've been pondering for a few days now. I'm currently doing an SEO technical audit for a large-scale directory. One major issue they are having is that their internal search system (Directory Search) creates a new URL every time a search query is entered by the user. This creates huge amounts of duplication on the website. I'm wondering if it would be best to block search engines from crawling these URLs entirely with robots.txt? What do you guys think? Bearing in mind there are probably thousands of these pages already in the Google index? Thanks Kim
Intermediate & Advanced SEO | | Voonie0
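If the robots.txt route is chosen, a minimal sketch would look like this (the /directory-search/ path is hypothetical; substitute the site's real search-URL pattern):

```text
User-agent: *
# Hypothetical path -- match it to the directory's actual search URLs
Disallow: /directory-search/
```

Keep in mind that robots.txt only blocks crawling. For the thousands of URLs already indexed, it is often better to serve a noindex robots meta tag first, let Googlebot recrawl and drop them, and only then add the Disallow, since a noindex tag on a blocked page can never be seen.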