How to Diagnose "Crawled - Currently Not Indexed" in Google Search Console
-
The new Google Search Console gives a ton of information about which pages were excluded and why, but one status I'm struggling with is "Crawled - currently not indexed". I have some clients that have fallen into this pit, and I've identified one reason why it's occurring for some of them - they have multiple websites covering the same information (local businesses) - but for others I'm completely flummoxed.
Does anyone have any experience figuring this one out?
-
@intellect did you find a solution to that?
-
@dalerio-consulting what can we do with the excluded section then? Let's say a page of my website is flagged under a duplicate canonical tag issue in the excluded section. Should I leave it alone if it's not very serious, or should I request indexing? Are these excluded-page issues serious enough to act on?
-
Hey Brett!
Basically, what we believe this status means is Google saying "I can crawl and access the URL, but I don't believe this page belongs in the index". The key here is to figure out why Google might not believe the page should be considered for indexation. We analyzed a good number of Index Coverage reports across all of our different clients.
Here are the most common reasons URLs get reported as "Crawled - currently not indexed":
- False positives
- RSS Feed URLs
- Paginated URLs
- Expired products
- 301 redirects
- Thin content
- Duplicate content
- Private-facing content
You can find a breakdown of each reason on the post we wrote here: https://moz.com/blog/crawled-currently-not-indexed-coverage-status
However, there are likely many more reasons why Google doesn't think a page is eligible for indexation.
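Before digging into the content-quality reasons above, it's worth ruling out a stray `noindex` directive on the affected URLs. Here is a minimal diagnostic sketch (the helper function, sample HTML, and header values are illustrative, not from this thread) that checks a fetched page's robots meta tag and `X-Robots-Tag` response header:

```python
import re

def find_noindex_signals(html: str, headers: dict) -> list:
    """Return any noindex signals found in a page's HTML and HTTP headers."""
    signals = []
    # Meta robots tag, e.g. <meta name="robots" content="noindex, follow">.
    # This simple regex assumes the name attribute comes before content.
    for match in re.finditer(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
            html, re.IGNORECASE):
        if "noindex" in match.group(1).lower():
            signals.append(f"meta robots: {match.group(1)}")
    # X-Robots-Tag response header, e.g. "X-Robots-Tag: noindex"
    header_value = headers.get("X-Robots-Tag", "")
    if "noindex" in header_value.lower():
        signals.append(f"X-Robots-Tag header: {header_value}")
    return signals

# Example: a page that carries both kinds of noindex signal
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(find_noindex_signals(html, {"X-Robots-Tag": "noindex"}))
```

In practice you would fetch the live URL (e.g. with `requests`) and pass its body and headers in; pages with no signals return an empty list and you can move on to the content-quality checks.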
-
"Crawled - currently not indexed" is the most common way for pages or posts on your site to end up unindexed. It is also the most difficult one to pinpoint, because it happens for a multitude of reasons.
Google needs computing power to analyze each website, so it assigns a certain crawl budget to each site, and that crawl budget limits how many pages of your site get crawled and considered for indexing. Google will always index your top pages first, so the excluded pages tend to be the weaker ones ranking-wise.
Every website has pages that are not indexed, and the healthy ratio of non-indexed pages will depend on the niche of the website.
There are however 2 ways for you to get your pages out of the "Crawled - Currently not indexed" pit:
- Decrease the number of pages/posts. It's a matter of quality vs. quantity, so put more attention into internally linking every new post so that it gets indexed in no time. Don't forget to use robots.txt to block crawling of pages that aren't useful to the site, so that the crawl budget can be spent on the other posts.
- Increase the crawl budget. You can do that by raising the quality of the pages/posts. Build more internal links and external backlinks to your posts and homepage, make sure the articles are unique and keyword-optimized, and work hard so that each article can rank on that first page.
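The robots.txt suggestion above can be sketched like this (the paths and domain are hypothetical examples, not recommendations for any specific site; note that robots.txt blocks crawling, not indexing, so use it for low-value URLs you don't want eating crawl budget):

```text
# Hypothetical robots.txt: keep crawlers away from low-value URLs
# so crawl budget is spent on the pages you want indexed.
User-agent: *
Disallow: /tag/
Disallow: /search?
Disallow: /*?sort=

Sitemap: https://www.example.com/sitemap.xml
```

If a page must be kept out of the index (rather than just out of the crawl queue), a `noindex` meta tag on the page itself is the more direct tool.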
SEO is a tough business, but if managed carefully, over time it will pay off.
Daniel Rika - Dalerio Consulting
https://dalerioconsulting.com
info@dalerioconsulting.com
-
Crawled - currently not indexed list includes sitemap and robots.txt
We have searched and tried to understand this issue, but we have not reached a final result.
If anyone has fixed this issue, please share your suggestions as soon as possible.
-
Hi There,
Google has been struggling to eliminate spam pages and content and to order sites structurally; this is an inherent problem, especially with badly structured e-commerce websites.
You might be aware that "Crawled - Currently Not Indexed" means that your page(s) have been found by Google but are not currently indexed. This might not be an error; your pages may simply be in a queue. That might be due to the following reasons:
- There are a lot of pages to index, so it's going to take Google some time to get through them and mark them as either indexed or not.
- There might be duplicate pages or canonical issues on the site. Google might be seeing a lot of duplicate pages without canonical tags; to improve the number of pages indexed, you need to either improve the pages so they are no longer duplicates, or add canonical tags to help Google attribute them to the correct page.
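The canonical-tag fix mentioned above is a one-line addition in the `<head>` of each duplicate or variant page (the URL here is a hypothetical example):

```html
<!-- Hypothetical example: on a duplicate or variant URL, point Google
     at the preferred version so indexing signals consolidate there. -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```

Every duplicate should point at the same preferred URL, and the preferred URL itself typically carries a self-referencing canonical.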
You need to justify each and every page on its merits, and then let Google decide whether it should be available in search, and against which keywords at what rank. To summarise, just help Google Search by structuring your data right; it might reward you by ranking your pages in the right places for the right keywords.
Thanks and Regards,
Vijay
-
Search Console > Status > Index Coverage > Crawled - currently not indexed
Yes, I had the same issue last month. In my case it took the crawler six weeks to update the Index Coverage report, and apparently there is not much you can do about it.
Regards