Google has discovered a URL but won't index it?
-
Hey all, I have a really strange situation I've never encountered before. I launched a new website about 2 months ago. It took an awfully long time to get indexed, probably 3 weeks. When it did, only the homepage was indexed.
I completed the site and all its pages, then made and submitted a sitemap...all about a month ago. The coverage report shows that Google has discovered the URLs but not indexed them. Weirdly, 3 of the pages ARE indexed, but the rest are not.
So I have 42 URLs in the coverage report listed as "Excluded," and 39 of those say "Discovered - currently not indexed." When I inspect any of these URLs, it says "this page is not in the index, but not because of an error." They are listed as either "Crawled - currently not indexed" or "Discovered - currently not indexed."
But 3 of them ARE in the index, and when I updated those pages, the changes were reflected in Google's index. I have no idea how those 3 made it in while the others didn't, or why the crawler came back and indexed the changes but continues to leave the others out.
Has anyone seen this before and know what to do?
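For anyone checking their own setup at this step, a sitemap in its minimal valid form looks like this (the example.com URLs and date are placeholders, not Daniel's actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/reviews/sample-review/</loc>
  </url>
</urlset>
```

Worth remembering that a valid sitemap only makes URLs discoverable; it does not guarantee indexing, which is consistent with the "Discovered - currently not indexed" status described above.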
-
Good luck!
-
Thanks Will, appreciate the insight. I'm going to get the Bing and Google WordPress plugins on there to see if that helps, build up a few more links, and give it some time. Thanks!
-
You're not the only person reporting odd indexation happenings here on Q&A (see for example this question). And, just like I found for that question, your site appears to have more pages indexed in Bing than in Google - which at least seems to point to us not having missed something obvious like meta noindex or similar.
I did also read Google saying that they had issues with the site: command (link) but I don't think that can have anything to do with your situation as they say they have now fixed that issue, and I couldn't find any other pages on your site even with non-site: searches (i.e. it does genuinely appear as though those pages are missing from the index).
While I am loath to point straight at links these days, I do wonder if in this case the site simply needs more authority before it is seen as big enough and important enough to justify more pages in the index.
-
Thanks, I've actually submitted requests to be indexed multiple times over the last 3 weeks, to no avail.
-
Hey Daniel. I agree with Chris. I have also noticed slow indexation recently. Might be a pain in the arse, but maybe you should request each page to be indexed individually in Search Console to add them to the high priority queue.
-
Hi Will, thanks for reaching out! No, not resolved yet. Still struggling to figure this out. I sent you a message on Facebook and LinkedIn - would love to connect and try to get this figured out!
-
We keep adding blog posts almost every day, still not getting in the index for some reason. Discovered, yes. Crawled, yes. But not indexed, and no errors or anything.
-
Hi Daniel. Did you get this resolved / did it resolve itself? If not, I'd happily take a look - just let me know the URL.
-
My advice is: start listing more reviews! They will be picked up by Google automatically. You've got to be a bit more patient; new websites take an awfully long time to be indexed.
I had a domain that was 10+ years old, replaced its website, and it was completely reindexed within one day. I have new domains that can take weeks or even a month to be indexed. It's normal.
-
I actually got a quality link 2 weeks ago, but the blog post the link was published in still isn't indexed by Google either. The rest of his site is, just not his newest article for some reason, and it's 2 weeks old now. Another mystery...
-
It's a review website, and only 3 of my 24 reviews are indexed. All are discovered, most even crawled, but only those three are in the index. And when I updated them, the search listing in Google's results was updated within a few days. So they came back, are aware of the changes, but just aren't adding the others to the index.
And there are no affiliate links on this site at all. No spam, no links to spam, and I've attached a blog with well-written 500+ word articles (about 20 so far) - and none of the blog posts are indexed either.
I've never seen anything like this. The content is good, but almost none of it is getting indexed for some reason, despite being discovered and crawled.
-
Get quality links.
-
It's a new domain with no previous ownership, and no issues detected in Search Console for manual actions or security. There's no robots.txt blocking, no noindex, or anything like that going on. They just won't index a bunch of the pages for some reason, and it's very odd.
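For anyone who wants to verify this on their own site, here is a rough Python sketch (standard library only; the HTML snippets are made up for illustration) of the kind of noindex check being ruled out here - it looks for a robots meta tag in the page source and for a noindex directive in an X-Robots-Tag response header:

```python
from html.parser import HTMLParser


class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots" ...> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())


def is_noindexed(html, headers=None):
    """True if the page carries a noindex directive in its meta tags
    or in an X-Robots-Tag HTTP header (pass headers as a dict)."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    directives = list(finder.directives)
    if headers:
        directives.append(headers.get("X-Robots-Tag", "").lower())
    return any("noindex" in d for d in directives)


# Made-up page sources for illustration:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))                          # True  -> blocks indexing
print(is_noindexed("<html><head></head></html>"))  # False -> no directive found
```

In practice you'd run this against the live page HTML and response headers; a page can be crawled fine yet still carry a header-level noindex that's easy to miss when you only look at the source.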
-
Perhaps the content on those 42 pages or so is largely copied content? Or pages that don't really matter enough to show up in search?
-
I'd hold off worrying about it for now. I've heard many people talk about slow indexation lately. In the meantime, aside from the obvious check-for-nofollow/noindex/robots.txt suggestions, have you looked into the history of this domain? By chance, was it penalized before you bought it?
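On the robots.txt part of that checklist: one quick way to test whether a given URL is blocked is Python's built-in robotparser (the rules and URLs below are invented for illustration, not taken from Daniel's site):

```python
from urllib.robotparser import RobotFileParser

# Invented robots.txt rules for illustration; in practice you'd call
# set_url() with the live robots.txt URL and then read().
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may crawl the blog, but nothing under /private/:
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1/"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page/"))  # False
```

Note that a robots.txt Disallow only blocks crawling, not indexing - which is the opposite of this thread's symptom, where pages are crawled but not indexed.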