Pages not indexed by Google
-
Two weeks ago we removed all the nofollow values from our website, but the number of pages indexed by Google is the same as before. Can you explain why?
Website: www.probikeshop.fr
-
Good advice from Andrea and Brent.
To use multiple sitemaps, do something like this:
The main sitemap points to the other sitemap files.
You can have up to 50,000 URLs in those files.
- mine are gzipped
This one is sitemap_index.xml
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://yourdomain.com/writermap.xml.gz</loc>
    <lastmod>2012-03-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://yourdomain.com/mainmap.xml.gz</loc>
    <lastmod>2012-03-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://yourdomain.com/201201.xml.gz</loc>
    <lastmod>2012-03-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://yourdomain.com/201202.xml.gz</loc>
    <lastmod>2012-03-15</lastmod>
  </sitemap>
</sitemapindex>
Here is a tip:
Google will index some of those pages and some it will not index.
If you have 5,000 URLs in one sitemap and Google only indexes 4,957,
you probably can't work out which 43 URLs they didn't index,
so if you make the numbers smaller, it becomes easier to discover the pages they don't like.
- not easy, but easier
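As a sketch of that tip, here is one way to split a large URL list into smaller sitemap files plus a sitemap index. The file names and the 1,000-URL chunk size are assumptions, not from this thread; the protocol allows up to 50,000 URLs per file.

```python
# Split a large URL list into small sitemap files plus a sitemap index,
# so unindexed pages are easier to pin down.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return one <urlset> document for a chunk of URLs."""
    entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls)
    return f'<?xml version="1.0" encoding="UTF-8"?><urlset xmlns="{SITEMAP_NS}">{entries}</urlset>'

def split_sitemaps(urls, chunk_size=1000):
    """Yield (filename, xml) pairs, one per chunk of chunk_size URLs."""
    for i in range(0, len(urls), chunk_size):
        yield f"sitemap-{i // chunk_size + 1}.xml", build_sitemap(urls[i:i + chunk_size])

def build_index(base_url, filenames, lastmod="2012-03-15"):
    """Return the sitemap_index.xml that points at the chunk files."""
    entries = "".join(
        f"<sitemap><loc>{base_url}/{name}</loc><lastmod>{lastmod}</lastmod></sitemap>"
        for name in filenames
    )
    return f'<?xml version="1.0" encoding="UTF-8"?><sitemapindex xmlns="{SITEMAP_NS}">{entries}</sitemapindex>'
```

With 1,000-URL chunks, a 43-URL gap in one file narrows the search a lot compared to one 50,000-URL sitemap.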
-
Well, there are a lot of ways to look at this. Removing nofollow wouldn't result in more pages being indexed, so the two issues are totally separate.
If the goal is to get more pages indexed, then a sitemap (either XML or even a text list) uploaded to your server for Google to find can help. Or, at least, it makes sure Google is finding and indexing the pages you want it to find. Your Google Webmaster Tools account (assuming you have one) will also show you some of this data.
For example, we used to have 100K+ pages; many weren't quality content I wanted to rank. Like a PDF of a catalog ranking above the product page. So I reduced the number of pages indexed so Google would have better-quality content to serve to searchers.
Using Xenu or Screaming Frog is another good way to help uncover pages. Those tools crawl your site like Google would; then you can download the file and not only see all the URLs found, but also whether they return 301/404/200, etc. And Screaming Frog can crawl your site and output an XML sitemap for you (it's an easier way to make one).
I prefer Screaming Frog; it's about $150 for a license and well worth it.
As for why: if you have a lot of pages, Google doesn't always find them all. That's where a sitemap can help (it directs Google what to crawl). Otherwise, there could be technical issues with a bunch of pages, or they aren't properly linked up, and that could be causing the issue.
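As a rough sketch of that crawl-export workflow, the downloaded file can be filtered for non-200 URLs with a few lines of Python. The column names "Address" and "Status Code" match Screaming Frog's default CSV export, but verify them against your own file's header row:

```python
# Read a crawl export (CSV) and list every URL that doesn't return 200,
# so redirected or broken pages stand out.
import csv

def non_200_urls(csv_path):
    """Return [(url, status), ...] for every row whose status isn't 200."""
    problems = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            status = row.get("Status Code", "").strip()
            if status and status != "200":
                problems.append((row["Address"], status))
    return problems
```

Running this against a full-site crawl gives you a punch list of 301s and 404s to fix before worrying about indexation.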
-
So, according to you, it's normal that we don't have more pages indexed by Google even though we deleted the nofollow values?
Google actually indexes 28,200 pages, but I'm sure we have more pages on the site.
Where could the problem come from?
Thanks
-
Do you have XML sitemaps? If not, this is a great way to measure what is being indexed by Google. Make sure you create multiple sitemaps based on your categories so you can track exactly which pages are not being indexed.
-
'Nofollow' isn't the same as a 'noindex' directive. Nofollow just tells the search engine that a link "should not influence the link target's ranking in the search engine's index." Noindex is where you tell the crawler not to index the page; you can remove it later if you want the page indexed at some future point.
So, in theory, what you did wouldn't have anything to do with how many pages are indexed on your site anyway.
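For illustration, the two directives look like this in page markup (the URL here is a placeholder):

```html
<!-- nofollow: applies to a single link; the link passes no ranking signal,
     but the target page can still be crawled and indexed -->
<a href="http://example.com/some-page" rel="nofollow">Some page</a>

<!-- noindex: a page-level directive placed in the <head>; the page can be
     crawled but is kept out of the index until the tag is removed -->
<meta name="robots" content="noindex">
```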
Related Questions
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't had a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I'm not sure how to get all our pages indexed if it's going to operate like this... I'd love some help, thanks!
Technical SEO | RyanTheMoz
-
Google has deindexed a page it thinks is set to 'noindex', but is in fact still set to 'index'
A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?
Technical SEO | d.bird
-
Why Are Some Pages On A New Domain Not Being Indexed?
Background: A company I am working with recently consolidated content from several existing domains into one new domain. Each of the old domains focused on a vertical, and each had a number of product pages and blog pages; these are now in directories on the new domain. For example, what was www.verticaldomainone.com/products/productname is now www.newdomain.com/verticalone/products/productname, and the blog posts have moved from www.verticaldomaintwo.com/blog/blogpost to www.newdomain.com/verticaltwo/blog/blogpost. Many of those pages used to rank in the SERPs, but they now do not.
Investigation so far: Looking at Search Console's crawl stats, most of the product pages and blog posts do not appear to be being indexed. This is confirmed by using the site: search modifier, which only returns a couple of products and a couple of blog posts in each vertical. Those pages are not the same as the pages with backlinks pointing directly at them. I've investigated the obvious points without success so far:
There are a couple of issues with 301s that I am working with them to rectify, but I have checked all pages on the old site and most redirects are in place and working.
There is currently no HTML or XML sitemap for the new site (this will be put in place soon), but I don't think this is the issue, since a few products are being indexed and appearing in SERPs.
Search Console is returning no crawl errors, manual penalties, or anything else adverse.
Every product page is linked to from the /course page for the relevant vertical through a followed link.
None of the pages have a noindex tag on them, and the robots.txt allows all crawlers to access all pages.
One thing to note is that the site is built using React.js, so all content is within app.js. However, this does not appear to affect pages higher up the navigation trees, like the /vertical/products pages or the home page.
So the question is: "Why might product and blog pages not be indexed on the new domain when they were previously and what can I do about it?"
Technical SEO | BenjaminMorel
-
Why did my website's main page suddenly disappear from Google search?
Hello friends, I need help. I've lost everything, and I don't know what has happened to my website (hindustanfoods.com). My site has suddenly disappeared from Google search. In 2013 it was at the top of the first page, but I suddenly lost that position; in fact, the site isn't showing on any page now. I have hosted a temp domain which links some of the restaurant pages to our main website (hindustanfoods.com), and that temp domain looks fine in Google search even though it is only a 4- or 5-page website, while the main website has good pages. Somebody help: do we need to start over from scratch? Please help me.
Technical SEO | Tufail
-
Drastic increase of indexed pages correlated to rankings loss?
Our ecommerce website has had a drastic increase in indexed pages and an equal loss of Google organic traffic. After 10/1 the number of indexed pages jumped from 240k to 5.7 million by the end of the year, according to GWT. Coincidentally, the sitemap tops out at 14,192 pages, with 13,324 indexed. Organic traffic on some top keyphrases began declining by half after 10/26, and rankings (previously in the top 5 spots) have dropped to the fifth page of results. This website does produce session IDs (/c=), so we have been blocking /c=/ in the robots.txt file. We also have a rel=canonical on all pages pointing at the correct URL. With all of this in place, traffic hasn't recovered. Is there a correlation between this spike of indexed pages and the lost keyword rankings? Any advice on how to investigate and correct this further would be greatly appreciated. Thanks.
Technical SEO | marketing_zoovy.com
-
Google Indexed Only 1 Page
Hi, I'm new and hope this forum can help me. I recently resubmitted my sitemap and Google only indexed 1 page. I can still see many of my old indexed pages in the SERPs. I have upgraded my template and graded all my pages to A's on SEOmoz, and I have solid backlinks that I have been building over time. I have redirected all my 404 errors in .htaccess and removed /index.php from my URLs. I have never done this before, but my website runs perfectly and all my pages redirect as I hoped. My site: www.FunerallCoverFinder.co.za How do I figure out what the problem is? Thanks in advance!
Technical SEO | Klement69
-
How can I tell Google that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic: half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately that would be quite hard for us. Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help, Cord
Technical SEO | bimp
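As a sketch of the 304 idea raised in the question above: the core of a conditional response is just comparing the client's If-Modified-Since header with the page's last-modified time. This function is hypothetical and leaves out all framework wiring; it shows only the decision step.

```python
# Decide between 304 Not Modified and 200 OK based on the client's
# If-Modified-Since header and the page's last-modified time.
from email.utils import parsedate_to_datetime

def conditional_status(if_modified_since, last_modified):
    """Return 304 if the page hasn't changed since the client's copy, else 200.

    Both arguments are HTTP-date strings, e.g. 'Thu, 15 Mar 2012 00:00:00 GMT'.
    """
    if not if_modified_since:
        return 200
    try:
        client_time = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        return 200  # unparseable header: fall back to a full response
    page_time = parsedate_to_datetime(last_modified)
    return 304 if page_time <= client_time else 200
```

A 304 response carries no body, so for truly static pages it saves most of the bandwidth a crawler's revisit would otherwise consume.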
Google counting numbers of products on category pages - what about pagination?
Hi there, Whilst checking out the SERPs, as you do, I noticed that where our category page appears, Google now seems to be counting the number of products (what it calls items) on the page and displaying this in the first part of the description (see the image attached). My problem is that we employ pagination, so our category page will have 15 items on it, and then there are paginated results for the rest, with ?page=2 or page-2/ etc. appended to the URL. Although this is only a minor issue, I was wondering if there is a way to change the number of products reported for that page to the entire number of products in that category. Is there a microformat markup or something that can override what Google has detected? Furthermore, is this system of pagination effective? I have considered using JavaScript pagination, such that all products would be loaded onto one page but hidden until 'paginated', but I was worried about having hidden elements on the page, and also about the impact on load times. Although I think this might solve the problem and display the true number of products in a section! Any help much appreciated, Stuart
Technical SEO | stukerr