Pages not indexed by Google
-
We removed all the nofollow values from our website two weeks ago.
The number of pages indexed by Google is still the same as before.
Do you have any explanation for this?
website : www.probikeshop.fr
-
Good advice from Andrea and Brent.
To use multiple sitemaps, do something like this:
The main sitemap points to the other sitemap files.
You can have up to 50,000 URLs in those files.
- mine are gzipped
This one is sitemap_index.xml
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://yourdomain.com/writermap.xml.gz</loc>
  <lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/mainmap.xml.gz</loc>
  <lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/201201.xml.gz</loc>
  <lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/201202.xml.gz</loc>
  <lastmod>2012-03-15</lastmod></sitemap>
</sitemapindex>
Here is a tip:
Google will index some of those pages and skip others.
If you have 5,000 URLs in one sitemap and Google only indexes 4,957,
you probably can't work out which 43 URLs were skipped,
so if you make the files smaller, it becomes easier to discover the pages Google doesn't like.
- not easy, but easier
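That chunking approach can be sketched in Python; this is a minimal illustration, not a production generator, and the filenames, chunk size, and domain are assumptions for the example:

```python
from datetime import date

def build_sitemaps(urls, base_url, chunk_size=1000):
    """Split a flat URL list into sitemap documents of at most
    chunk_size URLs each, plus one sitemap index pointing at them.
    Returns a dict mapping filename -> XML string (write or gzip
    the values however you like)."""
    files = {}
    index_entries = []
    today = date.today().isoformat()
    for start in range(0, len(urls), chunk_size):
        name = f"sitemap-{start // chunk_size + 1:03d}.xml"
        rows = "".join(
            f"  <url><loc>{u}</loc></url>\n"
            for u in urls[start:start + chunk_size]
        )
        files[name] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{rows}</urlset>\n"
        )
        index_entries.append(
            f"  <sitemap><loc>{base_url}/{name}</loc>"
            f"<lastmod>{today}</lastmod></sitemap>\n"
        )
    # The index file is what you submit to Webmaster Tools.
    files["sitemap_index.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "".join(index_entries)
        + "</sitemapindex>\n"
    )
    return files
```

With chunks of 1,000 instead of 50,000, a "4,957 of 5,000 indexed" mystery becomes five smaller ranges you can check one at a time in Webmaster Tools.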
-
Well, there are a lot of ways to look at this. Removing nofollow values wouldn't result in more pages being indexed, so the two issues are totally separate.
If the goal is to get more pages indexed, then a sitemap (either XML or even a text list) uploaded to your server for Google to find can help. Or, at least, it makes sure Google is finding and indexing the pages you want it to find. Your Google Webmaster Tools account (assuming you have one) will also show you some of this data.
For example, we used to have 100K+ pages, and many weren't quality content I wanted to rank. Think of a PDF of a catalog ranking above the product page. So, I reduced the number of pages indexed so Google would have better-quality content to serve to searchers.
Using Xenu or Screaming Frog is another good way to uncover pages. Those tools crawl your site like Google would; then you can download the file and see not only all the URLs found, but also whether they return 301/404/200, etc. And Screaming Frog can crawl your site and output an XML sitemap for you (it's an easier way to make one).
I prefer Screaming Frog; a license is about $150 US and well worth it.
As for why: if you have a lot of pages, Google doesn't always find them. That's where a sitemap helps (it tells Google what to crawl). Otherwise, there could be technical issues with a bunch of pages, or they aren't properly linked internally, and that could be causing the issue.
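A rough, do-it-yourself version of the status-code column those crawlers give you can be sketched with Python's standard library, assuming you already have a list of URLs to check (the real tools do much more, e.g. discovering URLs by following links):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(urls, timeout=10):
    """HEAD each URL and record the HTTP status code, similar to
    the 200/301/404 column in a crawler export. Note: urlopen
    follows redirects, so a 301 shows up as the final status;
    a real crawler would report the 301 itself."""
    results = {}
    for url in urls:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "status-check"})
        try:
            with urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status
        except HTTPError as e:
            results[url] = e.code          # 404, 500, ...
        except URLError as e:
            results[url] = str(e.reason)   # DNS failure, timeout, ...
    return results
```

Feeding it the URLs from your sitemap quickly shows whether the pages you want indexed actually resolve with a 200.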
-
So, according to you, it's normal that we don't have more pages indexed by Google since we deleted the nofollow values?
Google actually indexes 28,200 pages, but I'm sure we have more pages on the site.
Where could the problem be coming from?
Thanks
-
Do you have XML sitemaps? If not, set them up; they're a great way to measure what is being indexed by Google. Make sure you create multiple sitemaps based on your categories so you can track exactly which pages are not being indexed.
-
'Nofollow' isn't the same as a 'noindex' directive. Nofollow just tells the search engine the link "should not influence the link target's ranking in the search engine's index." 'Noindex' is where you tell the crawler not to index the page; you can remove it later if you decide you want those pages indexed.
So, in theory, what you did wouldn't have anything to do with how many pages are indexed on your site anyway.
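The difference shows up in the markup itself: nofollow sits on an individual link, while noindex is a page-level meta tag. An illustrative snippet (the URL is a placeholder):

```html
<!-- rel="nofollow" on a link: "don't let this link pass ranking
     signals to its target" - it says nothing about indexing -->
<a href="http://example.com/some-page" rel="nofollow">Some page</a>

<!-- meta robots noindex in the page <head>: "keep THIS page
     out of the index entirely" -->
<meta name="robots" content="noindex">
```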