Pages not indexed by Google
-
We recently (two weeks ago) removed all the nofollow values from our website, but the number of pages indexed by Google is the same as before.
Do you have an explanation for this?
website : www.probikeshop.fr
-
Good advice from Andrea and Brent.
To use multiple sitemaps, do something like this:
The main sitemap points to the other sitemap files.
You can have up to 50,000 URLs in those files.
- mine are gzipped
This one is sitemap_index.xml
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap><loc>http://yourdomain.com/writermap.xml.gz</loc>
<lastmod>2012-03-15</lastmod></sitemap>
<sitemap><loc>http://yourdomain.com/mainmap.xml.gz</loc>
<lastmod>2012-03-15</lastmod></sitemap>
<sitemap><loc>http://yourdomain.com/201201.xml.gz</loc>
<lastmod>2012-03-15</lastmod></sitemap>
<sitemap><loc>http://yourdomain.com/201202.xml.gz</loc>
<lastmod>2012-03-15</lastmod></sitemap>
</sitemapindex>
Here is a tip:
Google will index some of those pages and some it will not index.
If you have 5,000 URLs in one sitemap and Google only indexes 4,957 of them,
you probably can't work out which 43 URLs were skipped.
If you make the numbers smaller, it becomes easier to discover the pages Google doesn't like.
- not easy, but easier
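The splitting advice above can be sketched in code. This is a minimal, hypothetical helper (the function name, file-name scheme, and chunk size are my own, not from the thread): it writes a URL list into small gzipped sitemap files so that a skipped URL is easier to localize.

```python
import gzip
from xml.sax.saxutils import escape

def write_sitemaps(urls, prefix="sitemap", max_urls=1000):
    """Write the URLs into numbered, gzipped sitemap files of at most
    max_urls entries each, and return the generated file names."""
    files = []
    for start in range(0, len(urls), max_urls):
        chunk = urls[start:start + max_urls]
        # e.g. sitemap001.xml.gz, sitemap002.xml.gz, ...
        name = f"{prefix}{start // max_urls + 1:03d}.xml.gz"
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        lines += [f"<url><loc>{escape(u)}</loc></url>" for u in chunk]
        lines.append("</urlset>")
        with gzip.open(name, "wt", encoding="utf-8") as f:
            f.write("\n".join(lines))
        files.append(name)
    return files
```

With a small `max_urls`, comparing each file's URL count against the per-sitemap indexed count in Webmaster Tools narrows down which URLs were skipped.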
-
Well, there are a lot of ways to look at this. Removing the nofollow values wouldn't result in more pages being indexed, so the two issues are totally separate.
If the goal is to get more pages indexed, then a sitemap (either XML or even a plain text list of URLs) uploaded to your server for Google to find can help. At the least, it makes sure Google is finding and indexing the pages you want it to find. Your Google Webmaster Tools account (assuming you have one) will also report some indexation data.
For example, we used to have 100K+ pages, and many weren't quality content I wanted to rank (like a PDF of a catalog ranking above the product page). So I reduced the number of pages indexed so Google would have better-quality content to serve to searchers.
Using Xenu or Screaming Frog is another good way to uncover pages. Those tools crawl your site like Google would; then you can download the file and see not only all the URLs found, but also whether they return 301/404/200, etc. Screaming Frog can also crawl your site and output an XML sitemap for you (an easier way to make one).
I prefer Screaming Frog; it's about $150 (USD) for a license, and well worth it.
As for why: if you have a lot of pages, Google doesn't always find them all. That's where a sitemap can help (it directs Google to what you want crawled). Otherwise, there could be technical issues with a bunch of pages, such as not being properly linked internally, and that could be causing the issue.
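As a concrete example of "uploaded to your server for Google to find": the sitemaps.org protocol lets you announce a sitemap (or sitemap index) with a `Sitemap:` line in robots.txt. The file name below is a placeholder:

```text
# robots.txt at the site root (file name is a placeholder)
User-agent: *
Sitemap: http://yourdomain.com/sitemap_index.xml
```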
-
So according to you, it's normal that we don't have more pages indexed by Google since we deleted the nofollow values?
Google currently indexes 28,200 pages, but I'm sure we have more pages on the site.
Where could the problem come from?
Thanks
-
Do you have XML sitemaps? If not, this is a great way to measure what Google is indexing. Create multiple sitemaps based on your categories so you can track exactly which pages are not being indexed.
-
'Nofollow' isn't the same as a 'noindex' directive. Nofollow just tells the search engine the link "should not influence the link target's ranking in the search engine's index." 'Noindex' is where you tell the crawler not to index the pages; you can remove it later if at some future point you want them indexed.
So, in theory, what you did wouldn't have anything to do with how many pages are indexed on your site anyway.
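To make the distinction concrete, here is what the two directives look like in markup (the URL is a placeholder):

```html
<!-- nofollow: a per-link hint that the link should not pass ranking signals -->
<a href="http://yourdomain.com/some-page" rel="nofollow">Some page</a>

<!-- noindex: a page-level directive telling crawlers not to index this page -->
<meta name="robots" content="noindex">
```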