How do I know which pages of my site are not indexed by Google?
-
Hi
In my Google Webmaster Tools, under Crawl -> Sitemaps, it shows 1,117 pages submitted but only 619 have been indexed.
Is there any way I can find which pages are not indexed and why?
It has been like this for a while.
I also have a manual action (partial) message: "Unnatural links to your site--impacts links", and under "Affects" it says "Some incoming links".
Is that the reason Google does not index some of my pages?
Thank you
Sina
-
Thank you very much for the detailed answer.
Is there any way I can find out when I got the Manual Actions (Partial) message?
There is no date shown.
-
Hi Sina,
For your first question: make sure you have Google Webmaster Tools set up (which I gather you do), since you have received a low-quality/spam links message from them. I should add that dealing with an unnatural link profile flagged by Google is a whole other project, and a super important one, so get on top of that as well! Open Site Explorer is a perfect place to start: use it to crawl your links and profile your entire linking domain profile. From there you can filter the link data to identify the domains that may be causing that warning from Google. These will need to be cleaned up in order to ensure solid indexing of your site's pages; the rest of the steps below depend on it.
Now, to look at the indexing issue you asked about. Once you log in to Webmaster Tools, look at the dashboard for your domain: you will see a section called SITEMAPS (third on the right) in the main panel. Click on the title of that section and you will land on the Sitemaps report. There is a wealth of information here from Google about the indexing health of your site.
There are three steps Google needs to have completed in order for you to find the information you are looking for:
- Crawling
- Indexing
- Ranking (what you see in the SERPs when using search terms or Google operators for a site review)
To see any results at all, you need to ensure you have a sitemap.xml file built, loaded, and submitted to Google. It also needs to be configured properly and have no errors, so it can be processed correctly. This is the only way you will get a clear snapshot of what Google has indexed from your XML file. It will tell you how many of your pages are in Google's index, but not which ones; if none are indexed at all, it will state that.
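For reference, a minimal valid sitemap.xml looks like the sketch below (the example.com URLs are placeholders, not from the asker's site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want Google to index -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-08-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```

Each page gets its own &lt;url&gt; entry, and the file is then submitted under Crawl -> Sitemaps in Webmaster Tools so the submitted/indexed counts have something to count against.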
It's also time to look at your robots.txt and .htaccess files to ensure they are configured and installed properly. This would be another troubleshooting step; seeing as you have an unnatural link profile, you may want to take the steps above first. Also ensure you don't have any robots "noindex" meta tags in place site-wide.
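To illustrate what to look for (the path here is hypothetical), a robots.txt rule like this blocks crawling of an entire section:

```
# robots.txt -- this rule stops all crawlers from fetching anything under /private/
User-agent: *
Disallow: /private/
```

and a robots meta tag like this in a page's head blocks indexing of that page:

```html
<!-- inside <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```

If either of these ended up applied site-wide by accident, it would explain a large gap between submitted and indexed pages.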
So, from here, once you log in to Webmaster Tools (to the dashboard for the site you are referring to), under SITEMAPS you will see a section saying XXX pages submitted and XXX pages indexed, along with any errors and warnings you are currently getting in that box (link warnings will be here too!). This gives you important information which you can log in an Excel file later. This is also where you will most likely see that link alert from Google.
That gives you Google's "indexed pages" count. Now you have to dig a little.
----- GOOGLE OPERATORS ----- Once you have some data from Google Webmaster Tools as mentioned above, you can go to Google.com (or whichever Google index you want to check, such as .ca) and use Google search operators to see specifically which URLs and pages have been indexed by the engine. There are a few different ones you can use; I found a great resource for these and copied the examples below.
Domain search with the site: operator
(site:google.com)
This should return results only from the specified domain. Be careful if your site is on a subdomain (or multiple subdomains); "www" is a subdomain.

Domain search with the inurl: operator
(inurl:google.com)
This should return results that contain the specified domain anywhere in the URL. These may not all be from the site in question: other sites can contain your domain name in their URLs (whois.domaintools.com may have such URLs, etc.).

Domain search with the site: and inurl: operators
(site:google.com inurl:google.com)
This limits the results to your domain only, and it seems to generate more "reliable" results than the site: operator alone.

Domain and path/query search with the site: and inurl: operators
(site:google.com inurl:/somepath/somedirectory/)
(site:google.com inurl:?this=that&rabbits=lunch)
This limits the results to your domain and focuses on a specific directory/folder or set of parameters.

Domain and file-type search with the site: and filetype: operators
(site:google.com filetype:html)
This limits the results to those from your domain, and to a specific type of file.
Please note: the filetype: operator may not show all files of that type; it may only work for URLs that end in that extension. If you serve HTML content without ".html" in the filename, those pages will not show in the results.

Domain and path/query search with the site:, inurl:, and inurl: operators
(site:google.com inurl:google.com inurl:/somepath/somedirectory/)
(site:google.com inurl:google.com inurl:?this=that&rabbits=lunch)
This lets you limit the results to specific parts of your site if you need to.

Make sure your site's pages also don't include robots "noindex" or "nofollow" meta tags in the head section. These would tell Google not to index or follow the pages on your site.
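Once you have collected a list of indexed URLs from those operator queries, you can diff it against your sitemap programmatically rather than by eye. A minimal sketch in Python (the sitemap content and URLs below are made up for illustration):

```python
# Compare the URLs in a sitemap.xml against a set of URLs you have
# confirmed as indexed (e.g. collected from site: operator queries).
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Extract all <loc> URLs from a sitemap.xml document string."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def not_indexed(sitemap_xml, indexed):
    """Return the sitemap URLs that are absent from the indexed set."""
    return sitemap_urls(sitemap_xml) - indexed

# Tiny inline sitemap standing in for your real file:
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/contact</loc></url>
</urlset>"""

indexed_pages = {"http://www.example.com/", "http://www.example.com/about"}
print(sorted(not_indexed(sitemap, indexed_pages)))
# -> ['http://www.example.com/contact']
```

The leftover set is your list of submitted-but-not-indexed candidates to investigate one by one.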
Ensure that you have the proper redirects in your .htaccess file if you find you have duplicate content. 301 redirect the non-www version of your site and pages to the www version (or vice versa), whichever you prefer to have indexed by Google, to ensure clean indexing of the site. This will help you avoid site-wide indexing problems.
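As a sketch, a non-www to www 301 redirect in .htaccess using mod_rewrite typically looks like this (example.com is a placeholder for your own domain):

```apache
# .htaccess: permanently redirect example.com/* to www.example.com/*
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what tells Google the non-www version has moved permanently, so only one version of each page stays in the index.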
TO NOTE
---- SERVER LOG FILES ---- (Note: make sure you request log files from your hosting company too. If you don't have access to server log files for your hosting traffic, switch hosts!) Keep an eye on these as well for the information you need. This process is not a fast or easy one and does require some work. Don't get lazy; this is a crucial step.
What I recommend next is starting to keep log files, if you aren't already, and tracking them on a weekly or monthly basis (whichever is easier). The reason is that once your pages are indexed by Google, you always want to keep an idea of what is indexed and what has been dropped or de-indexed. This can also help you identify problems (or penalties) from Google early if you see trends day over day or week over week.
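As a sketch of that kind of tracking, here is one way to tally Googlebot requests per URL from a standard combined-format access log (the log lines below are invented for illustration):

```python
# Count Googlebot requests per path in an Apache/Nginx "combined" access log.
# Pages Googlebot never requests are candidates for crawl/indexing problems.
import re
from collections import Counter

# Pull the request path out of the '"GET /path HTTP/1.1"' portion of a line.
LINE_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def googlebot_hits(log_lines):
    """Count requests per path for lines whose user agent mentions Googlebot."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

sample_log = [
    '66.249.66.1 - - [22/Aug/2014:06:25:24 +0000] "GET /products/ring HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [22/Aug/2014:06:26:01 +0000] "GET /products/ring HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [22/Aug/2014:06:27:13 +0000] "GET /about HTTP/1.1" 200 900 "-" "Mozilla/5.0 (Windows NT 6.1)"',
]

print(googlebot_hits(sample_log))
```

Run weekly against the real log and diff the counts over time; a page whose Googlebot hits drop to zero is worth checking before it falls out of the index.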
Hope this helps point you in the right direction. Remember, don't be lazy here; exhaust all options to identify your problems! Cheers,
Rob
-
Based on the manual action message from Google, I would guess that one possible reason is that the unindexed pages have bad links pointing to them, so Google is treating those pages as not "quality."
I would also check that all pages are included in your XML sitemap at a minimum, and in your HTML sitemap (if you have the latter). I'd also check the head section of all pages to make sure that no pages are set to "noindex." Lastly, you may have duplicate content: if two pages have the exact same text with only minor keyword-based variations, for example, then Google will often index only one of the two pages.