Google Webmaster Tools: Sitemap submitted vs. indexed vs. Index Status
-
I'm trying to diagnose an odd issue. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps, we have 763 submitted but only 134 indexed. Submitted and indexed were virtually the same, around 750, until 15 days ago, when the indexed count dropped dramatically.
Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and I ran Screaming Frog on the site and got similar results: very few duplicates.
Our actual content should be around 950 pages, counting all the category pages. What's going on here?
-
Bingo! My theory was correct: it was the extra // in the product page urls in the sitemap. Once the developer fixed that, Google went back to indexing the sitemap urls.
-
www and parameters should not be an issue, and the robots.txt file is OK (although I'm waiting on the developer to change my_account and view_cart to my-account and view-cart).
On dev changes: this is a new site, and we have been struggling with some duplicate content generated by the ecommerce platform. We implemented a number of fixes for the duplication issues around the same time this all started in Google Webmaster Tools: rel next/prev plus canonicals on the category pages to clean off session variables/referral text, and canonicals on the product pages to do the same. Additionally, the developer had a noindex tag on the product pages that we had them remove at the same time. Finally, we changed the category pages from a list with a grid-view option to list view only, and nofollowed the secure account links like the shopping cart, login, etc.
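For reference, the canonical on a product page looks something like this (example.com stands in for our real domain, and the paths are illustrative):

<link rel="canonical" href="http://www.example.com/product-name/" />

And the paginated category pages use rel prev/next pointing at the adjacent clean urls:

<link rel="prev" href="http://www.example.com/category/?page=1" />
<link rel="next" href="http://www.example.com/category/?page=3" />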
I also have a number of sitemap fixes submitted to the developer, although to my knowledge the sitemap has not changed since day one. Changefreq is all messed up (it's assigned randomly, with no logic behind it), and 611 urls have // between path segments instead of /. Could this be causing it? Follow my logic here: the sitemap has all these pages with a duplicate // in them; Google hits the page; the canonical we implemented says "hey, that's not it, it's /"; so Google ignores those pages in the sitemap. Is this it, or am I barking up the wrong tree? Any other thoughts?
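For anyone else hitting this, here is a minimal sketch of a check for doubled slashes in a sitemap, assuming a standard sitemaps.org XML file (the url is a placeholder, not the real site):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder url

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(urllib.request.urlopen(SITEMAP_URL).read())
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    # Skip the legitimate "//" in "http://" and flag any other doubled slash
    if "//" in url.split("://", 1)[-1]:
        print("doubled slash:", url)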
-
I assume you have checked your robots.txt file and every other noindex/nofollow possibility out there (meta robots, X-Robots-Tag, etc.)?
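For example, any one of these can keep pages out of the crawl or the index, so it's worth checking both the page source and the HTTP response headers (the paths and values here are just illustrative):

Disallow: /products/ (a robots.txt rule, blocks crawling)
<meta name="robots" content="noindex, nofollow"> (a meta tag in the page head)
X-Robots-Tag: noindex (an HTTP response header)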
It appears you are having issues with your website architecture. This guide to diagnosing indexation problems with Google Webmaster Tools may help:
https://www.distilled.net/blog/seo/indexation-problems-diagnosis-using-google-webmaster-tools/
I hope that is of help to you,
Thomas
-
Are there parameters being indexed? Are www and non-www getting indexed at the same time? Are categories and tags being indexed? Any dev changes to the site that you know of?
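One quick way to answer the www question is a pair of site: searches:

site:example.com -inurl:www
site:www.example.com

If both versions return results, a 301 in .htaccess consolidates them onto www (a sketch assuming Apache with mod_rewrite; example.com is a placeholder):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]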
-
Related Questions
-
URL Indexed But Not Submitted to Sitemap
Hi guys, in Google's Webmaster Tools it says that the URL has been indexed but not submitted to the sitemap. Is it necessary to submit the URL to the sitemap if it has already been indexed? Appreciate your help with this. Mark
Technical SEO | | marktheshark10
-
Switching from HTTP to HTTPS and google webmaster
Hi, I've recently moved one of my sites, www.thegoldregister.co.uk, to https. I'm using Wordpress and put a permanent 301 redirect in the .htaccess file to force https for all pages. I've updated the settings in Google Analytics to https for the original site. All seems to be working well. Regarding Google Webmaster Tools and what needs to be done, I'm very confused by the Google documentation on this subject around https. Does all my crawl data and indexing from the http site still stand and get inherited by the https version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "change of address" option in the Webmaster settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all. I've also added the https version to the console, but it is showing a severe warning: "is robots.txt blocking some important pages". I don't understand this error, as it's the same file the http site uses, generated by All in One SEO Pack for Wordpress (see below at bottom). The warning is against line 5, saying it will ignore it. What I don't understand is that I don't get this error in the Webmaster console for the http version, which is the same file?? Any help and advice would be much appreciated. Kind regards, Steve

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
Technical SEO | | lqz
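For what it's worth, line 5 of the file above is the Crawl-delay rule, which Googlebot doesn't support; the console flags it as a rule it will ignore, and that notice on its own doesn't block any pages. And for reference, a typical force-https 301 in .htaccess looks something like this (a sketch assuming Apache with mod_rewrite, not necessarily the exact rule used here):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
-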
Delete or re-submit sitemaps for new products? How often?
When I add new products (approx. 10 a month), I usually delete the old sitemap and submit a new one. Is this ok to do, or should I just re-submit it with the new info included? Also, is once a month too much?
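For what it's worth, deleting shouldn't be necessary: you can keep a single sitemap file, update it when products change, and re-submit it. Google also accepts a ping with the sitemap url passed as a parameter (example.com is a placeholder):

http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml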
Technical SEO | | tiffany1103
-
Website Migration - Very Technical Google "Index" Question
This is my understanding of how Google's search works, and I am unsure about one thing in specific:
- Google continuously crawls websites and stores each page it finds (let's call it the "page directory")
- Google's "page directory" is a cache, so it isn't the "live" version of the page
- Google has separate storage called "the index", which contains all the keywords searched; these keywords in "the index" point to the pages in the "page directory" that contain the same keywords
- When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory"
- These returned pages are given ranks based on the algorithm
The one part I'm unsure of is how Google's "index" connects to the "page directory". I'm thinking each page has a url in the "page directory", and the entries in the "index" contain these urls. Since Google's "page directory" is a cache, would the urls be the same as the live website? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that url in Google's cache? The reason I ask is I am starting to work with a client who has a newly developed website. The old website's domain and files were located on a GoDaddy account. The new website's files have completely changed location and are now hosted on a separate GoDaddy account, but the domain has remained in the same account. The client has set up domain forwarding/masking to access the files on the separate account. From what I've researched, domain masking and SEO don't get along very well. Not only can you not link to specific pages, but if my above assumption is true, wouldn't Google have a hard time crawling and storing each page in the cache?
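To make that mental model concrete, here is a toy sketch of the two structures being described (purely illustrative; not Google's actual implementation):

# Toy model: a "page directory" (url -> cached page text) and an
# inverted "index" (keyword -> urls of cached pages containing it)
page_directory = {
    "www.website.com/page1": "red boots and cyber boots",
    "www.website.com/page2": "amazing shoes",
}

index = {}
for url, text in page_directory.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# A search for "boots" hits the index first, then the cache:
print(index.get("boots"))  # {'www.website.com/page1'}

In this model the index entries point at the same urls the live site uses, which is why masked or forwarded domains make crawling and caching messy.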
Technical SEO | | reidsteven75
-
Tool to search relative vs absolute internal links
I'm preparing for a site migration from a .co.uk to a .com, and I want to ensure all internal links are updated to point to the new primary domain. What tool can I use to check internal links? Some are relative and others are absolute, so I need to find them all and update them to relative.
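A crawler such as Screaming Frog can export all internal links; classifying them yourself is also only a few lines (a sketch, assuming the pages are publicly fetchable; the url is a placeholder):

import urllib.request
from urllib.parse import urlparse
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # A scheme (http, https, ...) or a "//" prefix means absolute
                    kind = "absolute" if urlparse(value).scheme or value.startswith("//") else "relative"
                    print(kind, value)

page = urllib.request.urlopen("http://www.example.co.uk/").read().decode("utf-8", "ignore")
LinkClassifier().feed(page)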
Technical SEO | | Lindsay_D
-
Google Indexed URLs for Terms Have Changed Causing Huge SERP Drop
We haven't made any significant changes to our website; however, the pages Google has indexed for our critical keywords have changed, and the new pages have caused our rankings to drop dramatically for those terms. In some cases, the changes make no sense at all. For example, one of our terms that used to be indexed to our homepage is now indexed to a dead category page that has nothing on it. For one of our biggest terms, where we were 9th, the indexed page changed to our FAQ, and as a result we now rank 44th. This is having a MAJOR impact on our business, so any help on why this sudden change happened and what we can do to combat it is greatly appreciated.
Technical SEO | | EvergladesDirect
-
Sitemap.xml problem in Google webmaster
Hi, my sitemap.xml is not submitting correctly in Google Webmaster. There are 697 urls submitted but only 56 in Google's index. At the top of Webmaster it says: http://www.example.com/sitemap.xml has been resubmitted. But when I click the status button, a red X occurs. Any suggestions about this? Thanks...
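A red X against a sitemap often means Google couldn't fetch or parse it, so a quick local well-formedness check is worth a try before anything else (a sketch, assuming the file is saved locally as sitemap.xml):

import xml.etree.ElementTree as ET

try:
    ET.parse("sitemap.xml")  # raises ParseError if the XML is malformed
    print("sitemap is well-formed XML")
except ET.ParseError as e:
    print("malformed XML:", e)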
Technical SEO | | Socialdude
-
Domain restructure, sitemaps and indexing
I've got a handcoded site with around 1500 unique articles and a handcoded sitemap. Very old school. The url structure is a bit of a mess, so to make things easier for a developer who'll be making the site database-driven, I thought I'd recategorise the content. Same content, but with a new url structure (I thought I'd juice up the urls for SEO purposes while I was at it). To this end, I took categories like:

/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/

and rehoused all the content like so, doing it all manually with ftp:

/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/

I placed 301 redirects in the .htaccess file like so:

redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/

(not doing redirects for each article, just for categories, which seemed to make the articles redirect nicely). Then I went into sitemap.xml and manually overwrote all the entries to reflect the new url structure, but keeping the old dates of the original entries, like so:

<url>
  <loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
  <lastmod>2008-07-08</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>

And resubmitted the sitemap to Google Webmasters. This was done 4 days ago. Webmaster said that the 1400 of 1500 articles indexed had dropped to 860, and today it's climbed to 939. Did I adopt the correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely? I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...? PS Apologies if this question appears twice on Q&A - hopefully I haven't double-posted
Technical SEO | | magdaknight
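One quick sanity check after a move like this is confirming that each old category url actually returns a 301 with the right Location header (a minimal sketch; the url listed is taken from the post above and may not match the real site):

import urllib.request
import urllib.error

OLD_URLS = [
    "http://www.site.co.uk/body/amazing-boots/",  # old category path from the post
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow the redirect; we want to inspect it

opener = urllib.request.build_opener(NoRedirect)
for url in OLD_URLS:
    try:
        opener.open(url)
        print(url, "-> no redirect")
    except urllib.error.HTTPError as e:
        # A 301 with the new category url in Location is the desired outcome
        print(url, "->", e.code, e.headers.get("Location"))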