Google Webmaster Tools: Sitemap submitted vs. indexed vs. Index Status
-
I'm trying to diagnose an odd problem. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps, we have 763 submitted but only 134 indexed. Submitted and indexed were virtually the same, around 750, until 15 days ago, when indexed dipped dramatically.
Additionally, under HTML Improvements I only find 3 duplicate pages, and a Screaming Frog crawl of the site got similar results: few duplicates.
Our actual content should be around 950 pages, counting all the category pages. What's going on here?
-
Bingo! My theory was correct. It was the extra // in the product page URLs in the sitemap. Once the developer fixed that, Google went back to indexing the sitemap.
-
www vs. non-www and parameters should not be an issue, and the robots.txt file is OK (although I'm waiting on the developer to change my_account and view_cart to my-account and view-cart).
On dev changes: this is a new site, and we have been struggling with duplicate content generated by the ecommerce platform. We implemented a number of fixes for the duplication issues around the same time this all started in Google Webmaster Tools: next/prev canonicals to the category pages that strip off session variables and referral text, and canonicals on the product pages that do the same. Additionally, the developer had a noindex tag on the product pages that we had them remove at the same time. Finally, we changed the content on the category pages from a list with a grid-view option to list view only, and nofollowed the secure account links (shopping cart, login, etc.).
I also have a number of fixes submitted to the developer for the sitemap, although to my knowledge it has not changed since day one. Changefreq is all messed up: it's assigned randomly, with no logic behind it. Also, 611 URLs have // between path segments instead of /. Could this be causing it? Follow my logic here: the sitemap has all these pages with a duplicate // in them, Google hits the page, the canonicals we implemented say "hey, that's not it, it's /", and so Google ignores those pages in the sitemap. Is this it, or am I barking up the wrong tree? Any other thoughts?
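To check that theory programmatically, here is a minimal Python sketch that flags sitemap URLs whose path contains a doubled slash and shows the single-slash form the canonical would point to. The example URLs are hypothetical, not taken from the actual site:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_form(url):
    """Collapse doubled slashes in the path (the scheme's // is untouched)."""
    parts = urlsplit(url)
    path = parts.path
    while "//" in path:
        path = path.replace("//", "/")
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

def sitemap_mismatches(urls):
    """Return (sitemap_url, canonical_url) pairs that disagree."""
    return [(u, canonical_form(u)) for u in urls if canonical_form(u) != u]

# Hypothetical example URLs:
urls = [
    "http://example.com/category//product-1",
    "http://example.com/category/product-2",
]
print(sitemap_mismatches(urls))
```

Feeding the 611 suspect URLs through something like this would confirm how many sitemap entries disagree with their canonical.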
-
I assume you have checked your robots.txt file and every other noindex/nofollow possibility out there (robots meta tags, X-Robots-Tag headers, etc.)?
It appears you are having issues with your website architecture.
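One way to audit those "every other possibility" sources systematically is to check both the robots meta tag and the X-Robots-Tag response header on a sample of pages. A rough sketch (the regex-based parsing is a simplification; a real crawl would use a proper HTML parser):

```python
import re

def robots_directives(html, headers):
    """Collect indexing directives from a robots meta tag and the
    X-Robots-Tag response header."""
    found = set()
    meta = re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta:
        found.update(d.strip().lower() for d in meta.group(1).split(","))
    header = headers.get("X-Robots-Tag", "")
    found.update(d.strip().lower() for d in header.split(",") if d.strip())
    return found

# Hypothetical page: a leftover noindex tag would show up here.
print(robots_directives('<meta name="robots" content="noindex, follow">', {}))
```

Any page whose result set contains "noindex" is excluded from the index regardless of what the sitemap says.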
https://www.distilled.net/blog/seo/indexation-problems-diagnosis-using-google-webmaster-tools/
I hope that is of help to you,
Thomas
-
Are parameters being indexed? Are www and non-www being indexed at the same time? Are categories and tags being indexed? Any dev changes to the site that you know of?
Related Questions
-
Google is indexing bad URLS
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was downloaded. While no longer utilized, it still remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page. I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed: 1. Added a directive in my .htaccess to 404 all of these URLs. 2. Blocked /wp-content/uploads/revslider/ in my robots.txt. 3. Manually de-indexed each URL using the GSC tool. 4. Deleted the plugin. However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
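One caveat worth knowing here: while a URL is disallowed in robots.txt, Googlebot generally cannot recrawl it to see the 404, so already-indexed URLs can linger in the index for a while. You can at least confirm the robots.txt rule matches the URLs in question; a sketch using Python's standard robotparser, with the rule reproduced from the description above:

```python
from urllib.robotparser import RobotFileParser

# The rule described in the question:
robots_txt = """\
User-agent: *
Disallow: /wp-content/uploads/revslider/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

def is_blocked(url, agent="Googlebot"):
    """True if the rule above stops the given crawler from fetching the URL."""
    return not parser.can_fetch(agent, url)

print(is_blocked("http://www.mysite.com/wp-content/uploads/revslider/templates/x/"))
```

If the goal is to get the URLs dropped quickly, it may work better to remove the robots.txt block so Google can actually see the 404s.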
Technical SEO | Tom3_150
-
Get List Of All Indexed Google Pages
I know how to run site:domain.com, but I am looking for software that will put these results into a list and return the server status (200, 404, etc.) for each. Anyone have any tips?
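In the absence of a dedicated tool, a short script can do this: feed it the URLs copied out of the site: results and it reports each one's status code. A rough sketch using only the standard library (the URL list would be your own; it is not shown here):

```python
import urllib.request
import urllib.error

def status_of(url, timeout=10):
    """Return the HTTP status code for a URL (HEAD request)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def summarize(results):
    """Group URLs by status code: {200: [...], 404: [...], ...}."""
    groups = {}
    for url, code in results:
        groups.setdefault(code, []).append(url)
    return groups

# Hypothetical usage with URLs harvested from the site: results:
# results = [(u, status_of(u)) for u in urls]
# print(summarize(results))
```

The summary makes it easy to spot how many indexed URLs are actually dead.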
Technical SEO | InfinityTechnologySolutions0
-
Sitemap errors have disappeared from my Google Webmaster tools
Hi all, A week ago I had 66 sitemap errors related to hreflang in my GWT. Now all the errors are gone, and it shows no errors. We have not done any work to fix them. I wonder if anybody has experienced the same thing: Google suddenly changing the criteria, or the way they report on errors, in Google Webmaster Tools. I would appreciate any insights from the community! Best regards, Peru
Technical SEO | SMVSEO0
-
302 redirect used, submit old sitemap?
The website of a partner of mine was recently migrated to a new platform. Even though the content on the pages mostly stayed the same, both the HTML source (divs, meta data, headers, etc.) and the URLs (removed index.php, removed capitalization, etc.) changed heavily. Unfortunately, the URLs of ALL forum posts (150K+) were redirected using a 302 redirect, which was only recently discovered and swiftly changed to a 301. Several other important content pages (150+) weren't redirected at all at first, but most now have a 301 redirect as well. The 302 redirects and 404ing content pages had been live for over 2 weeks at that point, and judging by the consistent day-over-day drop in organic traffic, I'm guessing Google didn't like the way this migration went. My best guess is that Google is currently treating all these content pages as "new" (after all, the source code changed 50%+, most of the meta data changed, the URL changed, and a 302 redirect was used). On top of that, the large number of 404s they've encountered (40K+) probably also fueled their belief in a now non-worthy-of-traffic website. Given that some of these pages had been online for almost a decade, I would love Google to see that these pages are actually new versions of the old pages, and therefore pass on any link juice and authority. I had the idea of submitting a sitemap containing the most important URLs of the old website (as harvested from the Top Visited Pages in Google Analytics, because no old sitemap was ever generated...), thereby re-pointing Google to all these old pages, but presenting them with a nice 301 redirect this time instead, hopefully causing them to regain their rankings. To your best knowledge, would that help the problems I've outlined above? Could it hurt? Any other tips are welcome as well.
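For what it's worth, generating that "old URLs" sitemap is simple to script once you have the list out of Google Analytics. A minimal sketch (the example URL is hypothetical):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal sitemap XML document from a list of (old) URLs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = u
    return tostring(urlset, encoding="unicode")

# Hypothetical: top visited pages harvested from Google Analytics.
old_urls = ["http://example.com/forum/Index.php?topic=1"]
print(build_sitemap(old_urls))
```

Each listed URL would then serve its 301 when Googlebot revisits it, which is exactly the behavior the question is trying to surface.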
Technical SEO | Theo-NL0
-
Preview not available in SERPS & Google Webmaster
Hi, I have a question regarding Google. For a site I am working on I cannot see an Instant Preview in my SERPs, and in Google Webmaster Tools there is no robots.txt block, so I can't figure out why, as I have screenshots for all my other sites. If anyone can help, much obliged. L This is the site: http://apexgenerators.co.uk/
Technical SEO | lauratagdigital0
-
Disappeared from Google with in 2 hours of webmaster tools error
Hey Guys, I'm trying not to panic but... we had a problem with Google indexing some of our secure pages; when users hit those pages, browsers fired up security warnings, so I asked our web dev to have a look at it. He made the changes below, and within 2 hours the site has dropped off the face of Google: "In Webmaster Tools I asked it to remove any https://freestylextreme.com URLs." "I cancelled that before it was processed." "I then set up the robots.txt to respond with a disallow-all if the request was for an https URL." "I've now removed robots.txt completely." "And resubmitted the main site from Webmaster Tools." I've read a couple of blog posts, and all say to remain calm, test the fetch bot in Webmaster Tools (which is all good), and just wait for Google to reindex. Do you guys have any further advice? Ben
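Before just waiting, it may be worth double-checking that the robots.txt the site now serves no longer contains a site-wide disallow. A small sketch using Python's standard robotparser; the robots.txt contents shown are hypothetical, and in practice you would fetch the live file and pass its text in:

```python
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_txt, agent="Googlebot"):
    """True if the robots.txt text disallows the site root for the agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # The root URL is a reasonable proxy for a site-wide block.
    return not parser.can_fetch(agent, "https://freestylextreme.com/")

# Hypothetical contents resembling the dev's temporary disallow-all:
print(blocks_everything("User-agent: *\nDisallow: /"))
```

If this returns False for the live file, the remaining wait is just Google recrawling and reindexing.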
Technical SEO | elbeno1
-
Mobile Google Not Indexing Mobile Website
Google currently does not index our mobile website; it has the WWW website in its index. When a user on a mobile phone clicks a mobile search result for WWW, we redirect them to our mobile website. This is posing problems for us, as our mobile website has only a fraction of the pages/sections of our WWW site. For example, mobile search results show that we have a "careers" section, but that's not the case on the mobile website, so the user gets a 404. How do we force mobile Google to index our mobile website instead of our WWW?
Technical SEO | RBA0
-
Why do I see dramatic differences in impressions between Google Webmaster Tools and Google Insights for Search?
Has anyone else noticed discrepancies between these tools? Take keyword A and keyword B. I've literally seen situations where A has 3 or 4 times the traffic of B in Google Webmaster Tools, but half the traffic of B in Google Insights for Search. What might be the reason for this discrepancy?
Technical SEO | ir-seo-account0