How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
-
Background: I recently launched a new site, and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions.
As suspected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted.
Lastly, the site was built using Squarespace and was launched in the middle of August.
**Question:** When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The submitted sitemap seems to be complete. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas?
Thanks!!
-
Great answer! I noticed SS assigns images to their own pages.
Is there a "best practice" for addressing this? Should I try to exclude the pages from being indexed?
-
You mentioned SS is your platform, so it's probably an image/CDN issue.
Each image gets its own page, and since that page returns a 302, it won't get indexed even if it's in the sitemap.
If you have a lot of images on your site, a good chunk of pages won't show as indexed in Webmaster Tools.
Something like that.
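If you want to check this yourself, here's a minimal sketch that fetches a sitemap and reports the HTTP status of every URL in it, without following redirects, so a 302 shows up as a 302. The sitemap path and the standard `<loc>` layout are assumptions; swap in your own domain.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical sitemap location -- substitute your own domain.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Namespace used by the standard sitemap protocol for <loc> elements.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    """Report the HTTP status of every URL listed in a sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        # allow_redirects=False so redirects report their own status
        # (e.g. 302) instead of the status of their target.
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(status, url)  # a pile of 302s here would back up the image-page theory

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```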
-
Hmmm. Looks like it's been in that status since before 9/10. Check out the attachment...
-
How long has the site been in the "submitted" status?
-
Thanks for the reply, Trung! Here's what I found...
- No errors or warnings in WMT.
- To be honest, I'm not sure how to do this one.
- The 'Index Status' report in WMT says there's a total of 86 indexed pages as of 10/5/14!! That's great, but even more confusing when I consider my original question.
- The "site:" search confirms what's stated above - "About 87 results", it says.
So, it looks like we're in the clear. I'm just not sure what the Crawl > Sitemaps > Web pages figure of 57 Submitted vs. 5 Indexed means. Strange, huh? (I attached a pic so you can see what I'm referencing.)
Thanks again!
-
Hi Nate,
Glad to hear that the new site is a success! A few things I would check:
- An obvious one: check to see if there are any errors/warnings reported in Webmaster Tools for your sitemap.
- Review the URLs included in your XML sitemap to ensure that only URLs responding 200 are included. (The exception is if you're migrating URLs and want to include URLs responding 301 so that they're picked up by search engines faster, theoretically anyway. But you'd want to update the sitemap once the 301s drop out of the index.) The status-check sketch earlier in this thread is one way to do this.
- Check the 'Index Status' report in Webmaster Tools to make sure there isn't an unanticipated decline in indexed pages.
- Do a 'site:yoursite.com' query to get a broad sense of which pages are actually included in the search results. I've found the Webmaster Tools Submitted/Indexed numbers to vary in accuracy--not super reliable in my experience. The goal of this check is to ensure that your site's main pages are indexed. You can include subfolders if you want to get more granular, e.g. 'site:yoursite.com/subfolder'.
- Monitor the pages sending visits from organic search in Google Analytics. This will also give you a better sense of what's indexed; see the sketch after this list.
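To make that last check concrete, here's a minimal sketch that reads a Google Analytics organic landing-page report exported as CSV and lists the pages receiving organic sessions; any page getting organic search visits must be indexed. The file name and the 'Landing Page'/'Sessions' column headers are assumptions based on a typical export, so adjust them to match yours.

```python
import csv

# Hypothetical export file and column names -- adjust to match your GA report.
EXPORT_FILE = "organic_landing_pages.csv"
PAGE_COL = "Landing Page"
SESSIONS_COL = "Sessions"

def pages_with_organic_visits(path):
    """Return (page, sessions) pairs for landing pages with organic traffic.

    A page receiving organic search visits must be in Google's index,
    so this gives a rough lower bound on what's actually indexed.
    """
    pages = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # GA exports often format counts with commas, e.g. "1,204".
            sessions = int(row[SESSIONS_COL].replace(",", ""))
            if sessions > 0:
                pages[row[PAGE_COL]] = pages.get(row[PAGE_COL], 0) + sessions
    return sorted(pages.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for page, sessions in pages_with_organic_visits(EXPORT_FILE):
        print(f"{sessions:>6}  {page}")
```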
Hope this helps you get started!
-Trung
Related Questions
-
Quick Fix to "Duplicate page without canonical tag"?
When we pull up Google Search Console, in the Index Coverage section, under the Excluded category, there is a sub-category called 'Duplicate page without canonical tag'. The majority of the 665 pages in that section are from a test environment. If we included a wildcard in the robots.txt file covering every URL that starts with the particular root URL ("www.domain.com/host/"), could we eliminate the majority of these errors? That solution is not one of the 5 or 6 recommended solutions that the Google Search Console Help text suggests. It seems like a simple, effective solution. Are we missing something?
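One way to sanity-check a rule like that before deploying it is Python's built-in robots.txt parser; a minimal sketch, with the Disallow pattern and test URLs assumed from the question. (Worth noting: robots.txt blocks crawling rather than indexing, which is presumably why Google's help text doesn't list it as a fix.)

```python
from urllib.robotparser import RobotFileParser

# Proposed rule from the question; "/host/" is the assumed test-environment root.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /host/",
])

# Verify the rule actually covers the test-environment URLs and nothing else.
print(rp.can_fetch("Googlebot", "https://www.domain.com/host/some-test-page"))  # False
print(rp.can_fetch("Googlebot", "https://www.domain.com/real-page"))            # True
```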
Technical SEO | CREW-MARKETING
-
Any idea why pages are not being indexed?
Hi Everyone, One section on our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so I thought it was strange. Here is an example of one: https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html If you take a chunk of text from it, it is not found in Google. No issues in Bing/Yahoo, only Google. Do you think it takes a submission to Search Console? Jeff
Technical SEO | vetofunk
-
Webmaster Tools keeps showing old 404 errors but doesn't show a "Linked From" URL. Why is that?
Hello Moz Community. I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem: some links were being formed the wrong way (a loop was generating links on the fly). The error was identified and fixed back then, but before the fix Google indexed a lot of those malformed pages. Recently we've seen in our Webmaster account that some of those links still appear as 404s, even though we no longer have the issue or any internal link pointing to those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error. Could it be that they're still in Google's cache or memory? We're not really sure. If anyone has an idea of what these errors mean, we'd really appreciate the help. Thanks.
Technical SEO | revimedia
-
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with thousands of articles, some going back to the early 2000s. Some of the content was duplicated from other sites, has no external links pointing to it, and gets little or no traffic. As we weed these pages out, we set them to 410 to let Google know this is not an error: we are getting rid of them on purpose, so Google should too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools. (Question) I have been going through and "Marking as Fixed" in GWT to clear these pages out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 sense I intended, and I am betting Google means fixed as in they return a 200 (if that makes sense). Any opinions on the best way to handle this? Thx!
Technical SEO | CleverPhD
-
What to do with a "show all" page
Hello, What should I do in the following situation: in an e-commerce shop I have an option to "show all products" (list all products on one page). Do I need to point a canonical or a 301 redirect somewhere, or should I leave it as a normal page? I think Google considers it a duplicate, since everything is the same (only the number of products is different). Regards, Nenad
Technical SEO | Uniline
-
Getting Recrawled by Google
I have been updating my site a lot, and some of the updates are showing up in Google while some are not. Is there a best practice for getting your site fully recrawled by Google?
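Beyond waiting, one common nudge (at the time of this thread) was to re-submit the sitemap and ping Google's sitemap endpoint whenever it changes. A minimal sketch, assuming the ping endpoint Google documented back then and a hypothetical sitemap URL:

```python
import requests

# Hypothetical sitemap URL -- substitute your own.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint as documented at the time of this thread
# (it has since been deprecated); it asks Google to re-fetch the sitemap.
resp = requests.get("https://www.google.com/ping",
                    params={"sitemap": SITEMAP_URL}, timeout=10)
print(resp.status_code)  # 200 means the ping was accepted
```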
Technical SEO | ShootTokyo
-
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one specific thing: Google continuously crawls websites and stores each page it finds (let's call it the "page directory"). Google's "page directory" is a cache, so it isn't the "live" version of the page. Google has separate storage called "the index", which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is looked up in the "index", which returns all relevant pages in the "page directory". These returned pages are ranked by the algorithm. The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as on the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store the page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding the search process better.
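What's being described is essentially an inverted index: a map from each keyword to the identifiers (here, URLs) of the cached documents containing it. A toy sketch of that model, with made-up pages, just to illustrate the pointer relationship between the "index" and the "page directory":

```python
from collections import defaultdict

# Toy "page directory": cached copies of crawled pages, keyed by URL.
page_directory = {
    "www.website.com/page1": "blue widgets for sale",
    "www.website.com/page2": "red widgets and blue gadgets",
}

# Build the "index": an inverted index mapping each keyword to the
# URLs of the cached pages that contain it.
index = defaultdict(set)
for url, text in page_directory.items():
    for word in text.split():
        index[word].add(url)

# A search looks the keyword up in the index, then fetches the
# matching pages from the page directory for ranking.
for url in index["blue"]:
    print(url, "->", page_directory[url])
```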
Technical SEO | reidsteven75
-
Pages removed from Google index?
Hi All, I had around 2,300 pages in the Google index until a week ago. The index dropped a load of them and left me with 152 submitted, 152 indexed. I have just re-submitted my sitemap and will wait to see what happens. Any idea why it has done this? I have seen a drop in my rankings since. Thanks
Technical SEO | TomLondon