How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
-
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions.
As suspected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted.
Lastly, the site was built using Squarespace and was launched in the middle of August.
**Question:** When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The submitted sitemap seems to be complete. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas?
Thanks!!
-
Great answer! I noticed SS assigns images to their own pages.
Is there a "best practice" for addressing this? Should I try to exclude the pages from being indexed?
-
You mentioned SS is your platform, so it's probably an image/CDN issue.
Each image is given its own page, and since that page returns a 302, it won't get indexed even if it's in the sitemap.
If you have a lot of images on your site, then a good chunk of pages won't show as indexed in Webmaster Tools.
Something like that.
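To make the idea above concrete: if you crawl the URLs from your sitemap and record each one's first-hop status code, the 302ing image pages are exactly the entries that won't count toward the Indexed number. A minimal sketch (the URLs and the `non_indexable` helper are hypothetical, just to illustrate the filter):

```python
# Hypothetical sketch: given (url, status_code) pairs collected from a crawl
# of your sitemap URLs, flag entries that return something other than 200 —
# these won't show as indexed even though they were submitted.
def non_indexable(entries):
    """Return URLs whose first-hop status code is not 200."""
    return [url for url, status in entries if status != 200]

entries = [
    ("https://example.com/", 200),
    ("https://example.com/image-page", 302),  # a Squarespace-style image page
]
print(non_indexable(entries))  # ['https://example.com/image-page']
```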
-
Hmmm. Looks like it's been that way since before 9/10. Check out the attachment...
-
How long has the site been in the "submitted" status?
-
Thanks for the reply, Trung! Here's what I found...
- No errors or warnings in WMT.
- To be honest, I'm not sure how to do this one
- The 'Index Status' report in WMT says there are a total of 86 indexed pages as of 10/5/14! Which is great, but even more confusing when I consider my original question.
- The "site:" search confirms what's stated above - "About 87 results", it says.
So, it looks like we're in the clear. I'm just not sure what Crawl > Sitemaps > Web pages showing 57 Submitted vs. 5 Indexed means. Strange, huh? (I attached a pic so you can see what I'm referencing.)
Thanks again!
-
Hi Nate,
Glad to hear that the new site is a success! A few things I would check:
- An obvious one: check to see if there are any errors/warnings reported in webmaster tools for your sitemap.
- Review the URLs included in your XML sitemap to ensure that only URLs responding 200 are included. (The exception is if you're migrating URLs and want to include URLs responding 301 so that they're picked up by search engines faster, theoretically anyway. But you'd want to update the sitemap once the 301s are removed from the index.)
- Check the 'Index Status' report in Webmaster Tools to make sure there isn't an unanticipated decline in indexed pages.
- Do a 'site:yoursite.com' query to get a broad sense of what pages are actually included in the search results. I've found the Webmaster Tools Submitted/Indexed numbers to vary in accuracy; they're not super reliable in my experience. The goal of this check is to ensure that your site's main pages are indexed. You can include subfolders if you want to get more granular, e.g. 'site:yoursite.com/subfolder'.
- Monitor the pages sending visits from organic search in Google Analytics. This will also give you a better sense of what's indexed.
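A quick way to start on the sitemap review above is to pull the `<loc>` URLs out of the XML and test each one's status code yourself. A minimal sketch (the sample sitemap is made up; the parsing uses Python's standard library):

```python
import xml.etree.ElementTree as ET

# The sitemaps.org namespace used by standard XML sitemaps.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <url>/<loc> URL from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(sitemap_urls(sample))  # ['https://example.com/', 'https://example.com/about']
```

From there you could loop over the extracted URLs with an HTTP client that has redirects disabled and report anything that doesn't answer 200, which covers the 302 image-page case discussed earlier in the thread.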
Hope this helps you get started!
-Trung