Only half of the sitemap is indexed
-
I have a website with high domain authority and high-quality content and a blog. I've resubmitted the sitemap half a dozen times; Search Console gets halfway through and then stops. Does anyone know a reason for this?
I've seen the usual responses of 'Google is not obligated to crawl you', but this site has been fully crawled in the past. It's very odd.
Does anyone have any ideas why it might stop halfway - or does anyone know a testing tool that might illuminate the situation?
-
Hi Andrew
Here are a few things to check or rule out:
-
Are those pages accessible to crawlers (i.e., not blocked by robots.txt, etc.)?
-
Are they also internally linked? (i.e., crawl with Screaming Frog, starting at the homepage, and see if they turn up.)
-
Is the page actually indexed (search the URL in Google) but just not showing up in Search Console?
-
How long are you waiting before resubmitting? Also, does it literally get halfway down the list, or do you mean 50% are not indexed?
Overall, I would just submit the sitemap once; you don't need to keep resubmitting. I would instead do some cross-checks to make sure the URLs are accessible (crawlable), and perhaps even indexed already, just not showing in the report. Usually there's some other issue with the URL besides the sitemap itself - and I'm not sure how long you're waiting between resubmissions, but it can indeed take weeks for pages to show up.
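The first two cross-checks above can be scripted. Below is a minimal standard-library Python sketch that flags sitemap URLs a given crawler would be blocked from fetching. The inline sitemap and robots.txt are illustrative assumptions (example.com is a placeholder, not the site in question); in practice you would fetch both from your own domain:

```python
# Hypothetical cross-check: which sitemap URLs does robots.txt block?
# Uses only the Python standard library.
from urllib import robotparser
import xml.etree.ElementTree as ET

# Illustrative robots.txt content (placeholder rules).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

# Illustrative sitemap (placeholder URLs).
SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/blog/post-1</loc></url>
  <url><loc>https://example.com/private/draft</loc></url>
</urlset>
"""

def blocked_urls(robots_txt, sitemap_xml, user_agent="Googlebot"):
    """Return the sitemap URLs the given user agent is not allowed to crawl."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]
    return [url for url in locs if not rp.can_fetch(user_agent, url)]

print(blocked_urls(ROBOTS_TXT, SITEMAP_XML))
```

Any URL this prints is one Google was told not to crawl, which would explain it never becoming indexed regardless of how often the sitemap is resubmitted.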
-
Related Questions
-
Google Search Console Not Indexing Pages
Hi there! I have a problem that I was hoping someone could help me with. In Google Search Console, my website does not seem to be indexed well. In fact, even after rectifying the problems that Moz's on-demand crawl pointed out, the pages still do not become "valid". Google has listed some excluded pages; I have rectified some of the issues, but it doesn't seem to be helping. However, when I submitted the sitemap, it said the URLs were discoverable, so I am not sure why they can be discovered but are not deemed "valid". I would sincerely appreciate any suggestions or insights on how to solve this issue. Thanks!
Algorithm Updates | Chowsey0 -
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than the robots.txt file is to make sure pages do not get indexed if their hyperlinks are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even though links to the pages are available, will Google really index those pages? One of our sites was blocked via the robots.txt file, but internal links to it have been available on the internet for years and have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
Algorithm Updates | vtmoz0 -
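To make the contrast in that question concrete, here is a minimal sketch of the two mechanisms (the /private/ path and markup are illustrative placeholders, not from the original site). A robots.txt disallow only stops crawling; a blocked URL can still be indexed from external links, just without its content. A meta robots tag requires the page to stay crawlable so the tag can be read, after which it keeps the page out of the index:

```
# robots.txt (at the site root) - stops crawling, but a blocked URL can
# still appear in the index via external links, just without its content:
User-agent: *
Disallow: /private/

# meta robots tag (in each page's <head>) - the page must remain
# crawlable so the tag can be read; it then prevents indexing:
<meta name="robots" content="noindex, follow">
```

That is why the common advice is: use meta robots noindex (with the pages crawlable) when the goal is keeping URLs out of the index, and robots.txt when the goal is merely reducing crawl activity.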
Duplicate pages in language versions, noindex in sitemap and canonical URLs in sitemap?
Hi SEO experts! We are currently in the midst of reducing our amount of duplicate titles in order to optimize our SEO efforts. A lot of the "duplicate titles" come from having several language versions of our site. Therefore, I am wondering:
1. If we start using "" to make Google (and others) aware of alternative language versions of a given site/URL, how big a problem will "duplicate titles" then be across our domains/site versions?
2. Is it a problem that we include in our sitemap (many) URLs to pages that are marked with noindex?
3. Are there any problems with having a sitemap that includes pages with canonical URLs pointing to other pages?
Thanks in advance!
Algorithm Updates | TradingFloor.com0 -
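The markup elided in point 1 of that question is presumably hreflang alternate annotations, which can live either in each page's head or directly in the sitemap. A minimal sitemap sketch under that assumption (the example.com URLs and the en/da language pair are illustrative placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/page</loc>
    <!-- Each language version lists every alternate, including itself -->
    <xhtml:link rel="alternate" hreflang="en"
                href="https://example.com/en/page"/>
    <xhtml:link rel="alternate" hreflang="da"
                href="https://example.com/da/page"/>
  </url>
</urlset>
```

With correct hreflang annotations, Google treats the language versions as alternates of one another rather than as competing duplicates, which mitigates the duplicate-title concern across site versions.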
How To Index Backlinks Easily?
I have already pinged my backlinks, including pinging the individual URLs, but the backlinks are still not indexed. How do I get my backlinks indexed?
Algorithm Updates | surabhi60 -
How do you get photo galleries indexed on Google News?
I work for a news site and some of our photo galleries get indexed by Google News while others never do. I'm trying to determine why some are more successful than others even though they all follow the same guidelines regarding keyword-rich headlines & copy, h1s, etc. When comparing what's been indexed in the past with current galleries, there doesn't appear to be an obvious pattern. Can anyone share some insight into this?
Algorithm Updates | BostonWright0 -
Removing secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a robots.txt file specifically for the secure subdomain that disallows everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you,
Algorithm Updates | marketing_zoovy.com0 -
Why does Google index the IP address instead of the domain name?
I have a website, and Google is now indexing its IP address instead of the domain name. I have set up a 301 redirect to the domain name, but how do I change the indexed IP to the domain name? And why does Google index the IP address at all?
Algorithm Updates | frankfans1170
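A common fix for the situation in that last question is a blanket 301 redirect for any request arriving under a non-canonical host, such as the raw IP. A hypothetical Apache .htaccess sketch, with example.com standing in for the real domain (the rule itself is an illustration, not the asker's actual configuration):

```apache
RewriteEngine On
# If the Host header is anything other than the canonical domain
# (e.g. a raw IP address), issue a permanent (301) redirect to it.
RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

Once such a redirect is in place, the indexed IP URLs should drop out over time as Google recrawls them; a canonical link tag on every page pointing at the domain version can reinforce the signal.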