How long after an HTTPS migration does Google Search Console show the new sitemap as indexed?
-
We migrated to HTTPS 4 days ago and followed best practices.
In Search Console, 80% of our sitemaps still appear as "pending", and among the sitemaps that have been processed, fewer than 1% of submitted pages appear as indexed. Is this normal?
How long does it take for Google to index pages from a sitemap?
Before the HTTPS migration nearly all our pages were indexed, and the crawl stats show that since the migration Google has crawled roughly as many pages per day as the number of URLs submitted in the sitemap. The sitemap reports and crawl stats show no errors.
-
Thanks Stephan.
It took nearly a month for Search Console to show the majority of our sitemap pages as indexed, even though the pages showed up much earlier in the SERPs. We had split the URLs across 30 different sitemaps. Later we also published a sitemap index and saw a nice increase in indexed pages a few days later, which may have been related.
Google is now indexing 88% of our sitemap URLs.
Do you think 88% is a fairly normal percentage for a site of this size, or would you normally expect a higher share of sitemap pages to be indexed and dig deeper for pages that Google may consider thin content? Navigation I can rule out as a reason.
-
Did the "pending" message go away in the end? Unfortunately you're fairly limited in what you can do with this. The message likely indicates/indicated that one of the following was true:
- Google had difficulty accessing the sitemap (though you did say no errors)
- Processing was simply taking a long time because of the large number of URLs
You could try splitting your sitemap into several smaller ones and using a sitemap index. Or have you done this already? By splitting it into several sitemaps, you can at least see whether some get indexed and some don't, and whether there turn out to be issues with some of the URLs listed in them.
You can also prioritise the most important pages by putting them into their own sitemap (linked to from the sitemap index, of course), and submitting that one first. So at least if everything else takes longer you'll get your most important landing pages indexed.
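To make the splitting concrete, here is a minimal sketch of how a flat list of URLs could be broken into smaller sitemaps plus a sitemap index. The urls.txt input, the example.com domain, the file names and the 10,000-URL chunk size are all illustrative assumptions rather than values from this thread (the sitemap protocol itself allows up to 50,000 URLs per file):

```python
# Minimal sketch: split a flat URL list into several sitemaps plus a
# sitemap index. File names, domain and chunk size are placeholders.
from pathlib import Path
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
CHUNK_SIZE = 10_000                    # protocol limit is 50,000 URLs per file
BASE_URL = "https://www.example.com"   # placeholder domain

def write_sitemap(path: Path, urls: list[str]) -> None:
    """Write one <urlset> sitemap containing the given URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    path.write_text(
        f'<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>\n',
        encoding="utf-8",
    )

def write_index(path: Path, sitemap_locs: list[str]) -> None:
    """Write a <sitemapindex> pointing at the individual sitemap files."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in sitemap_locs
    )
    path.write_text(
        f'<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{entries}\n</sitemapindex>\n',
        encoding="utf-8",
    )

# urls.txt is assumed to hold one URL per line.
urls = [u.strip() for u in Path("urls.txt").read_text().splitlines() if u.strip()]
sitemap_locs = []
for i in range(0, len(urls), CHUNK_SIZE):
    name = f"sitemap-{i // CHUNK_SIZE + 1}.xml"
    write_sitemap(Path(name), urls[i : i + CHUNK_SIZE])
    sitemap_locs.append(f"{BASE_URL}/{name}")
write_index(Path("sitemap-index.xml"), sitemap_locs)
```

Prioritising the most important landing pages, as suggested above, then just comes down to controlling which URLs go into the first chunk (or into a dedicated sitemap of their own).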
-
Update: 10 days have now passed since our migration to HTTPS and the sitemap upload, and the situation is still the same.
-
Google has been crawling all our pages over the last few days; I can see it in the crawl stats.
My concern is that:
- the majority of my sitemaps are still showing as "pending" 3 days after I originally submitted them
- the sitemaps that have been processed show fewer than 1% of my submitted pages as indexed
We have around 170,000 pages across our sitemaps. So I wonder whether this is an unusual or a normal delay from Google Search Console.
-
It's difficult to say. It depends on many factors (the importance of your site in Google's eyes, when your site was last crawled, the general relevance of the topic, etc.), but you can speed the process up a lot by initiating it yourself. You don't have to wait for Google to recrawl your site at random. Did you know?
Go to Search Console > Crawl > Fetch as Google, add your site's URL or the URL of a particular sub-page, and press Fetch.
Google will recrawl that page very quickly. When I do this with a particular page (not the entire domain), it usually takes 1-2 days at most for it to be recrawled and indexed again.
Hope this helps.
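Alongside the manual Fetch as Google route, sitemap processing status can also be monitored programmatically through the sitemaps resource of the Search Console (Webmasters v3) API, which is handy when you have dozens of sitemaps to watch as in this thread. Here is a minimal sketch using google-api-python-client, assuming valid OAuth credentials for an account that owns the property; the siteUrl is a placeholder:

```python
# Minimal sketch: list submitted sitemaps for a verified Search Console
# property and print their processing status. Assumes
# google-api-python-client is installed and `creds` holds valid OAuth
# credentials for an account that owns the property; SITE_URL is a
# placeholder.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # placeholder property

def print_sitemap_status(creds) -> None:
    service = build("webmasters", "v3", credentials=creds)
    response = service.sitemaps().list(siteUrl=SITE_URL).execute()
    for sitemap in response.get("sitemap", []):
        submitted = sum(
            int(c.get("submitted", 0)) for c in sitemap.get("contents", [])
        )
        print(
            sitemap.get("path"),
            "pending" if sitemap.get("isPending") else "processed",
            f"last downloaded: {sitemap.get('lastDownloaded', 'never')}",
            f"submitted URLs: {submitted}",
        )
```

This surfaces the same data as the Sitemaps report in the UI, so it doesn't speed anything up by itself, but it makes it easier to track when each of the 30 sitemaps moves out of the "pending" state.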
Related Questions
-
Google Image Search - Is there a way to influence the related icons at the top of the image search results?
Google recently added related icons at the top of the image search results page. Some of the icons may be unrelated to the search. Are there any best practices to influence what is positioned in the related image icons section? Thank you.
Intermediate & Advanced SEO | JaredBroussard1 -
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link building strategy... TIA!
Intermediate & Advanced SEO | LindsayE1 -
Google Is Indexing my 301 Redirects to Other sites
Long story, but I now have a few links from my site 301 redirecting to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites, so basically I have backlinks from places I don't own pointing to my redirect URLs (e.g. http://example.com/redirect). My problem is that Google is indexing them and won't let them go. I have tried blocking that URL in robots.txt, but Google is still indexing it uncrawled; I have also tried allowing Google to crawl it and adding noindex via robots.txt; and I have tried removing it from GWT, but it pops back again after a few days. Any ideas? Thanks!
Intermediate & Advanced SEO | cuarto7150 -
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means the pages will not be crawled, but they can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but then they won't be crawled. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt rule and add the noindex? Or just add the noindex to what I already have?
Intermediate & Advanced SEO | Tylerj0 -
How is Google crawling and indexing this directory listing?
We have three Directory Listing pages that are being indexed by Google: http://www.ccisolutions.com/StoreFront/jsp/ http://www.ccisolutions.com/StoreFront/jsp/html/ http://www.ccisolutions.com/StoreFront/jsp/pdf/ How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those Directory Listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the Directory Listing referenced above - http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that Directory Listing page and, provided that we have this URL in our sitemap: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff, solve the duplicate content issue as a result? For example: Disallow: /StoreFront/jsp/ Disallow: /StoreFront/jsp/html/ Disallow: /StoreFront/jsp/pdf/ Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
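As a side note on the Disallow rules proposed above, Python's standard-library robots.txt parser can be used to sanity-check, before deploying, that the directory listings would be blocked while the real category page stays crawlable. A rough sketch using the rules and URLs from the question (the User-agent line is an assumed addition, since robots.txt rules need one):

```python
# Rough sketch: test proposed robots.txt rules against the URLs from the
# question before deploying them, using only the standard library.
from urllib.robotparser import RobotFileParser

proposed_rules = """\
User-agent: *
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
"""

parser = RobotFileParser()
parser.parse(proposed_rules.splitlines())

test_urls = [
    "http://www.ccisolutions.com/StoreFront/jsp/",                           # expect: blocked
    "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML",  # expect: blocked
    "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff",  # expect: allowed
]

for url in test_urls:
    verdict = "ALLOWED" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)
```

Keep in mind this only answers the crawling question; URLs that are already indexed can stay in the index after being blocked, which is the indexing half of the problem described above.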
Intermediate & Advanced SEO | danatanseo0 -
How to find all indexed pages in Google?
Hi, We have an ecommerce site with around 4,000 real pages, but our index count is at 47,000 pages in Google Webmaster Tools. How can I get a list of all the indexed pages of our domain? I'm trying to locate the duplicate content. Doing a "site:www.mydomain.com" search only returns up to 676 results... Any ideas? Thanks, Ben
Intermediate & Advanced SEO | bjs20100 -
Why is Google Displaying this image in the search results?
Hi, I'm looking for advice on how to remove or change a particular image Google is displaying in the search results. I have attached a screenshot. At first glance, I assumed the image would be related and appear on the dealer's Google+ Local page: https://plus.google.com/118099386834104087122/about?hl=en But there are no photos there. The image seems to be coming from the website. Is there a way to stop Google from displaying this image or to make them display a totally different image? Thanks, Chris
Intermediate & Advanced SEO | Mattcarter080 -
How to get content to index faster in Google.....pubsubhubbub?
I'm curious to know what tools others are using to get their content indexed faster (other than an HTML sitemap, Pingomatic, Twitter, etc.). Would installing the WordPress PubSubHubbub plugin help, even though it uses Pingomatic? http://wordpress.org/extend/plugins/pubsubhubbub/
Intermediate & Advanced SEO | webestate0