Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How long after an HTTPS migration does Google Search Console show the new sitemap being indexed?
-
We migrated to HTTPS 4 days ago and followed best practices.
In Search Console, 80% of our sitemaps still appear as "pending", and among the sitemaps that have been processed, less than 1% of submitted pages appear as indexed. Is this normal?
How long does it take Google to index pages from a sitemap?
Before the HTTPS migration nearly all our pages were indexed, and I can see in the crawl stats that since the migration Google has crawled a number of pages each day that corresponds to the number of submitted pages in the sitemap. The sitemap and crawl stats show no errors.
-
Thanks, Stephan.
It took nearly a month for Search Console to show the majority of our sitemap pages as indexed, even though the pages showed up much earlier in the SERPs. We had split it into 30 different sitemaps. Later we also published a sitemap index and saw a nice increase in indexed pages a few days afterwards, which may have been related.
Google is now finally indexing 88% of our sitemap.
In general, do you think 88% is a fairly normal percentage for a site of this size, or would you normally expect a higher percentage of sitemap pages to be indexed and dig deeper for pages that Google may consider thin content? I can rule out navigation as a reason.
-
Did the "pending" message go away in the end? Unfortunately you're fairly limited in what you can do with this. The message likely indicates/indicated that one of the following was true:
- Google had difficulty accessing the sitemap (though you did say no errors)
- It was simply taking a long time to process because of the large number of URLs
You could try splitting your sitemap up into several smaller ones, and using a sitemap index. Or have you done this already? By splitting it into several sitemaps, you can at least see whether some index and some don't, whether there do turn out to be issues with some of the URLs listed there, etc.
You can also prioritise the most important pages by putting them into their own sitemap (linked to from the sitemap index, of course), and submitting that one first. So at least if everything else takes longer you'll get your most important landing pages indexed.
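To make the idea concrete, here is a minimal Python sketch of the split-plus-index approach; the domain, file names, and chunk size are placeholders, not anything specific to the site in question:

```python
# Minimal sketch: split a large URL list into several child sitemaps plus
# one sitemap index. Domain, file names, and chunk size are placeholders.
import math
from xml.sax.saxutils import escape

BASE = "https://www.example.com"  # placeholder domain
CHUNK = 10000                     # URLs per child sitemap (protocol limit: 50,000)

def write_sitemaps(urls):
    """Write numbered child sitemaps and a sitemap index that lists them."""
    n_files = math.ceil(len(urls) / CHUNK)
    for i in range(n_files):
        chunk = urls[i * CHUNK:(i + 1) * CHUNK]
        with open(f"sitemap-{i + 1}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file is the one you submit in Search Console.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(n_files):
            f.write(f"  <sitemap><loc>{BASE}/sitemap-{i + 1}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# e.g. write_sitemaps([f"{BASE}/page/{i}" for i in range(170000)])
```

Prioritising the most important landing pages, as suggested above, is then just a matter of which URLs go into which child sitemap.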
-
Update: 10 days have now passed since our migration to HTTPS and the sitemap upload, and the situation is still the same.
-
Google has been crawling all our pages over the last few days; I can see it in the crawl stats.
My concern is that:
- the majority of my sitemaps are still showing as "pending" 3 days after I originally submitted them.
- the sitemaps that have been processed show less than 1% of my submitted pages as indexed.
We have around 170,000 pages in our sitemap.
So I wonder whether this is an unusual or a normal delay from Google Search Console.
-
It's difficult to say. It depends on many factors (the importance of your site in Google's eyes, when they last crawled your site, the relevance of the topic in general, etc.), BUT you can speed the process up a lot, i.e. initiate it on your own. You don't have to wait until Google recrawls your site at random. Did you know?
Go to Search Console > Crawl > Fetch as Google, add your site's URL (or the URL of a particular subpage), and press Fetch.
Google will then recrawl that page very quickly. When I do this with a particular page (not the entire domain), it usually takes 1-2 days at most for it to be recrawled and indexed again.
Hope this helps.
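For sitemaps (as opposed to individual pages), a programmatic nudge was also possible via Google's sitemap ping endpoint, which existed at the time of this thread (Google has since deprecated it). A hedged sketch in Python; the sitemap URL is a placeholder:

```python
# Sketch: ask Google to re-fetch a sitemap via the legacy ping endpoint.
# The sitemap URL below is a placeholder. Note: Google deprecated this
# endpoint in 2023; at the time of this thread it was the standard way
# to resubmit a sitemap outside of Search Console.
from urllib.request import urlopen
from urllib.parse import quote

sitemap_url = "https://www.example.com/sitemap-index.xml"  # placeholder
ping = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

with urlopen(ping) as resp:
    # A 200 only confirms the ping was received, not that anything was indexed.
    print(resp.status)
```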
Related Questions
-
Staging website got indexed by Google
Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and make it noindex? Note: we already added a meta NOINDEX in the head tag.
Intermediate & Advanced SEO | Asmi-Ta
-
New Subdomain & Best Way To Index
We have an ecommerce site, we'll say at https://example.com. We have created a series of brand new landing pages, mainly for PPC and social, at https://sub.example.com, but we would also like these to get indexed. These are built on Unbounce, so there is an easy option to simply uncheck the box that says "block page from search engines", but I am trying to speed this process up while also doing it the best/correct way. I've read a lot about how we should build landing pages in a subdirectory, but one of the main issues we are dealing with is long page load times on https://example.com, so I wanted a kind of fresh start. I was thinking a potential solution to index these quickly/correctly was to create a redirect such as https://example.com/forward-1 -> https://sub.example.com/forward-1 and then submit https://example.com/forward-1 to Search Console, but I am not sure if that will even work. Another possible solution was to put some of the subdomain links on the root domain, say right on the pages or in the navigation. Also, will I definitely be hurt by 'starting over' with a new website, even though my MozBar shows the same domain authority (DA) for the subdomain https://sub.example.com as for the root domain https://example.com? Recommendations and steps to be taken are welcome!
Intermediate & Advanced SEO | Markbwc
-
Can you disallow links via Search Console?
Hey guys, is it possible in any way to nofollow links via Search Console (not disavow them, but just nofollow external links pointing to your site)? Cheers.
Intermediate & Advanced SEO | lohardiu9
-
Sitemap: unique sitemap or different sitemaps by Country
Hi guys, I have a question about sitemaps. We are building an international site, e.g. www.offers.com for the landing page, www.offers.com/br for Brazil, www.offers.com/it for Italy, etc. I don't know if we should use a single sitemap for all countries or separate sitemaps by country, e.g.:
- single sitemap: www.offers.com/sitemap.xml, including all sitemaps
- per-country sitemap: www.offers.com/br/sitemap.xml, for the Brazilian market only
Thank you
Intermediate & Advanced SEO | thekiller99
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML
This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff
As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff in our sitemap, solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
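As an aside, rules like the ones proposed in this question can be sanity-checked locally before deploying. A sketch using Python's standard urllib.robotparser, with the URLs taken from the question above:

```python
# Sketch: test whether the proposed robots.txt rules block the directory
# listings while leaving the real category page crawlable.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "http://www.ccisolutions.com"
print(rp.can_fetch("Googlebot", base + "/StoreFront/jsp/"))  # False: blocked
print(rp.can_fetch("Googlebot",
                   base + "/StoreFront/category/meet-our-sales-staff"))  # True: crawlable
```

One caveat worth knowing: Disallow only stops crawling; URLs that are already indexed may stay in the index until they are recrawled with a noindex or removed via a removal request.
-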
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (tens or hundreds of millions of pages): what's a typical (or the highest) percentage of indexed vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out:
- the average index percentage out there
- whether there is a ceiling (i.e. it will never reach 100%)
- whether it's possible to improve the indexing percentage further
Just to give you some background, sitemap index files (according to sitemaps.org) have been implemented to improve crawl efficiency, and I'm wanting to find out other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure whether this is the best path to take, or whether I'm just flogging a dead horse if there is such a ceiling or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
-
Create new subdomain or new site for new Niche Product?
We have an existing large site with strong, relevant traffic, including excellent SEO traffic. The company wants to launch a new business offering, specifically targeted at the "small business" segment. Because the "small business" customer is substantially different from the traditional "large corporation" customer, the company has decided to create a completely independent microsite for the "small business" market. Purely from a marketing and communications standpoint, this makes sense. From an SEO perspective, we have 2 options:
1. Create the new "small business" microsite on a subdomain of the existing site, and benefit from the strong domain authority and trust of the existing site.
2. Build the microsite on a separate domain with an exact primary-keyword match in the domain name.
My sense is that option #1 is by far the better option in the short and long run. Am I correct? Thanks in advance!
Intermediate & Advanced SEO | axelk
-
Disallowed Pages Still Showing Up in Google Index. What do we do?
We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically, we don't want to spread our link juice around to all these pages that are never going to rank. We want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed, when really it should only be showing ~5,000 pages indexed. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" into our robots.txt properly, so this should not be happening; Google should be showing 0 results. Any ideas re: how we get Google to pay attention and re-index our site properly?
Intermediate & Advanced SEO | udemy