Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How long after an HTTPS migration does Google show the new sitemap as indexed in Search Console?
-
We migrated to HTTPS 4 days ago and followed best practices.
In Search Console, 80% of our sitemaps still appear as "pending", and among the sitemaps that were processed, less than 1% of submitted pages appear as indexed. Is this normal?
How long does it take for Google to index pages from a sitemap?
Before the HTTPS migration nearly all our pages were indexed, and I can see in the crawl stats that since the migration Google has crawled roughly as many pages each day as there are submitted pages in the sitemap. The sitemap and crawl stats show no errors.
-
Thanks, Stephan.
It took nearly a month for Search Console to show the majority of our sitemap pages as indexed, even though the pages showed up much earlier in the SERPs. We had split it into 30 different sitemaps. Later we also published a sitemap index and saw a nice increase in indexed pages a few days later, which may have been related.
Google is now indexing 88% of our sitemap.
Do you think 88% is a roughly normal percentage for a site of this size, or would you normally expect a higher percentage of indexed sitemap pages and investigate deeper for pages that Google may consider thin content? I can rule out navigation as a reason. -
Did the "pending" message go away in the end? Unfortunately you're fairly limited in what you can do about this. The message likely indicates that one of the following was true:
- Google had difficulty accessing the sitemap (though you did say there were no errors)
- Processing was taking a long time because of the large number of URLs
You could try splitting your sitemap up into several smaller ones, and using a sitemap index. Or have you done this already? By splitting it into several sitemaps, you can at least see whether some index and some don't, whether there do turn out to be issues with some of the URLs listed there, etc.
You can also prioritise the most important pages by putting them into their own sitemap (linked to from the sitemap index, of course), and submitting that one first. So at least if everything else takes longer you'll get your most important landing pages indexed.
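The splitting approach above can also be scripted. Below is a minimal sketch (the helper names and the example domain are made up, not from any particular tool) that chunks a URL list into child sitemaps of at most 50,000 URLs each - the per-file limit in the sitemaps.org protocol - and renders a sitemap index pointing at them:

```python
from datetime import date

MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol


def build_sitemap(urls):
    """Render one <urlset> sitemap for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")


def build_index(sitemap_urls):
    """Render the <sitemapindex> that points at every child sitemap."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <sitemap><loc>{u}</loc><lastmod>{today}</lastmod></sitemap>"
        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>")


def split_sitemaps(urls, base="https://www.example.com", max_urls=MAX_URLS):
    """Chunk the URL list and return (index_xml, [sitemap_xml, ...])."""
    chunks = [urls[i:i + max_urls] for i in range(0, len(urls), max_urls)]
    sitemaps = [build_sitemap(c) for c in chunks]
    index = build_index(
        f"{base}/sitemap-{n}.xml" for n in range(1, len(chunks) + 1))
    return index, sitemaps
```

To follow the prioritisation advice, order the input list so the most important landing pages fall into the first chunk, then submit that child sitemap first.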
-
Update: 10 days have now passed since our migration to HTTPS and submission of the sitemap, and the situation is unchanged.
-
Google has been crawling all our pages over the last few days. I can see it in the crawl stats.
My concern is that:
- the majority of my sitemaps still show up as "pending" 3 days after I originally submitted them.
- the sitemaps that have been processed show less than 1% of submitted pages as indexed.
We have around 170,000 pages in our sitemap.
So I wonder whether this is unusual, or a normal delay in Google Search Console.
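One sanity check at this scale: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so 170,000 pages need at least four files. A quick count of `<url>` entries per file (a sketch with a made-up helper name, using only the standard library) confirms no single sitemap exceeds the limit:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
MAX_URLS = 50_000  # sitemaps.org per-file limit


def count_urls(sitemap_xml):
    """Count <url> entries in a sitemap; also report whether it is under the limit."""
    root = ET.fromstring(sitemap_xml)
    n = len(root.findall(f"{NS}url"))
    return n, n <= MAX_URLS
```

Run this over each sitemap file's contents; any file over the limit would be a plausible cause of "pending".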
-
It's difficult to say. It depends on many factors (the importance of your site in Google's eyes, when they last crawled your site, the relevance of the topic in general, etc.), BUT you can speed up the process a lot, i.e. initiate it yourself. You don't have to wait until Google recrawls your site at random. Did you know?
Go to Search Console > Crawl > Fetch as Google, add your site's URL or the URL of a particular subpage, and press Fetch.
Google will recrawl that page very quickly. When I do this with a particular page (not the entire domain), it usually takes 1-2 days at most to recrawl and index it again.
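Before requesting a refetch, it's worth ruling out on-page blockers first. A minimal sketch (assuming plain HTML pages; the class and function names are made up) that checks a page's markup for a robots `noindex` directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())


def has_noindex(html):
    """True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Run it over the HTML of a sample of URLs (fetched with any HTTP client) before using Fetch as Google. Note it does not cover an `X-Robots-Tag` response header, which can also block indexing and should be checked separately.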
Hope this helps.