How to speed up indexing of web pages after a website overhaul
-
We have recently overhauled our website, which has meant new URLs as we moved from ASP to PHP. We also moved from HTTP to HTTPS.
The website (https://) has 694 URLs submitted through the sitemap, with 679 shown as indexed in the Sitemaps section of Google Search Console.
As we look through Google Search Console, under the Google Index section / Index Status it says:
https://www.xyz.com version - index status 2
www.xyz.com version - index status 37
xyz.com version - index status 8
How can we get more pages indexed or found by Google sooner rather than later? We have lost major traffic.
Thanks for your help in advance.
-
You can simply use Fetch as Google, a Google Webmaster Tools feature that will quickly get your website indexed.
-
You could always use the Fetch As Google function in Search Console. This might speed things up for you, but the fact of the matter is, there is no really quick way to get Google to index your pages. It's never as fast as we would like.
Since the site has been overhauled, make sure that every old page is 301-redirected to its new counterpart. This also means checking that every HTTP page redirects to its HTTPS equivalent.
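As a rough sketch (assuming Apache with mod_rewrite; the domain is taken from your post and the .asp/.php page names are placeholders, so adjust them to your real URL mapping), the .htaccess could look something like this:

```
RewriteEngine On

# Send any http request to its https equivalent first (301 = permanent)
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.xyz.com/$1 [R=301,L]

# Then map each old .asp URL to its new .php counterpart
# (the paths below are hypothetical examples, not your real pages)
RewriteRule ^services\.asp$ /services.php [R=301,L]
RewriteRule ^contact\.asp$ /contact.php [R=301,L]
```

One design note: with rules in this order, an old http .asp URL hops twice (http to https, then .asp to .php). That works, but redirecting straight to the final https .php URL avoids the chain and is slightly friendlier to crawlers.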
-Andy
Related Questions
-
Google Indexing Of Pages As HTTPS vs HTTP
We recently updated our site to be mobile-optimized. As part of the update, we had also planned on adding SSL security to the site. However, we use an iframe from a third-party vendor on a lot of our site pages for real estate listings; that iframe was not SSL-friendly, and the vendor does not have a solution for that yet. So those iframes weren't displaying their content. As a result, we had to shift gears and go back to just being http rather than the new https we were hoping for. However, Google seems to have indexed a lot of our pages as https and gives a security error to any visitors. The new site was launched about a week ago, and there was code in the .htaccess file that was pushing to www and https. I have fixed the .htaccess file to no longer force https. My question is: will Google "reindex" the site once it recognizes the new .htaccess commands in the next couple of weeks?
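For reference, the corrected rules might look something like this (a hedged sketch, assuming Apache mod_rewrite; the domain is a placeholder): enforce www, but send https requests back to http so Google picks up the change:

```
RewriteEngine On

# Send https requests back to http while the vendor iframe is not SSL-ready
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Still enforce the www hostname over plain http
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```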
Intermediate & Advanced SEO | vikasnwu
-
22 Pages 7 Indexed
So I submitted my sitemap to Google twice this week. The first time everything was just peachy, but when I went back to do it again, Google only indexed 7 out of 22. The website is www.theinboundspot.com. My Moz campaign shows no issues, and Google Webmaster Tools shows none. Should I just resubmit it?
Intermediate & Advanced SEO | theinboundspot
-
Improvement in Page Speed worth Compromise on HTML Validation?
Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious? The validator reports: "Fatal Error: Cannot recover after last error. Any further errors will be ignored. From line 699, column 9; to line 699, column 319." OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance." The domain URL is www.nyc-officespace-leader.com
Intermediate & Advanced SEO | Kingalan1
-
What Sources to Use to Compile as Comprehensive a List as Possible of Pages Indexed in Google?
As part of a Panda recovery initiative, we are trying to get as comprehensive a list as possible of URLs currently indexed by Google. Using the site:domain.com operator, Google reports that approximately 21k pages are indexed. Scraping the results, however, ends after 240 links are listed. Are there any other sources we could use to make the list more comprehensive? To be clear, we are not looking for external crawlers like the SEOmoz crawl tool, but for sources that would confidently allow us to determine the list of URLs currently held in the Google index. Thank you /Thomas
Intermediate & Advanced SEO | sp80
-
Adding Orphaned Pages to the Google Index
Hey folks, how do you think Google will treat adding 300K orphaned pages to a 4.5-million-page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xmls. These pages are super low competition. The plot thickens: what we are really after is to get 150k real pages back on the site. These pages do have crawlable paths on the site, but in order to do that (for technical reasons) we need to push these other 300k orphaned pages live (it's an all-or-nothing deal).
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or not get included, and have no chance of ever accumulating PR anyway since they are not linked to, would it make sense to just noindex them?
c) Should we not submit sitemap.xml files at all, take our 150k, just ignore these 300k, and hope Google ignores them as well since they are orphaned?
d) If Google is OK with this, maybe we should submit the sitemap.xmls and keep an eye on the pages; maybe they will rank and bring us a bit of traffic. But we don't want to do that if it could be an issue with Google.
Thanks for your opinions, and if you have any hard evidence either way, especially thanks for that info. 😉
Intermediate & Advanced SEO | irvingw
-
Number of pages indexed for classifieds (number of pages for the same query)
I've noticed that classified sites usually have lots of pages indexed, and that's because for each query/keyword they index the first 100 result pages; normally they have 10 results per page. As an example, imagine the site www.classified.com. For the query/keyword "house for rent new york" there is the page www.classified.com/houses/house-for-rent-new-york, and the "index" is set for the first 100 SERP pages, so: www.classified.com/houses/house-for-rent-new-york, www.classified.com/houses/house-for-rent-new-york-1, www.classified.com/houses/house-for-rent-new-york-2, and so on. Wouldn't it be better to index only the first result page? I mean, in the first 100 pages lots of ads are very similar, so why should Google be happy indexing lots of similar pages? Could Google penalize this behaviour? What are your suggestions? Many thanks in advance for your help.
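If a site did want to keep only the first result page indexable, one way (a sketch, assuming Apache with mod_setenvif and mod_headers, and the hypothetical URL pattern above) is to send an X-Robots-Tag noindex header on the numbered pages:

```
# Flag paginated result URLs ending in -1, -2, ... (the pattern here is hypothetical)
SetEnvIf Request_URI "-[0-9]+$" PAGINATED
# Mark them noindex while still letting Google follow their links
Header set X-Robots-Tag "noindex, follow" env=PAGINATED
```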
Intermediate & Advanced SEO | nuroa-246712
-
Multiple Versions of Pages on One Website
Hi! My name is Sarah and I work for a brand design firm in Los Angeles. Currently we're working on a website redesign for our company. We have three pages of content that we want to add to the site, but are unsure if we will get penalized by Google if we add all of them, since they may come off as too similar. The pages are: Branding, Personal Branding, and Corporate Branding. Does anyone know if our SEO will be penalized for having all three of these pages separately, or should we just focus on Branding and include Personal Branding and Corporate Branding as subcategories on the page? Thanks! Sarah. P.S. I should also say, we will have more than just the three aforementioned pages. It's going to be a big site with around 200+ pages. (Half of them being services, which is where the Branding, PB, and CB pages will be located.)
Intermediate & Advanced SEO | Jawa
-
Convert keyword-rich PDFs to web pages (text & images)
SteriPEN is a portable water purifier that kills viruses, protozoa, E. coli, etc. Because of the technical and safety requirements of the product, our website has a lot of documentation on testing, organisms affected, and more. These documents are in PDF form and can often be found through Google search (and through links on specific pages). Because of the keyword-richness of these documents pertaining to the microbes SteriPEN kills, does it make sense to convert these PDFs into HTML text and images? I was then thinking of writing a blog post AND generating key links on important landing pages to these documents (as HTML). Might removing the PDFs be harmful? We don't have a clue as to the cost/benefit.
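One common view is that removing the PDFs need not be harmful as long as each old PDF URL 301-redirects to its new HTML page, so existing links and rankings carry over. A minimal .htaccess sketch (assuming Apache; both paths are hypothetical examples, not SteriPEN's real documents):

```
# 301 each retired PDF to its new HTML counterpart (example paths only)
Redirect 301 /docs/microbial-testing.pdf /microbial-testing.html
Redirect 301 /docs/virus-efficacy.pdf /virus-efficacy.html
```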
Intermediate & Advanced SEO | Timmmmy