How to Get Indexed Faster?
-
Hello,
I have a new website and update it with fresh content regularly, but my indexing rate is very slow.
When I searched for ways to improve the indexing rate on Google, I found that most members of the Moz community replied that there is no certain technique for improving indexing; beyond that, you should just keep posting fresh content and wait for Google to index it.
Some of them suggested submitting a sitemap and sharing posts on Twitter, Facebook, and Google Plus.
However, those comments are from 2012.
I'm curious to know whether there are any newer techniques or methods for improving the indexing rate.
Need your suggestions!
Thanks.
-
As mentioned earlier, you can create a comprehensive sitemap.xml file and resubmit it to Google. Give me your email address and I will create one for you and send it over.
Best regards,
Devanur Rafi
-
The site has about 140 articles as of now, so there are 140 post URLs. Apart from these, there are many category, tag, and author pages, which would bring the total to the 230 URLs indicated by Google. However, Google may show only the post pages as a priority. This is just my hunch. Thanks,
-
Never mind; there are about 497 pages on your website. So, I would ask you to use a comprehensive sitemap.xml file and submit it to Google. You can use a tool like GSiteCrawler to generate one.
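For reference, a sitemap.xml is simple enough to generate yourself. Below is a minimal sketch using Python's standard library; the URLs are hypothetical placeholders, and in practice your CMS or a crawler such as GSiteCrawler would supply the full list:

```python
# A minimal sitemap.xml generator using only Python's standard library.
# The URLs below are hypothetical placeholders.
import xml.etree.ElementTree as ET

page_urls = [
    "http://www.example.com/",
    "http://www.example.com/about/",
    "http://www.example.com/posts/sample-post/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in page_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    # changefreq/priority are optional hints that Google may ignore.
    ET.SubElement(entry, "changefreq").text = "daily"

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote sitemap.xml with {len(page_urls)} URLs")
```

Once written, the file goes in the site root and gets submitted under Sitemaps in Google Webmaster Tools.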
-
Thanks for your help, Kavit
The website's URL is www.getyourtips.com
According to Google's PageSpeed Insights test, the website scores 69 out of 100 on mobile devices and 79 out of 100 on desktop PCs.
https://www.dropbox.com/s/xmfw6zci8hih3sn/Screenshot 2014-06-10 14.48.09.png
While there's still room for improvement, I think the scores aren't that bad at the moment. Please take a look at the site and let me know if I could change some settings to make it load even faster.
This is a brand-new website (just 2 months old), and it might take a while to establish authority or gain some PageRank. Considering this, what would be the best foot forward?
Thanks,
-
Thanks for the info, Susanta. After a quick check, I see that the sitemap.xml you submitted has only 130 URLs in it, while Google shows 230 pages indexed from your website (results omitted after 182). So, can you tell me how many pages there are in total on your website?
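One quick way to sanity-check a mismatch like this is to count the <loc> entries in the live sitemap and compare that against Google's indexed-page count. A minimal sketch using Python's standard library; the sitemap URL is hypothetical:

```python
# Count the <loc> entries in a live sitemap.xml so the total can be
# compared with the "pages indexed" figure Google reports.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Sitemap elements live in the sitemaps.org namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = tree.findall(".//sm:loc", ns)
print(f"{len(locs)} URLs listed in {SITEMAP_URL}")
```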
-
Hi Devanur,
Thanks for the help! Here's the URL of my website www.getyourtips.com
Here's a screenshot of the index status of the website on GWT: https://www.dropbox.com/s/nyrwjvvmt0novu7/Screenshot 2014-06-10 14.32.12.png
Here's a screenshot of the search queries page: https://www.dropbox.com/s/2js1oqtrtseg56g/Screenshot 2014-06-10 14.38.30.png
The website is a little over 2 months old.
We have submitted the Sitemap already.
We used to publish 4 posts per day around the launch of the website, but we have reduced that to just one post per day since last week.
I would like to know if there is anything more we should do, apart from submitting the site to different blog feeds, to get it indexed more frequently.
Thanks,
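One low-effort habit that may help a frequently updated site like this: re-ping Google whenever the sitemap changes, e.g. after each new post. A minimal sketch using Google's sitemap ping endpoint (available at the time of this thread); the sitemap URL is hypothetical:

```python
# Notify Google that the sitemap has been updated. The ping endpoint
# takes the sitemap's own URL as a query parameter.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(SITEMAP_URL)

with urllib.request.urlopen(ping_url) as resp:
    print("Ping returned HTTP", resp.status)
```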
-
Hi Susanta,
If I were you, I would focus on site performance. Having a fast, responsive site means you can expose more of your site and its content to Google during each crawl.
If you have all this great content, it might be a matter of making the site faster so that Google can index more each time.
Crawl rate and indexation are largely affected by your PageRank and authority with Google, so this does get better over time as your site improves in these areas.
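A rough way to see where performance work would pay off is to time the server response for a few key pages. A minimal sketch, assuming Python's standard library; the URLs are hypothetical:

```python
# Time full responses for a handful of pages. Consistently slow pages
# are candidates for caching or optimization, since a faster server
# lets Googlebot cover more pages per crawl.
import time
import urllib.request

pages = [
    "http://www.example.com/",  # hypothetical
    "http://www.example.com/category/tips/",
]

for url in pages:
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{url}: {elapsed_ms:.0f} ms")
```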
-
Hi Susanta,
If you can share the URL of your domain, we will be able to answer better, as the rate of indexing differs from site to site and depends on a lot of things: the website's internal linking structure, site architecture, your backlinks, and the list goes on.
A sitemap.xml is definitely a good way to give your website's indexing rate a boost, but the story does not end there, as there are a lot of other factors that come into play.
Best regards,
Devanur Rafi
Related Questions
-
Index a URL without directly linking to it?
Hi everyone, Here's a duplicate content challenge I'm facing. Let's assume that we sell brown, blue, white, and black 'Nike Shoes model 2017'. For technical reasons, we really need four URLs to properly show these variations on our website. We find substantial search volume for 'Nike Shoes model 2017', but none for any of the color variants. Would it be theoretically possible to show pages A, B, C, and D on the website and:
1. Give each page a canonical to page X, which is the 'default' page that we want to rank in Google (a product page with a color selector) but is not directly linked from the site.
2. Mention page X in the sitemap.xml (and not A, B, C, or D), so that the 'clean' URL gets indexed and the color variations do not?
In other words: is it possible to rank a page that is only discovered via the sitemap and canonicals?
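A sketch of the setup being described, with hypothetical URLs: each color variant declares a canonical to the 'default' page X, and only X is listed in the sitemap:

```python
# Each variant page (A, B, C, D) carries a canonical tag pointing at the
# default page X; the sitemap lists only X, so Google discovers X via
# the sitemap and consolidates the variants onto it via the canonicals.
DEFAULT_URL = "http://www.example.com/nike-shoes-model-2017/"  # page X
variant_urls = [DEFAULT_URL + color + "/" for color in ("brown", "blue", "white", "black")]

# Tag to place in the <head> of every variant page:
canonical_tag = f'<link rel="canonical" href="{DEFAULT_URL}">'
print(canonical_tag)

# The sitemap should contain only the canonical target, never the variants:
sitemap_urls = [DEFAULT_URL]
assert not any(u in sitemap_urls for u in variant_urls)
```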
Intermediate & Advanced SEO | Adriaan.Multiply
-
Http and https protocols being indexed for e-commerce website
Hi team, Our new e-commerce website has launched, and I've noticed both http and https protocols are being indexed: www.mountainjade.co.nz. Our old website was http, with only the necessary pages running https (cart, checkout, etc.). No https pages were indexed, and you couldn't access an https page if you manually typed it into the browser. We outrank our competition by a mile, so I'm treading carefully here and don't want to undo the progress we made on the old site. I have a few questions:
1. How exactly do we remove one protocol from the index? We are running on Drupal. We tried a hard redirect from https to http and excluded the relevant pages (cart, login, etc.) from the redirect, but found that you could still access https pages: if you were in the cart (https) and then pressed the back button on the browser, for example, you could browse the entire site on https.
2. Is the safer option to emulate what we had in place on the old website, i.e. http with only the necessary pages being https, rather than making the switch to sitewide https?
I've been struggling with this one, so any help would be much appreciated. Jake S
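Not Drupal-specific, but here is a sketch of the redirect logic being described: force http everywhere except a whitelist of pages that must stay on https. The domain and paths are hypothetical:

```python
# Decide, for any requested URL, whether to issue a 301 to the other
# protocol. Pages in HTTPS_ONLY_PATHS stay on https; everything else is
# forced back to http so only one protocol per page can be indexed.
from urllib.parse import urlsplit, urlunsplit

HTTPS_ONLY_PATHS = {"/cart", "/checkout", "/user/login"}  # hypothetical

def canonical_scheme_redirect(url: str):
    """Return a 301 target if the scheme is wrong for this path, else None."""
    parts = urlsplit(url)
    wanted = "https" if parts.path in HTTPS_ONLY_PATHS else "http"
    if parts.scheme == wanted:
        return None
    return urlunsplit((wanted, parts.netloc, parts.path, parts.query, parts.fragment))

print(canonical_scheme_redirect("https://www.example.com/products/jade"))
# -> http://www.example.com/products/jade  (serve as a 301)
print(canonical_scheme_redirect("https://www.example.com/cart"))
# -> None (already on the right protocol)
```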
Intermediate & Advanced SEO | Jacobsheehan
-
Google Search Console - Indexed Pages
I am performing a site audit and looking at the Index Status report in GSC, which shows a total of 17 URLs indexed. However, when I look at the Sitemap report in GSC, it shows 9,000 pages indexed. Also, when I perform a site: search on Google, I get 24,000 results. Can anyone help me explain these anomalies?
Intermediate & Advanced SEO | richdan
-
Better to 301 or de-index 403 pages
Google WMT recently found and called out a large number of old, unpublished pages as access denied errors. The pages are tagged "noindex, follow," yet these old pages are still in Google's index. At this point, would it be better to 301 all these pages, to submit an index removal request, or something else? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
Anchor text penalties and indexed links
Hi! I'm working on a site that got hit by a manual penalty some time ago. I got the penalty removed, cleaned up a bunch of links, and disavowed the rest. That was about six months ago. Rankings improved, but the big money terms still aren't doing great. I recently ran a Searchmetrics anchor text report, though, and it said that direct-match anchors still made up the largest part of the overall portfolio. However, when I started looking at individual links with direct anchors, nearly every one had been removed or disavowed. My question is: could an anchor text penalty still be in place because these removed links have not been re-indexed? If so, what are my options? We've waited for this to happen naturally, but it hasn't occurred after quite a few months. I could ping them - could this have any impact? Thanks!
Intermediate & Advanced SEO | Blink-SEO
-
Do I need to re-index a page after editing its URL?
Hi, I had to edit some of the URLs, but Google is still showing the old URLs in search results for certain keywords, and of course they return 404s. When I crawl with Screaming Frog, it still turns up the old URLs, showing 301s and 'page not found' errors. Why is that? And do I need to re-index the pages with the new URLs? Is 'Fetch as Google' enough to do that, or is there any other advice? Thanks a lot; I hope the topic will help someone else too. Dusan
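The usual fix here is a permanent (301) redirect from each old URL to its replacement, so Google transfers the old URL's signals instead of hitting a 404. A minimal sketch, shown with Flask purely for illustration; the paths are hypothetical:

```python
# Serve a 301 for every edited URL by looking the old path up in a map.
# 'Fetch as Google' can then speed up re-crawling of the new URLs.
from flask import Flask, redirect

app = Flask(__name__)

OLD_TO_NEW = {  # hypothetical old -> new paths
    "/old-post-url": "/new-post-url",
    "/another-old-url": "/another-new-url",
}

@app.route("/<path:old_path>")
def maybe_redirect(old_path):
    target = OLD_TO_NEW.get("/" + old_path)
    if target:
        return redirect(target, code=301)  # permanent redirect
    return "Not found", 404
```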
Intermediate & Advanced SEO | Chemometec
-
Recovering from index problem
Hi all. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (<cite>thewilddeckcompany.co.uk/index.php?id=13</cite>) This shows the issue pretty clearly. https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8 I've removed the link, redirected the bad URL, updated the site map and got some new links pointing at the site to resolve the problem. Yet almost two month later, the bad URL is still showing in the SERPs and the indexing problem is still there. Any ideas? I'm stumped!
Intermediate & Advanced SEO | | Blink-SEO0 -
Technical Automated Content - Indexing & Value
One of my clients provides some financial analysis tools which generate automated content on a daily basis for a set of financial derivatives. Basically, they try to estimate through technical means whether a particular share price is going up or down during the day, as well as its support and resistance levels. These tools are fairly popular with visitors; however, I'm not sure about the 'quality' of the content from Google's perspective. They keep an archive of these tools that tallies up to nearly 100 thousand pages, and what bothers me particularly is that the content between each of these varies only slightly. Textually, there are maybe 10-20 different phrases that describe the move for the day; the page structure is otherwise similar, except for the values expected to be reached each day. They believe it could be useful for users to access back-dated information to see what happened in the past. The main issue, however, is that there are currently no backlinks at all to any of these pages, and I assume Google could deem them 'shallow' pages that provide little content and become irrelevant as time passes. I'm also not sure if this could cause a duplicate content issue, though they already add a date to the title tags and to the content to differentiate them. I am not sure how I should handle these pages; is it possible to have Google prioritize the 'daily' published one? Say I published one today: if I searched for "Derivative Analysis", I would want to see the one dated today rather than the 'list view' or any older analysis.
Intermediate & Advanced SEO | jonmifsud