Crawl efficiency - Page indexed after one minute!
-
Hey guys,
A site has 5+ million pages indexed and adds about 300 new pages a day. I hear a lot that at this scale it's all about efficient crawlability. The pages of this site get indexed one minute after they go online.
1) Does this mean the site is already being crawled efficiently and there isn't much else to do about it?
2) By increasing crawl efficiency, should I expect Google to crawl my site less (using less of my bandwidth for the same amount of crawling) or to crawl my site more often?
Thanks
-
This is a complicated question that I can't give a simple answer to, as every site is set up differently and has its own challenges. You will likely use a variety of the techniques mentioned in my last paragraph above. Good luck.
-
Thanks Anthony,
Your explanation was very helpful.
Assuming that 3 million of my 5 million pages are not that important for Google to crawl or index, what would be the best way to optimize crawl efficiency relative to the number of pages?
Just noindexing 3 million pages of the site seems like it could be a risky move.
Perhaps robots.txt, but that would not de-index the existing pages.
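To make that concrete, the noindex route would mean something like this in the head of each of those 3 million pages (just an illustration):

```html
<!-- The page has to stay crawlable (not blocked in robots.txt) until
     Googlebot has re-crawled it and seen this tag; otherwise the already
     indexed URLs never get dropped from the index. -->
<meta name="robots" content="noindex, follow">
```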
-
Crawl efficiency isn't exactly the same as indexation speed. It is normal for a new page to be indexed quickly; often it is linked to from the blog home page, shared on social networks, etc.
Crawl efficiency has a lot to do with making sure your most important pages are crawled as frequently as possible. Let's use the example of your site with 5,000,000 pages indexed. Perhaps there are 100,000 of those pages that are extremely important for your website. Your top categories, all of your products, your content, etc.
Then you are left with 4,900,000 pages that are not that important, but are needed for the functionality of your website (pagination, filtering, sorting, etc.). You have to determine: is it a good thing that Google has 5 million pages of your site indexed? Do you want Google regularly crawling those 4,900,000 pages, potentially at the expense of your more important pages?
Next, you check your Google Webmaster Tools and see that Google is crawling about 130,000 pages/day on your site. At that rate, it would take Google about 38 days (over a month) to crawl your entire site. Of course, it doesn't actually work that way - Google will crawl your site in a logical manner, crawling the pages with high authority (well linked to internally/externally) much more often. The point is, you can see that not all of your pages are being crawled every day. You want your best content crawled as frequently as possible.
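If you want to sanity-check that math and see where Googlebot actually spends its crawl, a rough sketch along these lines can help. The log path, log format, and user-agent filter are assumptions - adapt them to your own server, and verify Googlebot by IP rather than user-agent string before trusting the numbers:

```python
import re
from collections import Counter

# Back-of-the-envelope check on the figures above.
total_pages = 5_000_000
crawled_per_day = 130_000
print(f"Full crawl at that rate: ~{total_pages / crawled_per_day:.0f} days")  # ~38

# Tally which sections of the site Googlebot actually requests, from a raw
# access log in common log format (hypothetical path).
log_path = "/var/log/nginx/access.log"
request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

section_hits = Counter()
with open(log_path, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:  # naive filter; verify by IP in practice
            continue
        match = request_re.search(line)
        if not match:
            continue
        path_parts = [p for p in match.group(1).split("?")[0].split("/") if p]
        section = path_parts[0] if path_parts else "(homepage)"
        section_hits[section] += 1

for section, hits in section_hits.most_common(10):
    print(f"/{section}  {hits} Googlebot requests")
```

If your important sections aren't near the top of that list, that's where the crawl optimization work should start.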
"To be more blunt, if a page hasn't been crawled recently, it won't rank well." This quote is taken from one of my favorite resources on this topic, is this post by AJ Kohn. http://www.blindfiveyearold.com/crawl-optimization
Crawl efficiency means guiding the search spiders to your best content and helping them learn which types of pages they can ignore. You do this primarily through site structure, internal linking, robots.txt, the nofollow attribute, and parameter handling in Google Webmaster Tools.
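As a rough illustration of the robots.txt and parameter side of that list - the patterns below are made up, so map them to your own sort/filter parameters (Google honors the * wildcard here, but test changes before rolling them out):

```
# robots.txt - example patterns only
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?filter=
Disallow: /*&filter=

Sitemap: https://www.example.com/sitemap.xml
```

Keep in mind that robots.txt only stops crawling; pages already in the index stay there until they are noindexed or removed.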
-
You can also let Google know about a large batch of new pages through a sitemap. A sitemap is a single file that can be parsed to produce a large list of links, and Google can discover new pages by comparing that list with the URLs it already knows about.
Here's an intro link that covers the sitemap: http://blog.kissmetrics.com/get-google-to-index/
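Roughly, the file looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/products/widget-123</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post</loc>
    <lastmod>2014-06-02</lastmod>
  </url>
</urlset>
```

A single sitemap file is capped at 50,000 URLs, so a site with millions of pages would normally submit a sitemap index file that points to many smaller sitemaps.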