If Google already knows about our sitemaps and crawls them on a daily basis, why should we use the HTTP ping and/or list the index files in our robots.txt?
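For context, here's what I mean by the two mechanisms (example.com stands in for our real domain): the ping is just an HTTP GET against Google's ping endpoint with the sitemap URL percent-encoded, and the robots.txt option is a one-line directive, something like:

    http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap_index.xml

    Sitemap: http://www.example.com/sitemap_index.xml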
- Is there a benefit (e.g. improved indexability) to both pinging and listing the index files in robots.txt?
- Is there any benefit to listing the index sitemaps in robots.txt if we're already pinging?
- If we provide an accurate <lastmod> date (see the snippet below), will there be any difference in indexing rates between pinging and the normal daily crawl we get today?
- Do we need to do all of the above to cover our bases?
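By a "decent <lastmod>" I mean entries like this in our sitemap index (file name and date are just placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
        <lastmod>2014-06-01T10:30:00+00:00</lastmod>
      </sitemap>
    </sitemapindex>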
thanks
Marika