How to increase the crawl rate?
-
Hello,
Our site was hosted in North America and Google was crawling it reasonably fast. Since our traffic is mostly from India, we moved the site to India, and now Google's crawling is terribly slow. Is there any way to fix the crawl rate? (We have already increased the crawl rate in GWT.)
-
We have already done that in Webmaster Tools, but there is no change in the indexing rate. We would like to know whether the hosting change from the US to India is affecting us and, if so, whether there is a way to fix it.
-
You can change the crawl rate in Webmaster Tools:
http://www.google.com/support/webmasters/bin/answer.py?answer=48620
Good luck!
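One thing worth checking: Googlebot throttles itself on slow-responding servers, so higher latency after a hosting move can reduce the crawl rate on its own, regardless of the GWT setting. A quick diagnostic is to average the response times Googlebot actually sees in your access logs. A minimal sketch, assuming an Apache combined log with the response time in microseconds appended via `%D` (adjust the regex to your own log format):

```python
import re

# Assumed format: Apache "combined" log with response time in
# microseconds appended as the last field via %D.
LINE_RE = re.compile(r'"[^"]*Googlebot[^"]*"\s+(\d+)\s*$')

def googlebot_avg_response_ms(log_lines):
    """Average response time (ms) of requests served to Googlebot."""
    times = [int(m.group(1)) / 1000
             for line in log_lines
             if (m := LINE_RE.search(line))]
    return sum(times) / len(times) if times else None
```

If the average is much higher than it was on the old host, a faster server or a CDN is a likelier fix than the crawl-rate setting.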
Related Questions
-
Google Webmaster Tools is not crawling links and the site cache still shows an old date
Hi guys, I have been trying to get my page indexed in Google with a new title and description, but it is not getting indexed. I have checked in many tools but found nothing useful. Can you please tell me what the issue could be? Also, Google Webmaster Tools is not crawling the links I have built so far; a few links are indexed but the others are not. Why is this happening? My URL is: https://www.paydaysunny.com. Thanks
Technical SEO | | ksmith880 -
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one of the issues I've spotted is the vast difference between the number of pages crawled by the Googlebot according to our log files versus the number of pages indexed in GWT. Consider: Number of pages crawled per log files: 2993 Crawl frequency (i.e. number of times those pages were crawled): 61438 Number of pages indexed by GWT: 17,182,818 (yes, that's right - more than 17 million pages) We have a bunch of XML sitemaps (around 350) that are linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
Technical SEO | | ufmedia0 -
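One way to firm up the log-file side of that comparison is to count unique URLs versus total Googlebot fetches directly from the logs, mirroring the 2,993 vs. 61,438 figures above. A rough sketch, assuming a standard combined log format (the regex may need adjusting for your logs), and noting that GWT's indexed count also includes URLs Google discovered via the ~350 sitemaps that may never appear in a short log window:

```python
import re
from collections import Counter

# Assumed Apache/Nginx combined log format; adjust if yours differs.
REQ_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*"')

def googlebot_crawl_stats(log_lines):
    """Return (unique_pages, total_fetches) for Googlebot requests."""
    hits = Counter(
        m.group(1)
        for line in log_lines
        if "Googlebot" in line and (m := REQ_RE.search(line))
    )
    return len(hits), sum(hits.values())
```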
Our stage site got crawled and we got an unnatural inbound links warning. What now?
live site: www.mybarnwoodframes.com stage site: www.methodseo.net We recently finished a redesign of our site to improve our navigation. Our developer insisted on hosting the stage site on her own server with a separate domain while she worked on it. However, somebody left the site turned on one day and Google crawled the entire thing. Now we have 4,320 pages of 100% identical duplicate content with this other site. We were upset, but didn't think it would have any serious repercussions until we got two orders from customers from the stage site one day. It turns out the second site was ranking pretty decently for a duplicate site with 0 links; the worst was yet to come, however. During the 3 months of the redesign our rankings on our live site dropped and we suffered a 60% drop in organic search traffic. On May 22, 2013, the day of the Penguin 2.0 release, we received an unnatural inbound links warning. Google Webmaster Tools shows 4,320 of our 8,000 links coming from the stage-site domain to our live site; we figure that was the cause of the warning. We finished the redesign around May 14th and took down the stage site, but it is still showing up in the search results and the 4,320 links are still showing in Webmaster Tools.
1. Are we correct to assume that it was the stage site that caused the unnatural links warning?
2. Do you think the stage site caused the drop in traffic? After doing a link audit I can't find any large amount of horrendously bad links coming to the site.
3. Now that the stage site has been taken down, how do we get it out of Google's indexes? Will it drop out over time, or do we need to do something on our end for it to be delisted?
4. Once it's delisted, the links coming from it should go away; in the meantime, should we disavow all of the links from the stage site? Do we need to file a reconsideration request, or should we just be patient and let them go away naturally?
5. Do you think that our rankings will ever recover?
Technical SEO | | gallreddy0 -
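On question 4: if you do decide to disavow, a single `domain:` line covers all 4,320 stage-site links at once. A minimal sketch of generating the file from a list of offending URLs (the disavow file format itself is Google's; treat this generator as illustrative):

```python
from urllib.parse import urlparse

def build_disavow(stage_urls):
    """Build disavow-file lines covering every host in stage_urls.

    A domain: entry disavows all links from that host at once,
    so thousands of stage-site links collapse into one line.
    """
    domains = sorted({urlparse(u).netloc for u in stage_urls})
    return "\n".join(f"domain:{d}" for d in domains)
```

On question 3, the faster route out of the index is usually the Remove URLs tool in Webmaster Tools combined with the stage site returning 404/410 (or requiring authentication), rather than waiting for natural recrawls.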
SEOmoz Cannot Crawl My Site
Hello there, SEOmoz cannot crawl my site. It's been 3 days now and not a single page has been crawled. I deleted the campaign and tried again; still no crawl, not a single page. Any solutions?
Technical SEO | | ExpertSolutions0 -
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last months we introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, etc. All signs (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
Technical SEO | | TruvoDirectories0 -
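At this scale the mechanics of the sitemaps matter too: the sitemaps protocol caps a single file at 50,000 URLs, so large page sets are normally split into chunks tied together by a sitemap index file. A minimal sketch (the `sitemap-N.xml` file names are hypothetical):

```python
# The sitemaps protocol limits one file to 50,000 URLs,
# so 150k+ pages need several files plus an index.
MAX_URLS = 50_000

def chunk_urls(urls, size=MAX_URLS):
    """Split a URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(base, n_chunks):
    """Build a sitemap index referencing n_chunks child sitemaps."""
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i}.xml</loc></sitemap>"
        for i in range(n_chunks))
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</sitemapindex>")
```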
Site maintenance and crawling
Hey all, Rarely, but sometimes, we need to take our site down for server maintenance, upgrades, or various other system/network reasons. More often than not these downtimes are avoidable and we can redirect or eliminate the client-side downtime. We have a 'down for maintenance - be back soon' page that is client facing, and outages are often no more than an hour, tops. My question is: if the site is crawled by Bing/Google while it is down, what is the best way of ensuring the indexed links are not refreshed with this maintenance content (i.e. this is what the pages look like now, so this is what the SE will index)? I was thinking of adding a no-crawl directive to robots.txt for the period of downtime and removing it once back up, but could this affect results as well?
Technical SEO | | Daylan1 -
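For planned windows like this, the usual approach is to serve the maintenance page with HTTP 503 plus a Retry-After header rather than a 200 or a robots.txt block: crawlers treat 503 as temporary and come back later instead of indexing the maintenance content (a robots.txt block risks URLs being dropped instead). A minimal WSGI sketch, assuming a one-hour window:

```python
def maintenance_app(environ, start_response):
    """Minimal WSGI handler for planned downtime.

    A 503 with Retry-After tells crawlers the outage is temporary,
    so they retry later instead of replacing indexed pages with
    the 'be back soon' content (as a 200 response would risk).
    """
    body = b"Down for maintenance - be back soon."
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/plain"),
        ("Retry-After", "3600"),  # hint: try again in an hour
        ("Content-Length", str(len(body))),
    ])
    return [body]
```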
At what point is the canonical tag crawled
Do search engines (specifically Google) crawl the URL in the canonical tag as the page loads, or do they load the whole page before crawling it? Thanks,
Technical SEO | | ao.com0 -
Reading Crawl Diagnostics and Taking Action on results
My site crawl diagnostics are showing a high number of duplicate page titles and duplicate content. When I look at the flagged pages, many errors are simply listed from multiple pages of product category search results. This looks pretty normal to me, and I am at a loss to understand how to fix this situation. Can I talk with someone? Thanks, Gary
Technical SEO | | GaryQ0
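When the duplicates come from paginated or sorted variants of the same category page, one common fix is to have every variant declare the base URL as its canonical. A sketch, with hypothetical parameter names that you would replace with whatever your category pages actually use:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameter names -- replace with the ones your
# category/search result pages actually use.
PAGINATION_PARAMS = {"page", "sort", "order"}

def canonical_url(url):
    """Strip pagination/sort parameters so every variant of a
    category page points at the same canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PAGINATION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def canonical_tag(url):
    """Render the <link> element for the page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'
```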