Googlebot takes 5 times longer to crawl each page
-
Hello All
From about mid-September my GWMT has shown that the average time to crawl a page on my site has shot up from an average of 130ms to an average of 700ms, with peaks at 4000ms.
I have checked my server error logs and found nothing there, and I have checked with the hosting company: there are no issues with the server or with other sites on the same server.
Two weeks after this my rankings fell by about 950 places for most of my keywords. I am really just trying to eliminate this as a possible cause of the ranking drops. Or was it the Panda / EMD algorithm updates that did it?
Many Thanks
Si
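For anyone wanting to go beyond the error log, a minimal sketch in Python of scanning the access log for 5xx responses (which don't always show up in the error log). This assumes the common Apache/Nginx combined log format, where the status code is the 9th whitespace-separated field; the log path in the example is hypothetical:

```python
import collections

def count_statuses(log_path):
    """Tally HTTP status codes in a combined-format access log.

    In the combined log format the status code is the 9th
    whitespace-separated field, e.g.:
    1.2.3.4 - - [01/Oct/2012:10:00:00 +0000] "GET / HTTP/1.1" 200 5123
    """
    counts = collections.Counter()
    with open(log_path) as fh:
        for line in fh:
            fields = line.split()
            if len(fields) > 8 and fields[8].isdigit():
                counts[fields[8]] += 1
    return counts

# Example: flag any 5xx responses (hypothetical log path).
# counts = count_statuses("/var/log/apache2/access.log")
# print({code: n for code, n in counts.items() if code.startswith("5")})
```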
-
Thank you for having a look
I made no structural changes around the time the issues started.
On the third graph in GWMT, yes, there was a spike in the time spent downloading, and it is still a lot higher than previously. I have added an image of it below.
There were two Google updates about two weeks later: the latest Panda and the new EMD update.
Most of the content has been written by myself from my own experience. There are some pages, the same as on other sites, that I am in the process of removing / changing.
Until 4 months ago the layout was in fixed-size nested tables; I am just about getting my head around CSS to try and drag the site into the 21st century.
-
Hi.
Based on the site size (number of pages) and format (code, elements and structure), two speed tests I just ran on it, and a trace-route (from Austria), it looks like you don't have any issues with it from a technical point of view.
One thing you still need to check is the time-spent-downloading-a-page graph (the third one) within GWMT. Did this spike at the same time the crawled-pages count went down?
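If you want to cross-check that graph from your own machine, here is a rough sketch using only the Python standard library. Bear in mind these numbers are only a proxy for what Googlebot sees from its own network, and the example URL is the poster's site:

```python
import time
import urllib.request

def time_download(url, timeout=15):
    """Return (ttfb_ms, total_ms): time to first byte and full download time."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)                      # first byte has arrived
        ttfb_ms = (time.monotonic() - start) * 1000
        resp.read()                       # rest of the body
    total_ms = (time.monotonic() - start) * 1000
    return ttfb_ms, total_ms

def summarize(samples_ms):
    """Average and peak, mirroring GWMT's 'time spent downloading' stats."""
    return sum(samples_ms) / len(samples_ms), max(samples_ms)

# Example (needs network access):
# _, total = time_download("http://www.growingyourownveg.com/")
# print("download took %.0f ms" % total)
```

Running `time_download` against a handful of pages several times a day and summarizing the samples gives you an independent trend line to compare against the GWMT graph.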
A few other questions you should consider:
-
did you make any changes - especially structural changes - around the same time you noticed the issues?
-
are there any public Google updates in the same timeframe as the issues you noticed? (you can check them here: http://www.seomoz.org/google-algorithm-change )
-
is your content duplicated? (duplicated with external sources, I mean - not internally)
Please don't get me wrong - I would be OK with the format of the site if it were very old, from before 2000. But the domain is from 2008 - you should get on track with current trends in layout, content format, and web site structure in general.
Hope it helps.
-
-
Hi
I am as sure as I can be, but not being a full expert on these things I may have missed something technical.
I have been making changes to the site since then, mainly to the CSS layout.
The site is www.growingyourownveg.com
Thanks
-
Hi,
As far as I know, a low crawl rate won't result in bad rankings, but bad rankings will result in a lower crawl rate.
If you are sure, and I mean really sure, that you don't have any technical issues on your side that could influence the crawl rate and possibly the rankings too, then you should consider that you may actually have a -950 filter causing your rankings to drop: Google doesn't consider your site an authority, and for this reason it won't crawl your site as often as it used to.
Can you share the URL of the site? Just to have a look and see whether, at first glance, there is any obvious reason for Google to dislike your site.
Cheers!