Googlebot takes 5 times longer to crawl each page
-
Hello All
Since about mid-September, my GWMT has shown that the average time to crawl a page on my site has shot up from around 130ms to an average of 700ms, with peaks at 4000ms.
I have checked my server error logs and found nothing there, and I have checked with the hosting company: there are no issues with the server or with other sites on the same server.
Two weeks after this, my rankings fell by about 950 places for most of my keywords. I am really just trying to eliminate this as a possible cause of the ranking drops, or work out whether it was the Panda / EMD algorithm that did it.
Many Thanks
Si
-
Thank you for having a look
I made no structural changes around the time the issues started.
On the third graph in GWMT, yes, there was a spike in the time spent downloading, and it is still a lot higher than before. I have added an image of it below.
There were two Google updates about two weeks later: the latest Panda and the new EMD.
Most of the content was written by myself from my own experience. There are some pages, which I am in the process of removing or changing, that are the same as on other sites.
Until 4 months ago the layout was built with fixed-size nested tables. I am just about getting my head around CSS, to try to drag the site into the 21st century.
-
Hi.
Based on the site size (number of pages) and format (code, elements, and structure), plus two speed tests I just ran on it and a traceroute (from Austria), it looks like you don't have any issues with it from a technical point of view.
One thing you still need to check is the "time spent downloading a page" graph (the third one) within GWMT. Did this spike at the same time the number of pages crawled went down?
A few other questions you should consider:
-
Did you make any changes, especially structural changes, around the same time you noticed the issues?
-
Were there any public Google updates in the same timeframe as the issues you noticed? (You can check them here: http://www.seomoz.org/google-algorithm-change )
-
Is your content duplicated? (I mean against external sources, not internally.)
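Since only the error logs have been checked so far, it can also be worth measuring Googlebot's response times directly from the access logs. Below is a minimal sketch, assuming an Apache-style combined log with the response time in microseconds (`%D`) appended as the last field; the log format, field position, and sample lines are all assumptions, so adjust the regex to your own server configuration.

```python
import re
from statistics import mean

# Assumed format: combined log with the response time in microseconds
# (%D) appended as the last field. Adjust to your server's LogFormat.
LINE_RE = re.compile(r'"[^"]*Googlebot[^"]*" (?P<micros>\d+)$')

def googlebot_times_ms(lines):
    """Response times in ms for lines whose user agent mentions Googlebot."""
    return [int(m.group("micros")) / 1000.0
            for line in lines
            if (m := LINE_RE.search(line))]

sample = [
    '1.2.3.4 - - [01/Oct/2012:10:00:00 +0000] "GET /a HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1)" 131000',
    '5.6.7.8 - - [01/Oct/2012:10:00:05 +0000] "GET /a HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0" 90000',
]
times = googlebot_times_ms(sample)
if times:
    print(f"Googlebot avg: {mean(times):.0f} ms over {len(times)} request(s)")
# → Googlebot avg: 131 ms over 1 request(s)
```

If the averages here stay flat while GWMT's graph climbs, the slowdown is more likely on the path between Google and the server than in the server itself.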
Please don't get me wrong: I would be OK with the format of the site if it were very old, from before 2000. But the domain is from 2008, so you should get on track with current trends in layout, content format, and website structure in general.
Hope it helps.
-
-
Hi
I am as sure as I can be, but not being a full expert on these things, I may have missed something technical.
I have been making changes to the site since then, mainly to the CSS layout.
The site is www.growingyourownveg.com
Thanks
-
Hi,
As far as I know, a low crawl rate won't cause bad rankings, but bad rankings will result in a lower crawl rate.
If you are sure, and I mean really sure, that you don't have any technical issues on your side that could influence the crawl rate (and possibly also rankings), then you should consider that you may actually have a -950 filter causing your rankings to drop: Google doesn't consider your site an authority, and for this reason it won't crawl your site as often as it used to.
Can you share the URL of the site? Just to have a look and see if, at first glance, there is any obvious reason for Google to dislike your site.
Cheers !
Related Questions
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and of high quality, but it isn't seeing the desired results. It is an ecommerce site with a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.

The website has many individual products which have the same fabric and size options but different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:

A. Combine similar designs of a product into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate that content on other pages or try to find more to say about something there really isn't anything else to say about. However, this will remove between 50% and 70% of the pages on the website. We know the number of indexed pages is important to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.

B. Leave the product pages alone and try to write more valuable content for each product page, which will be difficult because there really isn't that much more to say, or more valuable ways to say it. This is the "safe" option, as our potential negative impact is reduced, but we won't necessarily see much positive trending either.

C. Test solution A on a small percentage of the product categories, watch the impact over the next several months, and make sitewide updates to the product pages if we see a positive impact, or revert to the old way if we see a negative one.

Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effect and we are seeing consistently dropping rankings. Any information would be greatly appreciated. Thank you,
Technical SEO | | Ed-iOVA0 -
Low page impressions
Hey there Moz geniuses! While checking my webmaster data I noticed that almost all my Google impressions are generated by the home page; most other content pages show virtually no impression data (<50, while the home page shows around 1,500, and a couple of pages are in the 150-200 range). The site has been up for about 8 months now. Traffic averages about 500 visitors, but I'm seeing very little entry other than through the home page. Checking the numbers: in the Sitemap section, 27 of 30 pages are indexed, and Webmaster Tools is not reporting errors. Webmaster keyword impressions are also extremely low: 164 keywords, with the highest impression count at 79 and dropping from there. Moz shows very few minor issues, although it says it crawled 10k pages? We only have 30 or so. The answer seems obvious: Google is not showing my content. The question is why, and what steps can I take to analyze this? Could there be some type of penalty? I welcome all your suggestions. The site is www.calibersi.com
Technical SEO | | VanadiumInteractive0 -
Can up a page
I do my best to optimize the on-page parameters of my page www.lkeria.com/AADL-logement-Algerie.php for the keyword "aadl", but I can't understand what I'm doing wrong (I disappeared from the results 2 months ago). The page is optimized (title, description, h1, h2, etc.) and has a few links with different anchors, but Google puts a spammy site, www[dot]aadl[dot]biz, in the top 3 rather than my page. Can you give me some advice to fix this issue? What am I doing wrong? Thanks in advance
Technical SEO | | lkeria0 -
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic. Each of these pages could use another several pages to go into deeper detail about the topic. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or one page that I would pass parameters to, with the page built on the fly in the code-behind? The drawback to the one page with parameters is that the URL would not be static, but the effort to implement it would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing a faux pas by unleashing so many links to my internal pages. I might also have to mix .aspx with .html because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
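For what it's worth, there is a common third way between these two options: keep the single dynamic page, but expose static-looking, parameter-free URLs through URL rewriting, so each subtopic still gets its own crawlable address. A hedged sketch for Apache's mod_rewrite, where `topic.php` and the `/notes/` path are hypothetical names (an ASP.NET site like the asker's would use the equivalent IIS URL Rewrite module instead):

```apache
RewriteEngine On

# Serve /notes/some-subtopic.html from the single dynamic template.
# The handler name and URL pattern are placeholders - use your own.
RewriteRule ^notes/([a-z0-9-]+)\.html$ /topic.php?id=$1 [L,QSA]
```

Visitors and crawlers only ever see the clean URL, while the server still builds the page on the fly from the parameter.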
Technical SEO | | Banknotes0 -
Home page indexed but not ranking...interior pages with thin content outrank home page??
I have a Joomla site with a home page that I can't get to rank for anything beyond the company name on Google; the site works fine on Bing and Yahoo. The interior pages will rank all day long, but the home page never shows up in the results. I have checked the page code in every tool I know of and have had no luck. By all accounts it should be good to go. Any thoughts/comments/help would be greatly appreciated. The site is http://www.selectivedesigns.com Thanks! Greg
Technical SEO | | DougHosmer0 -
Wordpress html page
Hi, we are designing a new agency site which contains more than 100 pages. Which URL style is best: example.com/about/ or example.com/about.html? example.com/service/ or example.com/service.html?
Technical SEO | | srinathk0 -
301 redirecting some pages directly, and the rest to a single page
I've read through the redirect guide here already but can't get this working in my .htaccess. I want to redirect some pages specifically (/contactinfo.html to the new /contact.php), and I want all other pages (not all of which have equivalent pages on the new site) to redirect to my new homepage (index.php). How can I set it up so that some specific pages redirect directly, and all others go to one page? I already have the specific oldpage.html -> newpage.php redirects in place; I just need to figure out the broad one for everything else.
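A hedged sketch of what that .htaccess could look like, assuming Apache with mod_rewrite enabled and using the page names given in the question; rule order matters, because the first matching rule wins:

```apache
RewriteEngine On

# Specific one-to-one redirects first.
RewriteRule ^contactinfo\.html$ /contact.php [R=301,L]

# Any remaining request for an old .html page falls back to the new
# homepage. Kept last so the specific rules above take precedence.
RewriteRule ^.+\.html$ /index.php [R=301,L]
```

Scoping the catch-all to .html keeps it from looping on the new .php pages; if the old site also had non-.html URLs, a `RewriteCond %{REQUEST_FILENAME} !-f` guard on the catch-all would be the safer pattern.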
Technical SEO | | RyanWhitney150 -
What is the largest page size a searchbot will crawl?
When setting up pagination, what should we limit the page size to? At what point will a searchbot stop crawling a particular page?
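No hard cutoff is given in this thread; Google's documentation has since stated that Googlebot fetches only the first 15 MB of a file, but treat the exact figure as an assumption and keep paginated pages far below it in any case. A minimal sketch of a size check, with the limit as a parameter:

```python
# Compare a rendered page's raw size against a crawl-size budget.
# The 15 MB default is an assumption based on Google's documented
# Googlebot fetch limit; pass your own limit if it changes.
def check_page_size(html: bytes, limit_bytes: int = 15 * 1024 * 1024):
    """Return (size_in_bytes, fits_within_limit) for a rendered page."""
    return len(html), len(html) <= limit_bytes

size, ok = check_page_size(b"<html>" + b"x" * 1024 + b"</html>")
print(size, ok)
# → 1037 True
```

In practice, keeping the number of items per paginated page modest matters more than approaching any hard byte limit.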
Technical SEO | | nicole.healthline0