Mysterious average page load time spikes since major redesign
-
Hello Moz Community,
About six months ago, we completely redesigned our heavily trafficked website. We updated the navigation, made the site responsive, and refreshed all of the site's content. We were hoping for a rankings boost from all the hard work we put in, but sadly our traffic began to decline steadily. We also noticed that although overall page load speeds were comparable before and after the redesign, when we compared them on an hourly basis we saw random spikes in average page load speed post-redesign.
Here is a screenshot of our analytics comparing our hourly average page load speeds pre- vs. post-redesign: https://screencast.com/t/8WQeyhquHN (after is in blue, before in orange).
We have spent around three months trying to figure out the underlying cause of the new load time spikes. Has anyone seen anything like this before? Does anyone have suggestions as to what might be causing the spikes? As far as we can tell, the spikes are indeed random and are not correlated with any particular time of day, our traffic levels, or any other activity on our end. Any help would be greatly appreciated!
Thanks,
Eric
-
Quick thing to check - have a look at your server logs. Random load time increases like these are typical of bot overloads, where spambots hit your pages in bursts, causing the server to overload and slow down (or even reset, as Vijay mentions).
Those bot hits aren't recorded in GA - you'll have to look at the actual server access logs to find them.
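If it's useful, here is a rough sketch (in Python, assuming the common Apache/Nginx "combined" log format - adjust the regex to whatever your server actually writes) of bucketing requests per hour per user-agent so bot bursts stand out:

```python
import re
from collections import Counter

# Matches the timestamp hour and trailing user-agent of a "combined"
# format access log line. This pattern is an assumption -- adapt it
# to your own server's log format.
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}):\d{2}:\d{2}[^\]]*\].*"([^"]*)"$')

def hourly_hits_by_agent(log_lines):
    """Count requests per (hour, user-agent) bucket."""
    buckets = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m:
            hour, agent = m.groups()
            buckets[(hour, agent)] += 1
    return buckets

def suspicious_bursts(buckets, threshold=1000):
    """Return only the buckets whose hit count reaches a burst threshold."""
    return {k: v for k, v in buckets.items() if v >= threshold}
```

Sort the burst buckets by count and the hours with spikes in your analytics chart should line up with the hours a single agent hammered the server.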
Hope that helps?
Paul
-
Hi Eric,
Thanks for your response. Please do post an update with the final solution, for the benefit of all.
Let me know if you have further queries.
Regards,
Vijay
-
We have not detected anything wrong with our site from a user perspective; that is what is so frustrating. Thanks for your time and response!
-
Thank you, Vijay. I am having our developer take a look at all of our scripts.
-
Your page seems to load fine. Have you personally seen the website load badly at the times the chart and analytics indicate? I don't think there is anything wrong with your site, so if the chart is accurate, it can only be that the web host is dropping the ball.
-
Hi Eric,
We faced a similar problem with one of our clients. In the end, it was some scripts (both front-end and back-end) that were not terminating properly and were overloading the server. The script overload meant server responses became slow until resources were exhausted and reset by the server's own recovery mechanism. In short, it might be down to some bad scripting code.
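To illustrate (a minimal Python sketch, not our client's actual code): one defensive pattern is to run back-end helper scripts with a hard timeout, so a hung script gets killed instead of holding server resources until the reset kicks in.

```python
import subprocess
import sys

def run_with_timeout(cmd, timeout_sec=30):
    """Run a helper script, killing it if it fails to terminate in time.

    Returns (finished, output). finished is False when the process had
    to be killed -- exactly the "script never terminates and slowly
    exhausts the server" failure mode described above.
    """
    try:
        result = subprocess.run(cmd, capture_output=True, text=True,
                                timeout=timeout_sec)
        return True, result.stdout
    except subprocess.TimeoutExpired:
        return False, ""

# Example: a child process that sleeps longer than the timeout allows
# gets terminated instead of lingering.
finished, _ = run_with_timeout(
    [sys.executable, "-c", "import time; time.sleep(10)"], timeout_sec=1)
```

Logging every timeout hit also gives you a direct trail from a load spike back to the script that caused it.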
I hope this helps.
Regards,
Vijay
Related Questions
-
Is correcting missing meta description tags a good use of time?
My modest website (shew-design.com) has turned up nearly sixty crawl errors. Almost all of them are missing meta description tags. One friend who knows SEO better than I do says that adding meta descriptions to EVERY page is not a good use of time. I'm just getting started in being serious about applying SEO to our site, and I want to make sure I'm making the best use of my time. The other error I'm getting is duplicate page names within different directories (e.g. "getting started" for branding and "getting started" for web). Is this a huge priority? Would welcome your feedback.
Technical SEO | Eric_Shew -
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one of the issues I've spotted is the vast difference between the number of pages crawled by Googlebot according to our log files and the number of pages indexed in GWT. Consider: number of pages crawled per the log files: 2,993. Crawl frequency (i.e. the number of times those pages were crawled): 61,438. Number of pages indexed per GWT: 17,182,818 (yes, that's right: more than 17 million pages). We have a bunch of XML sitemaps (around 350) linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
Technical SEO | ufmedia -
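A rough way to compute the "pages crawled vs. crawl frequency" split from raw logs (a Python sketch, assuming Googlebot identifies itself in the user-agent string; verify the hits via Google's published bot-verification process before trusting them):

```python
from collections import Counter

def googlebot_crawl_stats(log_lines):
    """Split Googlebot hits into unique URLs crawled vs. total crawl frequency.

    Assumes common log format, where the request line is the first
    double-quoted field, e.g. "GET /page HTTP/1.1".
    """
    url_hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()  # e.g. ['GET', '/page', 'HTTP/1.1']
        if len(request) >= 2:
            url_hits[request[1]] += 1
    # (distinct pages crawled, total times any page was crawled)
    return len(url_hits), sum(url_hits.values())
```

On the numbers above this would return something like (2993, 61438), which makes the gap against the 17-million indexed figure easy to track over time.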
The Mysterious Case of Pagination and Canonical Tags
Hey guys, My head explodes when I think of this problem, so I will leave it to you guys to find a solution... My root domain (xxx.com) runs on the WordPress platform, and I use the Yoast SEO plugin. The second page of the root domain -- page/2/ -- has been canonicalized to itself: page/2/ points to page/2/, for example. The page/2/ and remaining pages also have rel prev and rel next tags. I have also added "noindex,follow" to page/2/ and beyond -- Yoast does this automatically. Note: the Yoast plugin also adds the canonical to page/2/, page/3/, etc. automatically. The same is the case with category pages and tag pages. Oh, and the author pages too -- they all have self-canonicalization, rel prev & rel next tags, and have been "noindex, follow"ed. Problem: am I doing this the way it should be done? I asked a Google Webmaster employee about rel next and prev tags, and this is what she said: "We do not recommend noindexing later pages, nor rel="canonical"izing everything to the first page." (My bad: last year I was canonicalizing pages to the first page.) One popular blog, a competitor, uses none of these tags, yet they rank higher. Others following this format have been hit with every kind of Google algorithm I could think of. I want to leave it to Google to decide what's better, but then again, the Yoast SEO plugin rules my blog -- okay, let's say I am a bad coder. Any help, suggestions, and thoughts are highly appreciated. 🙂 Update 1: Paginated pages -- including category pages and tag pages -- have unique snippets; no full-length posts. Thought I'd make that clear.
Technical SEO | sidstar -
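For anyone puzzling over the same setup, the tags being discussed look roughly like this on page/2/ (an illustrative fragment with xxx.com standing in for the real domain; note that the quoted Google advice above argues against the noindex):

```html
<!-- Hypothetical <head> markup for xxx.com/page/2/ -->
<link rel="canonical" href="http://xxx.com/page/2/" />
<link rel="prev" href="http://xxx.com/" />
<link rel="next" href="http://xxx.com/page/3/" />
<meta name="robots" content="noindex,follow" />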
Website redesign launch
Hello everyone, I am in the process of having my consulting website redesigned and have a question about how this may impact SEO. I will be using the same URL as before, simply replacing the old website with a new one. Obviously the URL structure will change slightly since I am changing navigation names, and page titles will also change. Do I need to do anything special to ensure that all of the pages from the old website are redirected to the new website? For example, should I set up a page-level redirect for each page that remains the same, so that the old "services" page points to the new "services" page? Or can I simply do a redirect at the index page level? Thank you in advance for any advice! Best, Linda
Technical SEO | LindaSchumacher -
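A rough illustration of the page-level option, assuming an Apache host and made-up paths (each old URL gets its own permanent redirect, which preserves per-page link equity in a way a single index-level redirect cannot):

```apache
# Hypothetical .htaccess rules -- substitute the real old and new paths.
# One permanent (301) redirect per page whose URL changed:
Redirect 301 /our-services.html /services/
Redirect 301 /about-us.html /about/
```

Pages whose URLs are unchanged need no rule at all; only the moved ones do.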
Anyone else seeing increased duplication of domains since Penguin?
Hi, Is it just me, or are the Google SERPs showing more duplication of domains since the Penguin update? As an example, if I search for "business christmas cards" on google.co.uk, then results 2, 3 and 17 are from the same domain. Similarly, results 4, 20, 21 and 22 are from the same domain. All results are "reasonable" in that they are designed to catch traffic for variations on this term, BUT I'm sure Google used to filter out this duplication pre-Penguin. Am I imagining this increased duplication of domains? Gary
Technical SEO | gtrotter666 -
Open Site Explorer: possibility to select a time period?
Is it possible to add a time period to Open Site Explorer? E.g., the external links from other websites have grown very fast since February. It would be nice to have a feature that makes it possible to select a time period (from February to today). Without this function it is only possible to see ALL links across all months and years; I would like to see only the external links since, e.g., February. Thanks in advance.
Technical SEO | TeunTibaco -
Time on site
From what I understand, if you search for a keyword, say "blue widgets", click on a result, spend 10 seconds there, and then go back to Google and click on a different result, Google will track that first result as being not very relevant. What I don't understand is what happens when (and this happens all the time; I did it today) you click on a result, go to that page, find it (not?) relevant, and then get distracted: a phone call, or someone calls you into another room in the office. You end up accidentally leaving the tab open all day long and never go back to the Google search. So what is your time on site to Google? Infinity? There must be an upper cap here; at some point they must say, OK, the user is gone, time on site = our maximum = 5 minutes?!? Get me? Any insight?
Technical SEO | adriandg -
Good technical parameters, worse load time
I recently created a page and added expires headers, unconfigured ETags, and gzip to the .htaccess code, and just after that, according to Pingdom Tools, my page load time doubled, although my YSlow points went from 78 to 92. I always get a little bit lost with these technical issues. I mean, obviously a site should not produce worse results after adding these parameters, and this increase in page load time should rather be due to bandwidth usage. I suppose I should leave this stuff in the .htaccess. So what is an accurate way to know whether you have made a real improvement to your site or your load time has really gone up? This question is even more relevant with CSS sprites, as I often read that spriting every picture is sometimes a waste of resources. How can you decide when to stop?
Technical SEO | sesertin
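For context on the .htaccess changes mentioned above, a typical expires/gzip/ETag block looks roughly like this (an illustrative Apache fragment, not the poster's actual file):

```apache
<IfModule mod_deflate.c>
  # gzip text responses on the fly -- this costs CPU per request,
  # which is one way a better YSlow score can coexist with slower pages
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Far-future expiry for static assets so repeat visitors hit cache
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>

# "Unconfigured" ETags are usually removed outright:
FileETag None
```

As for measuring real improvement: first-view vs. repeat-view timings (e.g. in WebPageTest) separate what these headers actually change, since expires headers only help repeat visits while gzip affects every request.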