How can I Improve Loading Speed? - Parker Dubrule Lawyers
-
Parker Dubrule Lawyers' website at parkerdubrulelawyers.com seemed to be loading quite slowly this morning (>5 seconds). I added a lazy-load plugin, minified the JS and CSS, and made sure the images were optimized; all of this seemed to help and brought it down to under 2 seconds. We are looking at more reliable hosting options for our clients, ones that are inherently faster, possibly without these plugins being added to the mix. Does anyone have insight on a safe, secure, and fast hosting/server option to enhance the experience from the get-go? All of the websites that we build are in WordPress.
Your help is much appreciated! Thanks!
-
Thank you for such a detailed response!
-
Thank you for your insight!
-
Hosting & DNS
It looks like the DNS response time is all over the place. Sometimes it's acceptable at ~100ms, other times not so much. A better DNS provider would be worth looking into; Amazon Route 53 or Dyn are pretty good options.
For shared hosting, I can second SiteGround. It's a good host for lower budgets. DigitalOcean is a very solid and inexpensive VPS, but there will be less hand-holding. I plan on migrating to DO in the next week or so. My current host just removed sudo privileges from their VPS accounts. I know, right!?!
Sweet, sweet PHP 7 and Redis - here I come.
Things You Can Fix Immediately
Run the site through PageSpeed Insights. Make a punch list and go from there. There's also a download link for 'optimized' resources; usually I only take the images from that. More on that later.
One of the big ones is 'Remove Render-Blocking JavaScript'. The quick fix is moving the Web Font Loader script and the GA script to the footer. That gets you halfway there in a lot of instances.
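In WordPress that usually means enqueueing the script with the in-footer flag and hooking the analytics snippet to wp_footer. A minimal sketch for a theme's functions.php; the handle name, CDN URL, and function names are my own placeholders, so adjust to whatever your theme actually registers:

```php
<?php
// Minimal sketch for a theme's functions.php. The 'webfont-loader' handle and
// the CDN URL are assumptions; swap in whatever your theme actually uses.

function pd_webfont_loader_in_footer() {
    wp_enqueue_script(
        'webfont-loader',
        'https://ajax.googleapis.com/ajax/libs/webfont/1.6.26/webfont.js',
        array(),
        null,
        true // fifth argument: print this script in the footer
    );
}
add_action( 'wp_enqueue_scripts', 'pd_webfont_loader_in_footer' );

// Print the analytics snippet just before </body> instead of in <head>.
function pd_ga_in_footer() {
    ?>
    <script>
        /* your existing Google Analytics snippet goes here */
    </script>
    <?php
}
add_action( 'wp_footer', 'pd_ga_in_footer', 99 );
```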
Images
A couple of slider images are still over 200KB. If there's anything you can do to reduce that, do so. The PageSpeed Insights tool states that the home page has a couple of images that could be compressed further. Even though the savings are minuscule, they add up over requests.
Fonts
One of the better performance increases can be had with fonts. Again, move your Web Font Loader script to the footer. Consider using fewer character sets. Do you really need greek-ext, cyrillic or vietnamese character sets? If not, remove them.
Another fun one is using the preconnect tag. Here's a practical guide to web font performance from an author at CSS-Tricks. Just make sure to use fonts.gstatic instead of fonts.typonine in the code snippet. Here's a fairly detailed explanation of why you want to use preconnect, from Ilya Grigorik. (Seriously, follow that guy if you're not already doing so.)
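For a Google Fonts setup, you can add that hint from functions.php via the core wp_resource_hints filter (available since WP 4.6). A minimal sketch, with a placeholder function name:

```php
<?php
// Minimal sketch: emits <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin="anonymous">
// through the core wp_resource_hints filter (WP 4.6+).
function pd_font_preconnect( $urls, $relation_type ) {
    if ( 'preconnect' === $relation_type ) {
        $urls[] = array(
            'href'        => 'https://fonts.gstatic.com',
            'crossorigin' => 'anonymous', // font files are fetched in CORS mode
        );
    }
    return $urls;
}
add_filter( 'wp_resource_hints', 'pd_font_preconnect', 10, 2 );
```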
CSS, HTML & JavaScript
The site appears to have unminified CSS inlined in the head of the document. That's a lot of CSS, and it probably isn't all critical path. In order to render a page as fast as possible, you need to display visible content first. Drop your various style sheets into the critical path CSS generator.
You can input your newly generated critical path CSS into Autoptimize. It's a very handy plugin that minifies HTML, CSS and JavaScript. It will also combine your CSS and JavaScript files to reduce requests. Another handy feature is, you guessed it, Inline Critical Path CSS.
You will likely have to remove BWP Minify, as Autoptimize handles most - if not all - of those functions. More is not better in this instance. In fact, you should disable any caching plugin options that handle minification.
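If you want to see the pattern Autoptimize automates, or wire it up by hand without a plugin, here's a rough sketch for functions.php. The critical.css and full.min.css file names are hypothetical; generate the critical file with the generator mentioned above.

```php
<?php
// Rough sketch of the pattern Autoptimize automates: inline the critical-path
// CSS in <head>, then load the full stylesheet without blocking render.
// The critical.css and full.min.css file names below are hypothetical.

function pd_inline_critical_css() {
    $critical = get_stylesheet_directory() . '/css/critical.css';
    if ( file_exists( $critical ) ) {
        echo '<style id="critical-css">' . file_get_contents( $critical ) . '</style>';
    }
}
add_action( 'wp_head', 'pd_inline_critical_css', 1 );

function pd_defer_full_stylesheet() {
    $href = esc_url( get_stylesheet_directory_uri() . '/css/full.min.css' );
    // media="print" keeps the request from blocking render; onload swaps it to "all".
    echo '<link rel="stylesheet" href="' . $href . '" media="print" onload="this.media=\'all\'">';
    echo '<noscript><link rel="stylesheet" href="' . $href . '"></noscript>';
}
add_action( 'wp_head', 'pd_defer_full_stylesheet', 20 );
```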
Gzip Compression & Cache Expiration
It looks like cache expiration settings aren't set up for some basic MIME types (CSS, JPEG, etc.). Consider setting up a caching plugin, such as WP Super Cache or W3 Total Cache. Failing that, this is one of the better .htaccess settings repos.
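For reference, the kind of rules those repos bundle looks roughly like this. It's a trimmed sketch, not a complete config, and it assumes mod_deflate and mod_expires are enabled on your host:

```apache
# Gzip text-based assets
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>

# Far-future cache expiration for static files
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/jpeg             "access plus 1 year"
  ExpiresByType image/png              "access plus 1 year"
</IfModule>
```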
Edit: One of the issues involves query strings on static resources. Here's a good resource with a few options to handle that.
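If you'd rather strip those version query strings inside WordPress instead of rewriting URLs at the server, a minimal sketch using the core loader filters (the function name is just a placeholder):

```php
<?php
// Minimal sketch: strip the ?ver= query string WordPress appends to enqueued
// styles and scripts, so proxies and CDNs can cache them properly.
function pd_remove_ver_query_string( $src ) {
    if ( false !== strpos( $src, 'ver=' ) ) {
        $src = remove_query_arg( 'ver', $src );
    }
    return $src;
}
add_filter( 'style_loader_src', 'pd_remove_ver_query_string', 15 );
add_filter( 'script_loader_src', 'pd_remove_ver_query_string', 15 );
```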
As always, make these changes in a test environment first. And best of luck. You'll probably be happier with a lot of your projects.
-
I use iClickAndHost and SiteGround for shared hosting. I also use Amazon EC2, Linode, DigitalOcean, and Vultr as VPS providers. For a CDN, Amazon CloudFront and S3. Everything works perfectly.
But you should diagnose your hosting issues before considering a switch. It could be something temporary: a DDoS attack, hardware failure, network overload, etc.