How can I Improve Loading Speed? - Parker Dubrule Lawyers
-
Parker Dubrule Lawyers' website at parkerdubrulelawyers.com seemed to be loading quite slowly this morning (>5 seconds). I added a lazy-load plugin, minified the JS and CSS, and made sure the images were optimized; all of this seemed to help and brought it down to under 2 seconds. We are looking at more reliable hosting options for our clients, ones that are inherently faster, possibly without these plugins being added to the mix. Does anyone have insight on a safe, secure, and fast hosting/server option to enhance the experience from the get-go? All of the websites that we build are in WordPress.
Your help is much appreciated! Thanks!
-
Thank you for such a detailed response!
-
Thank you for your insight!
-
Hosting & DNS
It looks like the DNS response is all over the place. Sometimes it's acceptable... at ~100ms, other times not so much. A better DNS provider would be worth looking into. Amazon Route 53 or Dyn are pretty good options.
For shared hosting, I can second SiteGround. It's a solid host for lower budgets. DigitalOcean is a very solid and inexpensive VPS, but there will be less hand holding. I plan on migrating to DO in the next week or so. My current host just removed sudo privileges from their VPS accounts. I know, right!?!
Sweet, sweet PHP 7 and Redis - here I come.
Things You Can Fix Immediately
Run the site through PageSpeed Insights. Make a punch list and go from there. There's also a download link for 'optimized' resources; usually I only take the images from that. More on that later.
One of the big ones is 'Remove Render-Blocking JavaScript'. The quick fix is moving the Web Font Loader script and the GA script to the footer. That gets you halfway there, in a lot of instances.
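In WordPress, the clean way to do that is to re-register the script with the `$in_footer` flag set to `true`. A minimal sketch for a theme's functions.php — the handle and file path here are placeholders, so swap in whatever your theme or plugin actually registers:

```php
// functions.php - move a non-critical script from the head to the footer.
// 'webfont-loader' and the /js/ path are placeholders, not the site's real handles.
add_action( 'wp_enqueue_scripts', function () {
    wp_deregister_script( 'webfont-loader' );
    wp_enqueue_script(
        'webfont-loader',
        get_template_directory_uri() . '/js/webfont-loader.js',
        array(), // no dependencies
        null,    // no version query string appended
        true     // true = print the script in the footer, not the head
    );
}, 20 ); // priority 20, so it runs after the original registration
```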
Images
A couple of slider images are still over 200KB. If there's anything you can do to reduce that, do so. The PageSpeed Insights tool states that the home page has a couple of images that could be compressed further. Even though the savings are minuscule, it adds up over requests.
Fonts
One of the better performance increases can be had with fonts. Again, move your Web Font Loader script to the footer. Consider using fewer character sets. Do you really need greek-ext, cyrillic or vietnamese character sets? If not, remove them.
Another fun one is using the preconnect tag. Here's a practical guide to web font performance from an author at CSS-Tricks. Just make sure to use fonts.gstatic, instead of fonts.typonine in the code snippet. Here's a fairly detailed reason why you want to use preconnect, from Ilya Grigorik. (Seriously, follow that guy if you're not already doing so.)
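Putting the two font tips together, the head of the page ends up looking roughly like this (a sketch — the font family and subset values are examples, not the site's actual fonts):

```html
<!-- Warm up the connection to the font CDN before the CSS requests anything -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

<!-- Request only the character sets you actually use -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css?family=Open+Sans:400,700&subset=latin">
```

The preconnect saves the DNS lookup, TCP handshake, and TLS negotiation that would otherwise happen only once the browser discovers the font files inside the CSS.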
CSS, HTML & JavaScript
The site appears to have unminified CSS inlined in the head of the document. That's a lot of CSS, and it probably isn't all critical path. In order to render a page as fast as possible, you need to display visible content first. Drop your various style sheets into the critical path CSS generator.
You can input your newly generated critical path CSS into Autoptimize. It's a very handy plugin that minifies HTML, CSS and JavaScript. It will also combine your CSS & JavaScript to reduce requests. Another handy feature is, you guessed it, an Inline Critical Path CSS option.
You will likely have to remove BWP Minify, as Autoptimize handles most - if not all - of those functions. More is not better, in this instance. In fact, you should disable any caching plugin options which handle minification.
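For reference, the inline-critical-plus-deferred-stylesheet pattern described above ends up looking roughly like this (placeholder selectors and paths, not the site's actual CSS):

```html
<head>
  <style>
    /* Critical-path CSS only: the handful of rules needed to paint
       above-the-fold content. Placeholder rules shown here. */
    body { margin: 0; font-family: sans-serif; }
    .site-header { height: 80px; background: #fff; }
  </style>

  <!-- Load the full stylesheet without blocking first paint.
       The media="print" + onload swap is a common deferral trick. -->
  <link rel="stylesheet" href="/wp-content/themes/your-theme/style.css"
        media="print" onload="this.media='all'">
</head>
```

Autoptimize automates this for you; the sketch is just what the output looks like.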
Gzip Compression & Cache Expiration
It looks like cache expiration settings aren't set up for some basic MIME types (CSS, JPEG, etc.). Consider setting up a caching plugin, such as Super Cache or Total Cache. Failing that, this is one of the better htaccess settings repos.
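If you'd rather handle it directly, a minimal .htaccess sketch covers both cache expiration and gzip — these are standard mod_expires/mod_deflate directives, but adjust the lifetimes and MIME types to your setup:

```apache
# Cache expiration for static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType text/css   "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Gzip compression for text-based responses (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```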
***Edit: One of the issues involves query strings in static resources. Here's a good resource, with a few options to handle that.
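One common option for WordPress is filtering the `?ver=` query strings off enqueued asset URLs so proxies and CDNs can cache them. A sketch for functions.php (the function name is made up; the filters are WordPress's own):

```php
// Strip ?ver= query strings from enqueued scripts and styles so
// intermediary caches treat them as plain static files.
function pd_remove_version_query( $src ) {
    if ( strpos( $src, 'ver=' ) !== false ) {
        $src = remove_query_arg( 'ver', $src );
    }
    return $src;
}
add_filter( 'script_loader_src', 'pd_remove_version_query', 15 );
add_filter( 'style_loader_src',  'pd_remove_version_query', 15 );
```

Note this trades cache-busting for cacheability, so you'll want to purge caches manually after updates.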
As always, make these changes in a test environment first. Best of luck! These fixes should pay off across a lot of your projects.
-
I use iClickAndHost and SiteGround for shared hosting, and Amazon EC2, Linode, DigitalOcean, and Vultr as VPS providers. For a CDN: Amazon CloudFront and S3. Everything works perfectly.
But you should diagnose your hosting issues before considering a switch. It could be something temporary: a DDoS, a hardware failure, network overload, etc.