Eliminate render-blocking JavaScript and CSS recommendation?
-
Our site's last red-flag issue is the "Eliminate render-blocking JavaScript and CSS" message. I don't know how to fix it, and while I could probably spend hours or days cutting, pasting, and guessing until I made progress, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do it?
Also, if there is no plugin (and it didn't look like there was when I checked), how long do you think this would take someone who knows what they're doing to complete?
The site is: www.kempruge.com
Thanks for any tips and/or suggestions,
Ruben
-
Yes, it's over a month, and for most of that month our PageSpeed score averaged 66 with load times around 3 seconds. Now, with the adjustments I've made and a switch to a new hosting company, we're at an 81 as of this morning. So if 3 seconds at a 66 isn't terrible, we should be in an acceptable range after the improvements.
Either way, thanks so much for the industry stats and the article. It's easy to find "how to make your website faster" info, but MUCH more difficult to find an article that I can trust. Thanks for the tip!
Ruben
-
Hi Ruben,
That analytics data covers a month or so, right? Just to make sure we're not talking about an unusually fast or slow day!
3 seconds is not too bad. It can depend a lot on the type of site you have or the industry. Check this for a recent rundown of stats by country/industry. Also check out this article for a good rundown of tactics to use in reducing load times.
I would look at doing some of the easier fixes from the above article (if you haven't already) before you move on to the script-rendering issues, especially if you don't have an in-house person who is comfortable doing it. If you have already done all of that, then it's really a matter of weighing how much effort it will take to find someone to diagnose and change the site code against how much load/rendering time that will shave off. Personally, I think it might not be worth it, but others may disagree.
-
Thanks Lynn! Yes, they are from Google PageSpeed Insights. Attached are our page speed times from GA. Unfortunately, I'm not sure whether they're okay or not; I just don't know enough, other than that faster is usually better.
Your thoughts?
Thanks,
Ruben
-
Hi,
Are you getting this flag from Google PageSpeed Insights? Render-blocking scripts are basically scripts that are called at the beginning of the page (usually in the head) but are not actually needed for that page, or at least not for the content that is immediately visible, so downloading them first delays the rendering of the page. Depending on the structure of your site/code, the plugins used, etc., fixing this could be as simple as moving a couple of lines of code in the template... or quite complicated indeed.
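To make the idea concrete, here is the simplest case: a script tag in the head that the visible page doesn't need right away (`/js/some-plugin.js` is just a placeholder):

```html
<head>
  <!-- Render-blocking: HTML parsing stops while this downloads and executes -->
  <script src="/js/some-plugin.js"></script>

  <!-- Not render-blocking: downloads in parallel, executes after parsing -->
  <script src="/js/some-plugin.js" defer></script>
</head>
```

`async` is similar but executes as soon as the download finishes. For CSS, the equivalent tactic is inlining the small amount of CSS the above-the-fold content needs and loading the rest of the stylesheets later.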
What are your page load times in Google Analytics looking like? I had a look at your page and it seemed to load pretty fast, so I would check load times in GA and see if the problem is really as urgent as you think. The PageSpeed Insights tool will flag everything it sees, but sometimes it can give you false positives, and other times it just mechanically recommends things that are not a huge issue in the grand scheme of things!
Related Questions
-
How to solve JavaScript paginated content for SEO
On our blog listings page, we limit the number of posts that can be seen to 10. However, all of the posts are loaded in the HTML of the page, and page links are added to the bottom. Example page: https://tulanehealthcare.com/about/newsroom/ When a user clicks the next page, it simply filters the content on the same page for the next group of postings and displays those to the user. Nothing in the HTML or URL changes; this is all done via JavaScript. So the question is: does Google consider this hidden content, because all listings are in the HTML but only a handful are shown on the page? Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination? If this is indeed a problem, we have two possible solutions: (1) not building the HTML for the next pages until the user clicks 'next', or (2) adding parameters to the URL to show that the content has changed. Any other solutions that would be better for SEO?
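A rough sketch of the second solution above: slicing the listings client-side while reading the page number from a URL parameter, so each page has its own crawlable address. `paginate` and `pageFromUrl` are illustrative helpers, not from any library; in a browser you would pair them with `history.pushState` to update the URL when the user pages through.

```javascript
// Return only the posts that belong on the requested page.
function paginate(posts, page, perPage) {
  const start = (page - 1) * perPage;
  return posts.slice(start, start + perPage);
}

// Read the current page number from a query string like "?page=3".
function pageFromUrl(search) {
  const match = /[?&]page=(\d+)/.exec(search);
  return match ? parseInt(match[1], 10) : 1;
}
```

With the page number in the URL, each page of listings can be crawled and indexed as its own address instead of Google having to guess at JavaScript-filtered content.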
Intermediate & Advanced SEO | MJTrevens1 -
Can Googlebot read canonical tags on pages with JavaScript redirects?
Hi Moz! We have old location pages that we can't redirect to the new ones because they use AJAX. To preserve PageRank, we are putting canonical tags on the old location pages. Will Googlebot still read these canonical tags if the pages have a JavaScript redirect? Thanks for reading!
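For context, the canonical tag is plain markup in the head, so Googlebot can pick it up from the initial HTML fetch, before any script (including a JavaScript redirect) runs. A sketch, with a placeholder URL:

```html
<head>
  <link rel="canonical" href="https://www.example.com/locations/new-location/" />
</head>
```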
Intermediate & Advanced SEO | DA20130 -
Any arguments against eliminating all (non-blog) subfolders?
Short URLs seem to do the trick from a UX perspective. For example: /primary-care vs. /why/specialties/primary-care. This convention will be applied across 30-40 pages. Note that while "/why/specialties/primary-care" isn't terribly ugly, some of our pages would look a little overly keywordy if we go with the subfolder approach.
Intermediate & Advanced SEO | NueMD0 -
Plugin to minify CSS
Can someone please recommend a good plugin to minify CSS so that my site loads faster? It currently loads more than 20 CSS files, and I want to combine them all into a single stylesheet.
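For what it's worth, the minification step itself is simple; this naive sketch shows roughly what a minifier plugin does to each file (real plugins such as Autoptimize for WordPress also concatenate the 20 files into one, which is the bigger win here):

```javascript
// Naive CSS minifier sketch: strips comments and collapses whitespace.
// Real minifiers handle edge cases (strings, calc(), etc.) that this skips.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')    // remove /* ... */ comments
    .replace(/\s+/g, ' ')                // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1')   // drop spaces around punctuation
    .replace(/;}/g, '}')                 // drop the last semicolon in a block
    .trim();
}
```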
Intermediate & Advanced SEO | maestrosonrisas0 -
Should I redirect all my subdomains to a single unique subdomain to eliminate duplicate content?
Hi there! I've been working on http://duproprio.com for a couple of years now. In the early stages of the website, we put a subdomain wildcard in place that allowed us to create URLs like this on the fly: http://{some-city}.duproprio.com This instantly brought us a lot of success in terms of traffic, since the city names are great search keywords. But now the business has grown and, as we all know, duplicate content is the devil, so I've been playing with the idea of killing (redirecting) all those URLs to their equivalents on the root domain: http://some-city.duproprio.com/some-listing-1234 would redirect to the equivalent page at http://duproprio.com/some-listing-1234 Even though my redirects are 301 (permanent), some juice will be lost for each redirected link that points to my old subdomains. This would also imply redirecting http://www.duproprio.com to http://duproprio.com, which is probably the part I'm most anxious about, since the incoming links are split almost 50/50 between those two subdomains... Bringing everything back into a single subdomain is the thing to do to get all my SEO juice together; that part is obvious. But what can I do to make sure that I don't end up actually losing traffic instead of gaining authority? Can you help me get the confidence I need to make this move without risking tons of traffic? Thanks a big lot!
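The wildcard 301 described above can be sketched in a few lines, assuming an Apache server with mod_rewrite (the rule is illustrative and should be tested on a staging copy first):

```apache
RewriteEngine On
# Any host with a prefix (www, city names, ...) gets a 301 to the bare
# domain, preserving the path so deep links keep their equity.
RewriteCond %{HTTP_HOST} ^(.+)\.duproprio\.com$ [NC]
RewriteRule ^(.*)$ http://duproprio.com/$1 [R=301,L]
```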
Intermediate & Advanced SEO | DuProprio.com0 -
Penguin Update Issues. What would you recommend?
Hi, We've been pretty badly hit by this Penguin update. Site traffic is down 40-50%. We suspect it's for a couple of reasons. 1) Google is saying we have duplicate content: for a given category we will have 4-5 pages of content (products), so it's flagging pagenum=2, pagenum=3, etc. as duplicate pages. We've implemented rel=canonical so that pagenum=2 and so on point to the original category, e.g. http://mydomain/widgets.aspx, and we've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes; how long does it take? It's been about a week. 2) They're saying we have soft 404 errors: when we remove a category or product, we point users to a "category or page not found" page. Is it best to block Googlebot from crawling these pages via robots.txt, since we really don't care about them? How best to handle this? 3) Some bad directories and crawlers have crawled our website but put in incorrect links, so we've got about 1,700 "product not found" errors. I'm sure that's taking up a lot of crawling time. How do we tell Google not to bother with links coming from specific sources, e.g. ignore all links coming from xxx.com? Any help will be much appreciated, as this is killing our business. Jay
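For reference, the rel=canonical setup described in point 1 looks like this in the head of each paginated page (using the example URL from the question):

```html
<!-- On /widgets.aspx?pagenum=2, pagenum=3, etc. -->
<link rel="canonical" href="http://mydomain/widgets.aspx" />
```

Google has to recrawl each paginated page before it honors the tag, so a week is often not long enough to see the duplicates drop out.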
Intermediate & Advanced SEO | ConservationM0 -
How can I block unwanted URLs from being indexed on Google?
Hi, I need to block unwanted URLs (not the pages themselves) from being indexed on Google. I have to block URLs like example.com/entertainment but not the actual page example.com/entertainment.aspx. Are there any other ways besides robots.txt? If I add this to robots.txt, will it block my other URL too? Or should I set up a 301 redirect from example.com/entertainment to example.com/entertainment.aspx, since some of the unwanted URLs are linked from other sites? Thanks in advance.
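On the robots.txt question above: Google's matching is prefix-based, so a plain `Disallow: /entertainment` would indeed block `/entertainment.aspx` as well. Googlebot (though not every crawler) supports `$` as an end-of-URL anchor, so a sketch that blocks only the extensionless URL would be:

```
User-agent: *
Disallow: /entertainment$
```

Keep in mind robots.txt blocks crawling, not indexing; since outside sites link to the wrong URLs, the 301 to the .aspx version is generally the cleaner fix, because it also passes the link equity along.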
Intermediate & Advanced SEO | VipinLouka780