Eliminate render blocking javascript and css recommendation?
-
Our site's last red-flag issue is the "eliminate render blocking javascript and css" message. I don't know how to do that, and while I could probably spend hours or days cutting, pasting, and guessing until I made progress, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do this?
Also, if there is no plugin (and it didn't look like there was when I looked), how long do you think this would take someone who knows what they're doing to complete?
The site is: www.kempruge.com
Thanks for any tips and/or suggestions,
Ruben
-
Yes, it's over a month, and for most of that month our page speed score averaged 66 with the 3-second load time. Now, with the adjustments I've made and the switch to a new hosting company, we're at an 81 as of this morning. So I guess if 3 seconds at a 66 isn't terrible, we should be in an acceptable range following the improvements.
Either way, thanks so much for the industry stats and the article. It's easy to find "how to make your website faster" info, but MUCH more difficult to find an article I can trust. Thanks for the tip!
Ruben
-
Hi Ruben,
That analytics data covers a month or so, right? Just to make sure we're not talking about an unusually fast or slow day!
3 seconds is not too bad. It can depend a lot on the type of site you have or the industry. Check this for a recent rundown of stats by country/industry, and check out this article for a good rundown of tactics for reducing load times.
I would look at doing some of the easier fixes in the above article (if you haven't already) before you try to address the script rendering issues, especially if you don't have an in-house person who is comfortable doing it. If you have already done all of that, then it really comes down to how much effort it will take to find someone to diagnose and make the needed changes to the site code, versus how much load/rendering time that will actually shave off. Personally, I think it might not be worth it, but others may disagree.
-
Thanks Lynn! Yes, they are from Google PageSpeed Insights. Attached are our page speed times from GA. Unfortunately, I'm not sure whether they're okay or not. I just don't know enough, other than that faster is usually better.
Your thoughts?
Thanks,
Ruben
-
Hi,
Are you getting this flag from Google PageSpeed Insights? Render-blocking scripts are basically scripts that are called at the beginning of the page (usually in the head) but aren't actually needed for that page, or at least not for the content that is immediately visible, so downloading them first delays the rendering of the page. Depending on the structure of your site/code, the plugins used, etc., fixing this could be as simple as moving a couple of lines of code in the template, or quite complicated indeed.
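As a rough illustration of the "simple" end of that fix (the file names here are made up, not from Ruben's site), the usual moves are adding `defer` or `async` to script tags and inlining the small amount of CSS needed for above-the-fold content, so the browser can paint without waiting on external downloads:

```html
<!-- Before: both resources block rendering of everything below them -->
<head>
  <link rel="stylesheet" href="style.css">
  <script src="app.js"></script>
</head>

<!-- After: critical CSS inlined; scripts no longer block parsing -->
<head>
  <style>/* small set of above-the-fold rules inlined here */</style>
  <link rel="stylesheet" href="style.css" media="print" onload="this.media='all'">
  <!-- defer: download in parallel, execute in document order after parsing -->
  <script src="app.js" defer></script>
  <!-- async: for independent scripts (e.g. analytics) that can run whenever loaded -->
  <script src="tracker.js" async></script>
</head>
```

Whether this is safe depends on the scripts: anything that uses `document.write`, or that other inline code depends on immediately, can break when deferred, which is why a theme or plugin-heavy WordPress site can make this genuinely tricky.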
What do your page load times in Google Analytics look like? I had a look at your page and it seemed to load pretty fast, so I would check load times in GA and see whether the problem is really as urgent as you think. The PageSpeed Insights tool will flag everything it sees, but sometimes it gives you false positives, and other times it mechanically recommends things that are not a huge issue in the grand scheme of things!