Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100, with the JavaScript being the only place where points are being deducted. The only problem is that the JS relates to the Trustpilot widget and the social links at the bottom of the page, neither of which works when deferred.
Normally we would add the defer attribute to the script tags, but doing so means the scripts don't execute until the page has finished parsing. As both of the scripts I mentioned (reviews and buttons) use document.write, deferring them would write their output out of place (or not at all), rather than where the widgets should sit.
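To illustrate: a deferred script that calls document.write runs only after the HTML has been parsed, so the write either gets ignored or lands in the wrong place instead of rendering where the widget belongs. One rough workaround we've been sketching, shown below with placeholder ids and URLs rather than the real Trustpilot or social embed code, is to load the widget after the page load event and temporarily point document.write at a placeholder element - although we're not sure how robust this is.

<div id="widget-placeholder"></div>
<script>
  // Sketch only: the element id and script URL are placeholders, not the real embed code.
  window.addEventListener('load', function () {
    var target = document.getElementById('widget-placeholder');

    // Redirect document.write so the late-loaded widget appends its markup
    // into the placeholder instead of writing over the already-parsed document.
    document.write = function (html) {
      target.insertAdjacentHTML('beforeend', html);
    };

    // Inject the third-party script once the page has loaded.
    var s = document.createElement('script');
    s.src = 'https://example.com/widget.js';
    document.body.appendChild(s);
  });
</script>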
Anyone have any ideas?
-
I've run your site through the PageSpeed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
Related Questions
-
JavaScript navigation causing an SEO problem?
Hi - I'm looking at a site using JavaScript dropdown navigation. Google can crawl the whole site, but my thinking is this: if I ensure the dropdown navigation functions fully when JS is switched off, might that help the search engine bots? At the moment I can't get any dropdown effect if I turn JS off on the site, but if I look at a cached page (text version) the dropdown links are visible and working. I am wondering whether there is any crawl benefit in taking this a step further and ensuring the dropdowns are actually visible and working when JS is switched off. I would welcome your thoughts on this. Thanks in advance, Luke - 07966 729775
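To make the idea concrete, here is a minimal sketch of the kind of markup I mean (class names and URLs are made up): the dropdown links sit in the HTML as a plain nested list, so bots can see and follow them with JS off, and the show/hide effect is handled purely in CSS.

<nav class="main-nav">
  <ul>
    <li class="has-dropdown">
      <a href="/category/">Category</a>
      <!-- Sub-links are real anchors in the HTML, crawlable without JavaScript -->
      <ul class="dropdown">
        <li><a href="/category/item-one/">Item one</a></li>
        <li><a href="/category/item-two/">Item two</a></li>
      </ul>
    </li>
  </ul>
</nav>
<style>
  /* Hidden by default, revealed on hover: no JavaScript required */
  .main-nav .dropdown { display: none; }
  .main-nav .has-dropdown:hover .dropdown { display: block; }
</style>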
Intermediate & Advanced SEO | McTaggart
-
Can you explain why a site with loads of keyword anchor backlinks is ranking well?
Hi All, A couple of years ago my site got punished, and I more or less figured out it was due to keyword anchor text backlinks. I was recently considering getting a new SEO company and found one that promotes another company in my industry. However, when I look at the backlinks of the website they do SEO for via Open Site Explorer, I noticed nearly all of the backlinks are keywords. The website in question is http://goo.gl/6nRzLi And if you see the search results here: http://goo.gl/9bXoxY they are ranking on the first page alongside very big brands, and have done for about a year, yet nearly all of their backlinks are the keywords in this search... I understood this is the kind of thing that killed my website two years ago - are these backlinks OK? Should I still consider this SEO company to work on my site? I 100% do not want my site to be penalised again, so any advice is appreciated. Thanks, James
Intermediate & Advanced SEO | isntworkdull
-
Google and JavaScript
Hey there! Google recently made announcements encouraging webmasters to let Google crawl JavaScript: http://www.googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html We have always put JS and CSS behind robots.txt, but we are now considering taking them out of robots.txt. Any opinions on this?
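To illustrate the change we're weighing, a sketch of the robots.txt before and after (the directory names are examples, not our actual structure):

# Current setup: script and style directories blocked from crawling
User-agent: *
Disallow: /scripts/
Disallow: /styles/

# What we're considering: drop those Disallow lines (or explicitly allow the
# paths) so Googlebot can fetch the assets it needs to render pages
User-agent: *
Allow: /scripts/
Allow: /styles/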
Intermediate & Advanced SEO | CleverPhD
-
Alternative HTML Structure for indexation of JavaScript Single Page Content
Hi there, we are currently setting up a pure HTML version for bots on our site amazine.com, so that both the content and the navigation will be fully indexed by Google. We will show Google exactly the same content the user sees (except for the fancy JS effects). So all bots get pure HTML and real users see the JS-based version. My first question is whether everyone agrees that this is the way to go, or whether there are alternatives for getting the content indexed. Are there best practices? All JS-based websites must have this problem, so I am hoping someone can share their experience. The second question concerns the optimal number of content pieces ('Stories') displayed per page and the best method to paginate. Should we display e.g. 10 stories and use ?offset in the URL, or display 100 stories per page to Google and maybe use rel="next"/"prev" instead? Generally, I would really appreciate any pointers and experiences from you guys, as we haven't done this sort of thing before! Cheers, Frank
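To make the pagination question concrete, the second option would mean tying the listing pages together with link tags in the head, something along these lines (the URL structure is a placeholder, not our real one):

<!-- On page 2 of the story listing -->
<link rel="prev" href="http://www.amazine.com/stories?page=1">
<link rel="next" href="http://www.amazine.com/stories?page=3">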
Intermediate & Advanced SEO | FranktheTank-47497
-
JavaScript
Hi there, Quick question: does Google parse JavaScript? I have an HTML ad which contains anchor text linking to one of our product pages; however, the ad unit is JavaScript based, so the code is not visible in the page source through the browser. Kind Regards
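Roughly what the ad unit does, with a placeholder URL and anchor text rather than our actual ad code: the link is only written into the page by the script at runtime, which is why it never shows up in the raw page source.

<script>
  // Placeholder sketch of the ad unit, not the real code
  document.write('<a href="https://www.example.com/product-page/">product keyword</a>');
</script>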
Intermediate & Advanced SEO | Paul78
-
How can we optimize content that is specific to particular tabs but loaded on one page?
Hi, Our website generates stock reports. Within those reports, we organize information into particular tabs. The entire report is loaded on one page, and JavaScript is used to hide and show the different tabs. This makes it difficult for us to optimize the information on each particular tab. We're thinking about creating separate pages for each tab, but we're worried about affecting the user experience. We'd like to create separate pages for each tab, put links to them at the bottom of the reports, and still have the reports operate as they do today. Can we do this without getting in trouble with Google for having duplicate content? If not, is there another solution to this problem that we're not seeing? Here's a sample report: http://www.vuru.co/analysis/aapl In advance, thanks for your help!
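For reference, the tabs currently work roughly like this (the ids and function name are simplified placeholders, not our real markup): all of the report content sits in the HTML of a single page, and JavaScript just toggles which panel is visible.

<div id="tab-overview" class="tab">Overview content...</div>
<div id="tab-financials" class="tab" style="display:none">Financials content...</div>
<script>
  // Clicking a tab hides every panel, then shows the chosen one
  function showTab(id) {
    var tabs = document.querySelectorAll('.tab');
    for (var i = 0; i < tabs.length; i++) {
      tabs[i].style.display = 'none';
    }
    document.getElementById(id).style.display = 'block';
  }
</script>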
Intermediate & Advanced SEO | yosephwest
-
Issues with Load Balancers?
Has anyone run into SEO issues with sites that use load balancing systems? We're running into some other technical complications (around using third-party tracking services), but I'm now concerned that the setup could also have a not-so-good impact from an SEO standpoint.
Intermediate & Advanced SEO | BMGSEO
-
Google, Links and JavaScript
So today I was taking a look at the http://www.seomoz.org/top500 page and saw that the AddThis page is currently in position 19. I think the main reason for that is that their plugin creates, through JavaScript, linkbacks to the page where their share buttons reside. So any page with AddThis installed would easily have 4-5 linkbacks to their site, which explains the huge number of linkbacks they have. OK, that pretty much shows that Google doesn't care whether the link is created in the HTML (on the backend) or through JavaScript (frontend). But here's the catch. Suppose someone creates a free plugin for WordPress/Drupal or any other huge CMS platform with a feature that links back to the page of the plugin's creator (that's pretty common, I know), but instead of inserting the link in the plugin source code they put it somewhere else, which is then loaded with JavaScript. This would allow the owner of the plugin to change the link shown at any time. The main reason for that might be, I don't know, a URL update for their blog or business or something. However, it could just as easily be used to link to whatever the hell the owner of the plugin wants. What are your thoughts on this? I think it could easily be classified as white hat or black hat depending on what the owners do. However, would Google think the same way about it?
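A hypothetical sketch of the pattern I mean (the plugin name, hosted script URL and element id are all made up): the plugin ships with no hard-coded link, it just loads a script that the plugin author hosts and can change whenever they like.

<!-- In the plugin's output on every site that installs it -->
<div id="plugin-credit"></div>
<script src="https://plugin-owner.example.com/credit.js"></script>

<!-- credit.js, hosted by the plugin author and editable at any time, might do: -->
<script>
  var a = document.createElement('a');
  a.href = 'https://plugin-owner.example.com/';  // target can be swapped remotely
  a.textContent = 'Powered by ExamplePlugin';
  document.getElementById('plugin-credit').appendChild(a);
</script>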
Intermediate & Advanced SEO | bemcapaz