Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100 – with the JavaScript being the only place where points are being deducted. The only problem is that the JS relates to the Trustpilot widget and the social links at the bottom of the page – neither of which works when deferred.
Normally we would add the defer attribute to the script tags, but by doing so they don't execute until the page has fully loaded. As both of the scripts I mentioned (reviews and buttons) use document.write, deferring them writes their output out of place rather than where the widgets should appear.
Anyone have any ideas?
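One workaround worth testing, sketched below: keep an empty placeholder element in the markup where the widget should render, load the widget script only after the load event, and temporarily point document.write at that placeholder so its output still lands in the right place. The element ID and script URL below are illustrative assumptions, not the actual Trustpilot embed details.

```javascript
// Rough sketch only: the placeholder ID and script URL are hypothetical,
// not the widget provider's documented embed.
window.addEventListener('load', function () {
  var placeholder = document.getElementById('widget-placeholder'); // assumed ID
  var originalWrite = document.write;

  // Redirect document.write so output from the late-loaded script lands
  // inside the placeholder instead of clobbering the finished page.
  document.write = function (html) {
    placeholder.insertAdjacentHTML('beforeend', html);
  };

  var script = document.createElement('script');
  script.src = '//widget-provider.example.com/widget.js'; // assumed URL
  script.onload = function () {
    document.write = originalWrite; // restore once the script has run
  };
  document.body.appendChild(script);
});
```

If the widget provider offers an embed that targets a container element rather than relying on document.write, switching to that is usually the cleaner long-term fix.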
-
I've run your site through the Page Speed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
Related Questions
-
On a listing page, which is best: Load More functionality or pagination?
Hello experts, for my ecommerce site's product listing pages, which is better to implement: Load More, or pagination (1, 2, 3...), and why? Thanks!
Intermediate & Advanced SEO | | Johny123451 -
Will two navigation components (one removed by JavaScript) impact Google rankings?
We are trying to eliminate tedium when developing complexly designed responsive navigations for mobile, desktop and tablet. The changes between breakpoints in our designs are too complex to be handled with CSS, so we are literally grabbing individual elements with JavaScript and moving them around. What we'd like to do instead is have two different navigations on the page and toggle which one is on the DOM based on breakpoint. These navigations will have the same links but different markup. Will having two navigation components on the page at page load negatively impact our Google SEO rankings or potential to rank, even if we are removing one or the other from the DOM with JavaScript?
Intermediate & Advanced SEO | | CaddisInteractive0 -
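As a sketch of the toggling described in the question above (the breakpoint value, element IDs, and placement are assumptions, not the asker's actual markup), the two prerendered navigations could be swapped in and out of the DOM with matchMedia:

```javascript
// Sketch only: the 768px breakpoint and element IDs are assumptions.
var mobileNav = document.getElementById('nav-mobile');
var desktopNav = document.getElementById('nav-desktop');
var desktopQuery = window.matchMedia('(min-width: 768px)');

function syncNav(query) {
  // Keep only the active navigation attached to the DOM; the other is
  // held in memory and re-attached when the breakpoint flips.
  if (query.matches) {
    mobileNav.remove();
    if (!desktopNav.isConnected) document.body.prepend(desktopNav);
  } else {
    desktopNav.remove();
    if (!mobileNav.isConnected) document.body.prepend(mobileNav);
  }
}

syncNav(desktopQuery);
desktopQuery.addEventListener('change', syncNav);
```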
Pagination, JavaScript & SEO
Hi, I need some help identifying whether we need to rethink the way we paginate product pages. On http://www.key.co.uk/en/key/workbenches, clicking page 1, 2, etc. triggers JavaScript to sort the results, and the URL displayed differs from the URL linked to. For example, the URL for page 2 is http://www.key.co.uk/en/key/workbenches#productBeginIndex:30&orderBy:5&pageView:list& The arrows on either side of the pagination link to the paginated landing pages, e.g. http://www.key.co.uk/en/key/workbenches?page=3 – this is where the rel=prev/next details are, done for Google. However, clicking that arrow loads a different URL again – http://www.key.co.uk/en/key/workbenches#productBeginIndex:60&orderBy:5&pageView:list& – and doesn't take you to http://www.key.co.uk/en/key/workbenches?page=3. I did not set this up, but I am concerned that http://www.key.co.uk/en/key/workbenches?page=3 never actually loads for users, yet because it is linked to, Google can crawl it. Is this a problem? I am also looking to implement a view-all option. Thank you
Intermediate & Advanced SEO | | BeckyKey0 -
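For reference, a minimal sketch of the rel="prev"/rel="next" tags on the crawlable ?page=3 URL, using the URLs quoted in the question; the visible pagination controls would ideally link to these same ?page= URLs rather than to the hash-fragment versions:

```html
<!-- Sketch of the <head> of the crawlable page-3 URL referenced above. -->
<link rel="prev" href="http://www.key.co.uk/en/key/workbenches?page=2" />
<link rel="next" href="http://www.key.co.uk/en/key/workbenches?page=4" />
```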
Robots.txt - blocking JavaScript and CSS, best practice for Magento
Hi Mozzers, I'm looking for some feedback regarding best practices for setting up a robots.txt file in Magento. I'm concerned we are blocking bots from crawling information that is essential for ranking. My main concern is with blocking JavaScript and CSS: are you supposed to block them or not? You can view our robots.txt file here. Thanks, Blake
Intermediate & Advanced SEO | | LeapOfBelief0 -
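For context, the usual guidance is not to block the JavaScript and CSS Googlebot needs to render pages. A hedged sketch follows; the paths are common Magento 1 defaults and may not match this particular install, and the Disallow rules are illustrative only:

```
# Sketch only -- not a drop-in file. The key point is to Allow the asset
# directories Google needs in order to render pages.
User-agent: *
Allow: /js/
Allow: /skin/
Allow: /media/
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
```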
Dilemma: Should we use pagination or a 'Load More' function?
In the interest of pleasing Google with their recent updates, which clamp down on duplicate content and give higher preference to pages with rich data, we had a small dilemma that might help others too. We have a directory-like site, very similar to TripAdvisor or Yelp. Would it be best to: A) have paginated content almost 40 pages deep, OR B) display 20 results per page with a "Load More" button at the bottom that feeds in more data only once it's clicked? The problem we are having now is that deep pages are getting indexed and it's doing us no good; most of the juice and page value is on the first page, not the inner pages. Wondering what the schools of thought are on this one. Thanks
Intermediate & Advanced SEO | | danialniazi0 -
404 Errors on Blog Pages That Look Like They Are Loading Fine
There was recently a huge increase in 404 errors in Yandex Webmaster, corresponding with a drop in rankings. Most of the pages seem to be from my blog (which was updated around the same time). When I click on the links from Yandex the page looks like it is loading normally, except that it shows the following messages from the Facebook plugin I am using for commenting. Any ideas about what the problem is or how to fix it?
Critical errors that must be fixed:
- Bad Response Code: URL returned a bad HTTP response code.
Open Graph warnings that should be fixed:
- Inferred Property: The 'og:url' property should be explicitly provided, even if a value can be inferred from other tags.
- Inferred Property: The 'og:title' property should be explicitly provided, even if a value can be inferred from other tags.
- Small og:image: All the images referenced by og:image should be at least 200px in both dimensions. Please check all the images with the og:image tag in the given URL and ensure they meet the recommended specification.
Intermediate & Advanced SEO | | theLotter0 -
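For the Open Graph warnings specifically, a minimal sketch of explicit tags follows (URLs and titles are placeholders); note this does not address the separate bad-response-code error:

```html
<!-- Sketch only: declare og:url and og:title explicitly, and point og:image
     at an image of at least 200x200px. Values below are placeholders. -->
<meta property="og:url" content="https://www.example.com/blog/post-slug/" />
<meta property="og:title" content="Post title" />
<meta property="og:image" content="https://www.example.com/images/og-image-600x315.jpg" />
```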
Ever Wise to Intentionally Use JavaScript for Global Navigation?
I may be going against the grain here, but I'm going to throw this out there and I'm interested in hearing your feedback... We are a fairly large online retailer (50k+ SKUs) where all of our category and subcategory pages show well over 100 links (just the refinement links on the left can quickly add up to 50+). What's worse, when you hover on our global navigation you see a hover menu of over 80 links, all of which the bot sees in the markup. Now I realize the good rule of thumb is not to exceed 100 links on a page (and if you did the math, you can see we already exceeded that well before we let the bots get to the good stuff we really wanted them to crawl in the first place). So... is it wise to intentionally shield these global nav links from the bots by using JavaScript?
Intermediate & Advanced SEO | | mrwestern0