Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100 – with the JavaScript being the only place where points are deducted. The only problem is that the JS relates to the Trustpilot widget and the social links at the bottom of the page – neither of which works when deferred.
Normally, we would add the defer attribute to the script tags, but that makes the browser wait until the document has finished parsing before executing the scripts. As both of the scripts I mentioned (reviews and buttons) use the document.write command, deferring them would write their output at the end of the page, out of place from where it should appear.
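A sketch of one possible workaround (the container ID and script URL below are placeholders, and widgets that document.write nested script tags would need a fuller write-capture shim): load the widget script after DOMContentLoaded and temporarily override document.write so its output lands in the intended container instead of at the end of the document.

```html
<div id="trustpilot-container"></div>
<script>
window.addEventListener('DOMContentLoaded', function () {
  var container = document.getElementById('trustpilot-container');
  var originalWrite = document.write;

  // Redirect document.write output into the container, so late writes
  // land where the widget belongs instead of at the end of the page
  document.write = function (html) {
    container.insertAdjacentHTML('beforeend', html);
  };

  var script = document.createElement('script');
  script.src = 'https://example.com/trustpilot-widget.js'; // placeholder URL
  script.onload = function () {
    document.write = originalWrite; // restore once the widget has run
  };
  document.head.appendChild(script);
});
</script>
```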
Anyone have any ideas?
-
I've run your site through the Page Speed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
Related Questions
-
Passing link juice via JavaScript?
Hello! A client has a website with JavaScript-generated content. All the links there (from the main page to deeper pages) are JS-generated. In the code there are only JavaScripts and other basic, typical code, but no text links (<a href...>). The question is: do those JS links carry the same "SEO power" as typical HTML href links? For example, majestic.com can't scan the website properly and can't show SEO metrics for its pages. I know Google crawls them (links and pages), but are they as good as typical links? Regards,
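For illustration, the difference between the two kinds of link (a sketch; the paths are hypothetical):

```html
<!-- A crawlable text link with an href: -->
<a href="/deeper-page">Deeper page</a>

<!-- A JavaScript-generated "link" with no href for tools to follow: -->
<span onclick="window.location='/deeper-page'">Deeper page</span>
```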
Intermediate & Advanced SEO | PenaltyHammer
-
Does Google credit links from iframes or links created by JavaScript? If so, is one more powerful than the other?
Intermediate & Advanced SEO | adriandg
Consider this example, because I want to be clear about what I mean. You have two websites. Let's call them www.a.com and www.b.com. On www.a.com/some/page, there is an iframe something like this:
<iframe src="www.b.com/some/special/path"></iframe>
The content of this iframe is a bunch of pictures, text and numbers, as well as a group of links, linking each picture to www.b.com. For example, the links might be:
www.b.com/content/1
www.b.com/content/2
www.b.com/content/3

Questions:
1) When Google crawls www.a.com/some/page, does it pass link juice to www.b.com/content/*?
2) Or does Google consider these to be internal links within b.com itself, because the links to www.b.com/content/* actually come from b.com, given that the domain of the iframe is www.b.com/some/special/path?
3) Is any amount of link juice passed from www.a.com/some/page to www.b.com/some/special/path, because it is the src element of an iframe that a.com is hosting?

Consider an alternative setup, where instead of using an iframe, the contents of the iframe described above are added to the page dynamically using JavaScript and a call to an API endpoint at b.com, resulting in the links being added directly to the body of a.com without being wrapped in an iframe element. Questions:
4) Do these links that were created after page load still get crawled and credited by Google? (I have heard in the past that Google was going to start crawling JavaScript; I just don't know if this is known for a fact yet.)
5) Do links created on the client side hold the same weight as a link served directly via back-end HTML generation? If both the links within the iframe and the links within the JavaScript embed method pass link juice, is one preferred over the other? Is one known to be more effective than the other? Thanks!
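A sketch of the JavaScript alternative described above (the endpoint path, response shape, and container ID are hypothetical, and it assumes b.com serves the API with CORS headers):

```javascript
// Fetch content from b.com and insert plain <a> elements directly
// into a.com's DOM, with no iframe wrapper
fetch('https://www.b.com/api/content')
  .then(function (response) { return response.json(); })
  .then(function (items) {
    var container = document.getElementById('embed-container');
    items.forEach(function (item) {
      var link = document.createElement('a');
      link.href = 'https://www.b.com/content/' + item.id;
      link.textContent = item.title;
      container.appendChild(link);
    });
  });
```
-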
Lazy Loading of products on an E-Commerce Website - Options Needed
Intermediate & Advanced SEO | JBGlobalSEO
Hi Moz Fans. We are in the process of re-designing our product pages, and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments; the concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an ugly URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot.
My implementation uses ASP.NET, but any server technology will be good.

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
-
JavaScript onclick redirects / porn sites...
Intermediate & Advanced SEO | EileenCleary
We noticed around 7 websites with domains that were just recently registered (with privacy protection). They use our website keywords/titles and brand name, and the sites are mostly porn/junk sites. They don't link to our website directly but use a JavaScript onclick redirect, which is why we think we aren't seeing them in our backlinks report. We've been in business for over 12 years and haven't come across sites like this before. We recently lost our first-page rankings for a few of our highest-converting key phrases and have been digging into possible causes. Just wondering if these sites could be impacting our results, and how to figure out if there are more like this? Examples:
nesat.net
flowmeterdirectory.biz
finnsat.net
dotsjobs.net
-
Alt tag for src='blank.gif' on lazy load images
I didn't find an answer in a search on this, so maybe someone here has faced this before. I am loading the 20 images that are in the viewport and a bit below it. The next 80 images I want to lazy-load. They are therefore seen by the bot as a blank.gif file. However, I would like to get some credit for them by giving a description in the alt tag. Is that a no-no? If not, do they all have to have the same alt description, since the src name is the same? I don't want to mess things up with Google by being too aggressive, but at the same time those are valid images once they are lazy-loaded, so I would like to get some credit for them. Thanks! Ted
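A sketch of the markup pattern described (file name, class, and alt text are placeholders): the alt attribute belongs to the img element, not to the file, so each placeholder image can carry its own description even though they all share the blank.gif src.

```html
<img src="blank.gif"
     data-src="/images/products/red-widget-17.jpg"
     alt="Red widget, model 17"
     class="lazy">
```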
Intermediate & Advanced SEO | friendoffood
-
How can I prevent duplicate pages being indexed because of a load balancer (hosting)?
The site that I am optimising has a problem with duplicate pages being indexed as a result of the load balancer (which is required and set up by the hosting company). The load balancer passes the site through to two different URLs: www.domain.com and www2.domain.com. Somehow, Google has indexed the same URLs twice (which I was obviously hoping they wouldn't) – the first on www and the second on www2. The hosting is a mirror image across www and www2, meaning I can't upload a robots.txt to the root of www2.domain.com disallowing all. Also, I can't add a canonical tag into the website header of www2.domain.com pointing the individual URLs through to www.domain.com, etc. Any suggestions as to how I can resolve this issue would be greatly appreciated!
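One avenue worth noting (a sketch; the page path is a placeholder): because the canonical should always point at the www host, an absolute canonical tag in the shared page header behaves correctly on both mirrors – self-referencing when served from www, and pointing back to www when served from www2.

```html
<link rel="canonical" href="http://www.domain.com/current-page/" />
```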
Intermediate & Advanced SEO | iam-sold
-
High resolution (retina) images vs load time
I have an ecommerce website with a product slider of 3 images. Currently, I serve them at their native size when viewed in a desktop browser (374×374). I would like to serve them at retina quality (748px). However, how will this affect my ranking, given the added load time? Does Google take into account image load times even though these are loaded asynchronously? Also, as it's a slider, only the first image needs to load up front. Do the other images contribute at all to the page load time?
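A sketch using srcset (file names are placeholders): standard screens download the 374px file, while 2x (retina) screens fetch the 748px one, so the heavier payload is only paid by devices that can use it.

```html
<img src="product-374.jpg"
     srcset="product-374.jpg 1x, product-748.jpg 2x"
     width="374" height="374"
     alt="Product photo">
```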
Intermediate & Advanced SEO | deelo555
-
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of PageRank because we thought a JavaScript redirect was the only way! But other sites have been able to do this – examples and details inside
Hi Moz, Here's more info on our problem, and thanks for reading! We're trying to create 301 redirects for 44 pages on site.com. We're having trouble 301 redirecting these pages, possibly because they are AJAX pages and have hashbangs in the URLs. These are locations pages. The old locations URLs are in the following format: www.site.com/locations/#!new-york, and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york. We have not been able to create these redirects using the Yoast WordPress SEO plugin v1.5.3.2. The CMS is WordPress version 3.9.1. The reason we want to 301 redirect these pages is that we have created new pages to replace them, and we want to pass PageRank from the old pages to the new ones. A 301 redirect is the ideal way to pass PageRank. Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
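For context, a fragment (everything from the # onwards) is never sent to the server, so a server-side 301 cannot see which location was requested; the mapping has to happen client-side. A sketch (assuming the old /locations/ page can carry a small script):

```javascript
// On the old /locations/ page: read the #! fragment and send the
// browser to the new clean URL
if (window.location.hash.indexOf('#!') === 0) {
  var slug = window.location.hash.slice(2); // e.g. "new-york"
  window.location.replace('/locations/' + slug);
}
```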
Intermediate & Advanced SEO | DA2013