Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100 – with the JavaScript being the only place where points are being deducted. The only problem is, the JS relates to the Trustpilot widget and the social links at the bottom of the page – neither of which works when deferred.
Normally, we would add the defer attribute to the script tags, but doing so waits until the page is fully loaded before executing the scripts. As both of the scripts I mentioned (reviews and buttons) use document.write, deferring them writes their markup at the end of the document, out of place from where it should appear.
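One idea I've been sketching (untested, and the placeholder element ID and script URL below are made up rather than the real Trustpilot embed) is to load the script after the page has loaded, temporarily capture its document.write output, and then drop that markup into a placeholder where the widget should sit:

```javascript
// Rough sketch only. Assumes a placeholder like <div id="widget-slot"></div>
// sits where the widget should appear; the script URL is a stand-in, not the
// real Trustpilot one.
window.addEventListener('load', function () {
  var slot = document.getElementById('widget-slot');
  var buffer = '';
  var originalWrite = document.write;

  // Capture anything the widget tries to document.write instead of letting
  // it land at the end of the (already parsed) document.
  document.write = function (markup) { buffer += markup; };

  var script = document.createElement('script');
  script.src = 'https://example.com/widget.js'; // placeholder URL
  script.onload = function () {
    document.write = originalWrite; // restore the native document.write
    slot.innerHTML = buffer;        // place the captured markup where it belongs
  };
  document.head.appendChild(script);
});
```

If the widget script document.writes further external scripts, this simple version won't run them, so it's only a starting point.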
Anyone have any ideas?
-
I've run your site through the Page Speed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
Related Questions
-
Passing link juice via JavaScript?
Hello, a client has a website with JavaScript-generated content. All links there (from the main page to deeper pages) are JS generated. In the code there are only javascripts and other basic, typical code, but no text links (<a href...>). The question is: do those JS links carry the same "seo power" as typical HTML href links? For example, majestic.com can't scan the website properly and can't show SEO metrics for its pages. I know Google crawls them (links and pages), but are they as good as typical links? Regards,
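P.S. To show what I mean by JS-generated links, the pages do something roughly like the sketch below instead of plain <a href> markup (the selector, paths and data attribute are just examples, not the client's real code):

```javascript
// Illustration only: navigation is wired up in script, so there is no
// crawlable href attribute in the HTML source. Names and paths are examples.
document.querySelectorAll('.teaser').forEach(function (teaser) {
  teaser.addEventListener('click', function () {
    window.location.href = '/deeper-page/' + teaser.getAttribute('data-id');
  });
});
```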
-
Lazy Loading of Blog Posts and Crawl Depths
Hi Moz Fans, We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank in terms of lazy loading implications and issues with crawl depths. We introduced lazy loading onto the blog home page to increase site speed initially, and it works well with infinite scroll, but we were wondering whether this would cause any issues regarding SEO. A lot of the resources online seem to be conflicting and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depths for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
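For context, the lazy loading on the blog home page works roughly like this simplified sketch (the endpoint URL and element ID are illustrative, not our actual code):

```javascript
// Simplified sketch of infinite-scroll lazy loading on a blog home page.
// The endpoint URL and element ID are illustrative only.
var page = 1;
var loading = false;

window.addEventListener('scroll', function () {
  var nearBottom = window.innerHeight + window.scrollY >= document.body.offsetHeight - 500;
  if (!nearBottom || loading) return;

  loading = true;
  page += 1;
  fetch('/blog?page=' + page)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      // Posts beyond the first page only exist in the DOM after this runs,
      // which is where the crawl depth question comes from.
      document.getElementById('post-list').insertAdjacentHTML('beforeend', html);
      loading = false;
    });
});
```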
-
Does Google credit links from iframes or links created by JavaScript, and if so, is one more powerful than the other?
Consider this example, because I want to be clear about what I mean. You have two websites. Let's call them www.a.com and www.b.com. On www.a.com/some/page, there is an iframe something like this:
<iframe src="www.b.com/some/special/path"></iframe>
The content of this iframe is a bunch of pictures, text and numbers, as well as a group of links, linking each picture to www.b.com. For example, the links might be:
www.b.com/content/1
www.b.com/content/2
www.b.com/content/3

Questions:

1) When Google crawls www.a.com/some/page, does it pass link juice to www.b.com/content/*?
2) Does Google instead consider these to be internal links within b.com itself, because the links to www.b.com/content/* actually come from b.com, given that the domain of the iframe is www.b.com/some/special/path?
3) Is there any amount of link juice passed from www.a.com/some/page to www.b.com/some/special/path, because it is the src= element of an iframe that a.com is hosting?

Now consider an alternative setup, where instead of using an iframe, the contents of the iframe described above are added to the page dynamically using JavaScript and a call to an API endpoint at b.com, resulting in those links being added directly to the body of a.com without being wrapped in an iframe element (see the sketch at the end of this post).

Questions:

4) Do these links that were created after page load still get crawled and credited by Google? (I have heard in the past that Google was going to start crawling JavaScript; I just don't know if this is known for a fact yet.)
5) Do links created on the client side hold the same weight as a link that was served directly via the backend HTML generation?

If both the links within the iframe and the links within the JavaScript embed method pass link juice, is one preferred over the other? Is one known to be more effective than the other?

Thanks!
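Here is the sketch of the JavaScript alternative I mean (the endpoint path, container ID and response shape are made up for illustration, not an actual implementation):

```javascript
// Rough sketch of the non-iframe setup: a.com fetches data from an API
// endpoint on b.com and injects links directly into its own page.
// The endpoint path, element ID and JSON shape are made up.
fetch('https://www.b.com/api/content')
  .then(function (response) { return response.json(); })
  .then(function (items) {
    var container = document.getElementById('b-com-content');
    items.forEach(function (item) {
      var link = document.createElement('a');
      link.href = 'https://www.b.com/content/' + item.id;
      link.textContent = item.title;
      container.appendChild(link); // the link only exists after this script runs
    });
  });
```

-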
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not sent to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an ugly URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:

http://www.idea-r.it/...#!blogimage=<image-number>

When the crawler finds this markup, it will change it to:

http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>

Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do.

    var fragment = Request.QueryString["_escaped_fragment_"];
    if (!String.IsNullOrEmpty(fragment))
    {
        var escapedParams = fragment.Split(new[] { '=' });
        if (escapedParams.Length == 2)
        {
            var imageToDisplay = escapedParams[1];
            // Render the page with the gallery showing
            // the requested image (statically!)
            ...
        }
    }

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% comes for free; we only have to parse the fragment on the client side and show the requested image:

    if (window.location.hash)
    {
        // NOTE: remove the initial #
        var fragmentParams = window.location.hash.substring(1).split('=');
        var imageToDisplay = fragmentParams[1];
        // Render the page with the gallery showing the requested image (dynamically!)
        ...
    }

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages. Some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
-
Alt tag for src='blank.gif' on lazy load images
I didn't find an answer on a search on this, so maybe someone here has faced this before. I am loading 20 images that are in the viewport and a bit below. The next 80 images I want to 'lazy-load'. They therefore are seen by the bot as a blank.gif file. However, I would like to get some credit for them by giving a description in the alt tag. Is that a no-no? If not, do they all have to be the same alt description since the src name is the same? I don't want to mess things up with Google by being too aggressive, but at the same time those are valid images once they are lazy loaded, so would like to get some credit for them. Thanks! Ted
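P.S. The pattern in question looks roughly like this sketch (class names, paths and the alt text are made up):

```javascript
// Sketch of the lazy-load pattern being asked about. Each deferred image is
// marked up roughly as:
//   <img src="blank.gif" data-src="/images/photo-123.jpg"
//        alt="Description of the real image" class="lazy">
// so the bot initially sees blank.gif plus whatever alt text we give it.
window.addEventListener('scroll', function () {
  document.querySelectorAll('img.lazy').forEach(function (img) {
    var nearViewport = img.getBoundingClientRect().top < window.innerHeight + 200;
    if (nearViewport) {
      img.src = img.getAttribute('data-src'); // swap in the real image
      img.classList.remove('lazy');
    }
  });
});
```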
-
How can I prevent duplicate pages being indexed because of load balancer (hosting)?
The site that I am optimising has a problem with duplicate pages being indexed as a result of the load balancer (which is required and set up by the hosting company). The load balancer passes the site through to 2 different URLs: www.domain.com and www2.domain.com. Somehow, Google has indexed the same URLs twice (which I was obviously hoping they wouldn't) - the first on www and the second on www2. The hosting is a mirror image of each other (www and www2), meaning I can't upload a robots.txt to the root of www2.domain.com disallowing all. Also, I can't add a canonical tag into the website header of www2.domain.com pointing the individual URLs through to www.domain.com etc. Any suggestions as to how I can resolve this issue would be greatly appreciated!
-
High resolution (retina) images vs load time
I have an ecommerce website with a product slider of 3 images. Currently, I serve them at native size when viewed on a desktop browser (374x374). I would like to serve them at retina image quality (748px) instead. However, how will this affect my ranking due to load time? Does Google take image load times into account even though these are loaded asynchronously? Also, as it's a slider, only the first image needs to load immediately. Do the other images contribute at all to the page load time?
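To be concrete, the change I have in mind is roughly this sketch (the data attribute and file naming are placeholders): pick the 748px asset only on high-DPI screens and keep the 374px one otherwise:

```javascript
// Sketch only; the data attribute and file naming are placeholders.
// Chooses the retina (748px) asset on high-DPI screens, otherwise 374px.
document.querySelectorAll('.slider img').forEach(function (img) {
  var base = img.getAttribute('data-base');          // e.g. "/images/product-1"
  var wantRetina = window.devicePixelRatio > 1;
  img.src = base + (wantRetina ? '-748x748.jpg' : '-374x374.jpg');
});
```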
-
Multiple IPs (load balancing) for same domain
Hello, I'm considering moving our main website to multiple servers, perhaps in multiple different datacenters, and using DNS round-robin load balancing by assigning it 4 different IP addresses (probably from 4 different C classes). Example:
ourdomain.com A 1.1.1.1
ourdomain.com A 2.2.2.2
ourdomain.com A 3.3.3.3
ourdomain.com A 4.4.4.4

Every time you ping the domain you will get a response from another IP in the group, so search engines will see a different IP each time they scan the site. We have used the main IP for our website for the past 6 years without changing it. We have quite good SEO in our niche, which I don't want to lose, of course. My questions are: will adding more IPs to the domain affect the ranking in any way? What is the suggested way to do it? What is recommended to do before and after? Thanks for your attention and help in advance. Dmitry S.