Loading images below the fold? Impact on SEO
-
I got this from my developers. Does anyone know if this will be an SEO issue?
We hope to lazy-load images below the fold where possible, to increase render speed - are you aware of any potential issues with this approach from an SEO point of view?
-
Happy to help!
-
Thanks Tom!
As always, an amazing response.
Best
-
Hi Chris, sorry for the late reply. Absolutely, you can do this by using a plugin, Cloudflare, or PHP code:
- https://wordpress.org/plugins/wp-deferred-javascripts/
- https://wordpress.org/plugins/defer-css-addon-for-bwp-minify/
Another plugin that takes this approach, while providing an admin area to configure it manually, is Autoptimize; it lets you define specific CSS independently of your theme's stylesheet.
- http://www.oxhow.com/optimize-defer-javascript-wordpress/
- https://seo-hacker.com/optimizing-site-speed-asynchronous-deferred-javascript/
- http://www.laplacef.com/how-to-defer-parsing-javascript-in-wordpress/
The solution to this problem is removing those render-blocking scripts. But if you simply remove them, some plugins may stop working properly. So, the best approach for smooth rendering is:
1. Remove them from your website source page.
2. Use a single script, hosted by Google as the alternative.
3. Push the new script down to the end of the page (just before the closing </body> tag).
Here is how to do it.
Copy the code below and paste it into your theme's functions.php file.
function optimize_jquery() {
    if (!is_admin()) {
        // Drop the locally bundled scripts so they stop render-blocking the head.
        // (The correct handles are 'jquery-migrate' and 'comment-reply', without '.min'.)
        wp_deregister_script('jquery');
        wp_deregister_script('jquery-migrate');
        wp_deregister_script('comment-reply');
        // Match the page's protocol; isset() avoids a PHP notice when HTTPS is off.
        $protocol = 'http:';
        if (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on') {
            $protocol = 'https:';
        }
        // Re-register jQuery from Google's CDN, loaded in the footer (the final true).
        // The version string matches the version in the CDN URL.
        wp_register_script('jquery', $protocol . '//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js', false, '1.9.0', true);
        wp_enqueue_script('jquery');
    }
}
add_action('template_redirect', 'optimize_jquery');
Save the file and you are done! Now recheck the source of any page and you won't see those scripts in the head section any more. Instead, you will see the Google-hosted jQuery script at the end of the page.
That’s all! Now the visible section of your page will be rendered smoothly.
Defer Loading JavaScript
Another suggestion from Google's PageSpeed tool is to defer JavaScript. This issue comes up when you use inline JavaScript, such as the scripts for the Facebook Like box or button, the Google+ button, the Twitter button, and so on. If you defer the JavaScript, those scripts are only triggered after the entire document has loaded.
How to defer JavaScript in WordPress
1. Create a JavaScript file and name it defer.js.
2. Place the JavaScript code that you want to defer into the defer.js file. For instance, if you want to defer the Facebook Like box script, paste the following into that file.
(function(d, s, id) {
    var js, fjs = d.getElementsByTagName(s)[0];
    if (d.getElementById(id)) return;
    js = d.createElement(s);
    js.id = id;
    js.src = "//connect.facebook.net/en_GB/all.js#xfbml=1&appId=326473900710878";
    fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));
3. Save the file and upload it to your theme folder.
4. Now, copy the following code and paste it into the head section of the page. In WordPress, open your theme's header.php file and paste the code before the closing head tag.
Make sure to use the correct path to defer.js. For example, the path should look like this:
/wp-content/themes/theme_name/defer.js
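The snippet that step 4 refers to didn't survive in the post above. A minimal sketch of the technique described, assuming the theme path shown above and my own function name, would be a small script placed before the closing head tag inside a script element:

```javascript
// Load defer.js only after the window "load" event fires, so the file
// never blocks the first render of the page. The theme path below is an
// example and must match the real location of your defer.js file.
function downloadJSAtOnload() {
  var element = document.createElement('script');
  element.src = '/wp-content/themes/theme_name/defer.js';
  document.body.appendChild(element);
}

// Register for the load event; the attachEvent branch covers legacy IE.
if (typeof window !== 'undefined') {
  if (window.addEventListener) {
    window.addEventListener('load', downloadJSAtOnload, false);
  } else if (window.attachEvent) {
    window.attachEvent('onload', downloadJSAtOnload);
  } else {
    window.onload = downloadJSAtOnload;
  }
}
```

Because the script element is only created and appended after everything else has loaded, nothing in defer.js can delay the visible render.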
I hope that helps,
Tom
-
Happy I could help!
-
Thomas,
Can this be implemented on a WordPress site?
Apologies for hijacking!
-
What a great response! Just what I was looking for. Thank you!
-
Lazy loading images is not as good as deferring them. Lazy loading can introduce JavaScript issues that simply do not occur if you defer the images instead.
If you defer images you will have an easier time, and the method discussed here does not hurt search engine optimization. In fact, it helps it: increased load speed, or what people perceive as increased load speed, always helps the end user.
Here is the best way
https://www.feedthebot.com/pagespeed/defer-images.html
This is where we defer the images without lazy loading.
In the scenario of a one-page template, there is no reason to do all the things that lazy loading does (observe, monitor, and react to scroll position).
Why not just defer those images and have them load immediately after the page has loaded?
How to do it
To do this we need to mark up our images and add a small, extremely simple piece of JavaScript. I will show the method I actually use for this site and others. It uses a base64 image, but do not let that scare you.
The HTML
The JavaScript
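The actual code samples appear to have dropped out of the post. A minimal sketch of the technique described, assuming a data-src attribute and a 1x1 transparent base64 GIF as the placeholder (both naming choices are mine, not necessarily the original author's), looks like this:

```javascript
// Each deferred image is marked up with a tiny placeholder src and the
// real URL in a data-src attribute, e.g.:
//
//   <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
//        data-src="/images/real-image.jpg">
//
// Swap every data-src into src for the given collection of image elements.
function swapDeferredImages(imgs) {
  for (var i = 0; i < imgs.length; i++) {
    var real = imgs[i].getAttribute('data-src');
    if (real) {
      imgs[i].setAttribute('src', real);
    }
  }
}

// Browser wiring: run once the whole page, including critical assets,
// has finished loading, so the deferred images never compete with it.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    swapDeferredImages(document.getElementsByTagName('img'));
  });
}
```

The browser renders the tiny placeholder immediately, and the real images are requested only after the load event, which is exactly the "load immediately after the page has loaded" behavior described above.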
-
I have looked for information on this in the past and come up empty-handed. With PageSpeed, Google really pits you against SEO best practices; I think if you follow most of the PageSpeed Insights recommendations you can severely limit your SEO. How many images are you talking about, and how does Google render the page in Fetch as Google?