Loading images below the fold? Impact on SEO
-
I got this from my developers. Does anyone know if this will be an SEO issue?
"We hope to lazy-load images below the fold where possible, to increase render speed. Are you aware of any potential issues with this approach from an SEO point of view?"
-
Happy to help!
-
Thanks Tom!
As always, an amazing response.
Best
-
Hi Chris, sorry for the late reply. Absolutely, you can do this by using a plugin, Cloudflare, or PHP code:
- https://wordpress.org/plugins/wp-deferred-javascripts/
- https://wordpress.org/plugins/defer-css-addon-for-bwp-minify/
Another plugin that does the same thing, but provides an administration area so you can configure it manually, is Autoptimize, which lets you define specific CSS code independently of your theme's CSS stylesheet:
- http://www.oxhow.com/optimize-defer-javascript-wordpress/
- https://seo-hacker.com/optimizing-site-speed-asynchronous-deferred-javascript/
- http://www.laplacef.com/how-to-defer-parsing-javascript-in-wordpress/
The solution to this problem is removing those render-blocking scripts. But if you remove them entirely, some plugins may not work properly. So the best approach for smooth rendering is:
1. Remove them from your website's source page.
2. Use a single script, hosted by Google, as the alternative.
3. Push the new script down to the end of the page (before the closing </body> tag).
Here is how to do it.
Copy the following code and paste it into your theme's functions.php file.
function optimize_jquery() {
    // Only swap scripts on the front end, never in the admin area.
    if (!is_admin()) {
        // Core registers these handles as 'jquery', 'jquery-migrate' and 'comment-reply'.
        wp_deregister_script('jquery');
        wp_deregister_script('jquery-migrate');
        wp_deregister_script('comment-reply');
        // Match the protocol of the current request (guarding against an unset HTTPS key).
        $protocol = (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] === 'on') ? 'https:' : 'http:';
        // Register the Google-hosted copy; the final 'true' loads it in the footer
        // instead of the head, so it no longer blocks rendering.
        wp_register_script('jquery', $protocol . '//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js', array(), '1.9.0', true);
        wp_enqueue_script('jquery');
    }
}
add_action('template_redirect', 'optimize_jquery');
Save the file and you are done! Now recheck the source of any page and you won't see those scripts in the head section. Instead, you will see the Google-hosted jQuery source at the end of the page.
That’s all! Now the visible section of your page will be rendered smoothly.
Defer Loading JavaScript
Another suggestion from Google's PageSpeed tool is to "defer JavaScript". This warning appears when you use inline JavaScript, such as the scripts for the Facebook Like box or button, the Google+ button, the Twitter button, and so on. If you defer the JavaScript, those scripts are triggered only after the entire document has loaded.
How to defer JavaScript in WordPress
1. Create a JavaScript file and name it defer.js.
2. Place the JavaScript code that you want to defer into the defer.js file. For instance, if you want to defer the Facebook Like box script, paste the following into that file.
// Standard asynchronous loader for the Facebook JS SDK.
(function(d, s, id) {
    var js, fjs = d.getElementsByTagName(s)[0];
    if (d.getElementById(id)) return;      // SDK already loaded, do nothing
    js = d.createElement(s);
    js.id = id;
    js.src = "//connect.facebook.net/en_GB/all.js#xfbml=1&appId=326473900710878";
    fjs.parentNode.insertBefore(js, fjs);  // insert before the first script tag
}(document, 'script', 'facebook-jssdk'));
3. Save the file and upload it to your theme folder.
4. Now, copy the following code and paste it into the head section of the page. In WordPress, open your theme's header.php file and paste the code before the closing </head> tag.
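A minimal sketch of that code, using the classic "load after onload" pattern from the tutorials linked above (the theme path here is only an example; point it at wherever defer.js actually lives):

<script type="text/javascript">
// Download defer.js only once the whole page (including images) has
// finished loading, so the deferred scripts never block the first render.
function downloadJSAtOnload() {
    var element = document.createElement("script");
    element.src = "/wp-content/themes/theme_name/defer.js";
    document.body.appendChild(element);
}
// Cross-browser load-event hookup (attachEvent covers old IE).
if (window.addEventListener)
    window.addEventListener("load", downloadJSAtOnload, false);
else if (window.attachEvent)
    window.attachEvent("onload", downloadJSAtOnload);
else
    window.onload = downloadJSAtOnload;
</script>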
Make sure to use the correct path to defer.js. For example, the source path should look like this:
/wp-content/themes/theme_name/defer.js
I hope that helps,
Tom
-
Happy I could help!
-
Thomas,
Can this be implemented on a WordPress site?
Apologies for hijacking!
-
What a great response! Just what I was looking for. Thank you!
-
Lazy loading images is not as good as deferring them, because lazy loading can cause JavaScript issues that you won't run into if you defer the images instead.
Defer images and you will have an easier time. The method discussed here does not hurt search engine optimization; in fact, it helps, because an increased load speed (or what people perceive as an increased load speed) always helps the end user.
Here is the best way:
https://www.feedthebot.com/pagespeed/defer-images.html
This is where we defer the images without lazy loading.
In the scenario of a one-page template, there is no reason to do all the things that lazy loading does (observe, monitor, and react to a scroll position).
Why not just defer those images and have them load immediately after the page has loaded?
How to do it
To do this we need to mark up our images and add a small and extremely simple piece of JavaScript. I will show the method I actually use for this site and others. It uses a base64 image, but don't let that scare you.
The html
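A sketch of the markup, following the linked article's pattern (the data-src filename is a placeholder — use your real image URL):

<!-- src holds a 1x1 transparent GIF encoded as base64, so nothing real is
     downloaded up front; the actual image URL waits in data-src -->
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
     data-src="/images/my-photo.jpg" alt="My photo">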
The javascript
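And a sketch of the script, per the linked article's approach: once the window has loaded, it copies each data-src into src, so the real images only start downloading after the page has rendered.

// After the page has fully loaded, promote each data-src to src so the
// real images download without having delayed the first render.
function init() {
    var imgDefer = document.getElementsByTagName('img');
    for (var i = 0; i < imgDefer.length; i++) {
        if (imgDefer[i].getAttribute('data-src')) {
            imgDefer[i].setAttribute('src', imgDefer[i].getAttribute('data-src'));
        }
    }
}
window.onload = init;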
-
I have looked for information on this in the past and come up empty-handed. With page speed, Google really pits you against best SEO practices; I think if you follow most of the PageSpeed Insights suggestions, you can severely limit your SEO. How many images are you talking about, and how does Google render the page in its Fetch as Google?