Lazy loading workarounds?
Hi, I was hoping for some concise answers on lazy loading and SEO.
From the user standpoint, lazy loading images in our galleries makes the most sense: it reduces load times without forcing us to paginate our content across multiple page loads (i.e. all our content stays on one page, but loads as needed).
example gallery: http://roadcyclinguk.com/news/gear-news/pro-bikes-franco-pellizottis-bianchi-sempre-pro.html
The flip side is that our images end up not getting fetched and indexed properly, and I was wondering what the options are for working around this.
It has been suggested that we add a <noscript> tag against the images, but I wanted to check that this will get read properly by Googlebot.
Thanks all!
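For reference, the pattern usually suggested looks something like the sketch below (the `lazy` class and `data-src` attribute are illustrative names, not from any specific library): the real image URL sits in a data attribute for the lazy-loading script, while a `<noscript>` fallback carries a plain `<img>` that crawlers and no-JS users can fetch directly.

```html
<!-- Lazy-loaded gallery image with a crawlable fallback.
     "lazy" and "data-src" are illustrative; use whatever names
     your lazy-loading script expects. -->
<img class="lazy" data-src="/images/gallery/bike-01.jpg"
     src="/images/placeholder.gif" alt="Bianchi Sempre Pro side view">
<noscript>
  <!-- Served to crawlers and to users without JavaScript -->
  <img src="/images/gallery/bike-01.jpg" alt="Bianchi Sempre Pro side view">
</noscript>
```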
Related Questions
Consolidating 301 Redirects to Decrease Page Load Times - Major Concerns?
Hello, I am being pushed to consolidate the over 6K redirects that have accumulated over the course of 4 years. These redirects are one of the many factors causing extensive load times for our website. Many, if not most, are over a year old, have not been used, or simply redirect back to the home page. Other than keeping the pages that have external links (I'm also looking for recommendations/tools there), are there other best practices from an SEO standpoint to ensure there are no major hits to our website? A little more info: I am looking to pare the 6K down by:
- removing all redirects that have not been used
- removing all redirects that are over 1 year old
- removing all redirects that point simply to the home page or a smaller big-bucket subfolder
This should take the number from 6K down to around 300. Are there any major concerns? Pat
Technical SEO | Owner_Account
How fast should a page load to get a green light in Google's PageSpeed?
So, I'm trying to get a big e-commerce site to work on their page-loading issues. Their question left me without an answer: how fast should a site be so that it will get a green light in Google's PageSpeed test? Is there a number in seconds? Do we know that?
Technical SEO | ziiiva123
If the order of products on a page changes each time the page is loaded, does this have a negative effect on the SEO of those pages?
Hello, a client of mine has a number of category pages that each have a list of products. Each time the page is reloaded the order of those products changes. Does this have a negative effect on the pages' rankings? Thank you
Technical SEO | Kerry_Jones
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind are:
- detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version
- make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site)
Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
Technical SEO | paul.younghusband
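One common alternative to user-agent sniffing (which risks being treated as cloaking) is progressive enhancement: keep real, crawlable pagination URLs in the markup, and let JavaScript intercept them to auto-load products on scroll. The sketch below illustrates the idea; the element IDs, URL structure, and 200px scroll threshold are all assumptions, not a drop-in implementation.

```html
<!-- Category page: real pagination stays in the HTML for crawlers;
     JavaScript progressively enhances it into infinite scroll. -->
<div id="product-list">
  <!-- server-rendered products for the current page -->
</div>
<nav id="pagination">
  <a href="/category/widgets?page=2" rel="next">Next page</a>
</nav>
<script>
  var nextLink = document.querySelector('#pagination a[rel="next"]');
  window.addEventListener('scroll', function () {
    if (!nextLink) return;
    var nearBottom = window.innerHeight + window.pageYOffset >=
                     document.body.offsetHeight - 200;
    if (!nearBottom) return;
    var url = nextLink.href;
    nextLink = null; // prevent duplicate requests while loading
    fetch(url)
      .then(function (r) { return r.text(); })
      .then(function (html) {
        // Parse the next page and append its products to the list.
        var doc = new DOMParser().parseFromString(html, 'text/html');
        document.getElementById('product-list').insertAdjacentHTML(
          'beforeend', doc.getElementById('product-list').innerHTML);
        nextLink = doc.querySelector('#pagination a[rel="next"]');
      });
  });
</script>
```

Because every product remains reachable through plain `<a href>` pagination links, spiders crawl the category exactly as before, while scrolling users get the auto-load behaviour.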
Late loading content via AJAX - impact to bots
Hi, In an attempt to reduce the latency of our site, we are planning on late-loading all content below the fold via AJAX. My concern is that Googlebot won't see this content and won't index it properly (our site is very large and we have lots of content). What is good for our users is not necessarily good for bots. Will late-loaded AJAX content be read by Googlebot? Thoughts on how to balance speed vs. search engine crawlability?
Technical SEO | NicB1
A website that will not load on a particular computer? Help Me Please!
We took on a new client about two weeks ago, took them off a proprietary CMS, placed them on a WordPress site, optimized the site, etc., and were finishing up small details three days ago. My PC in my personal office all of a sudden would not load the site from a Google search, from a direct URL, etc. Our office was using a D-Link wireless router, but my PC is hardwired in the office. I cranked up my MacBook Pro with solid-state drive (6 months old), got on wireless, and... the site would not load. PCs and Macs in offices around me would all load the site. A search online brought up a fix for the PC and I tried it - it did not work; I had my lead dev try it - it did not work; I called a server-side friend and he had never heard of such a thing. Every fix revolved around changing IP addresses, etc. I uninstalled my antivirus programs on my PC and installed every update that was outstanding; there was no new software installed on either box prior to the problem. Can you help??? Is there any chance someone not associated with us - just looking for my client, or entering a direct URL - could experience the same thing?
Technical SEO | RobertFisher
Javascript late-loaded content not read by Googlebot
Hi, We have a page with some good "keyword" content (a user-supplied comment widget), but a design choice was made previously to late-load it via JavaScript. This was to improve performance, and the overall functionality relies on JavaScript. Unfortunately, since it is loaded via JS, it isn't read by Googlebot, so we get no SEO value. I've read that Google doesn't weigh <noscript> content as much as regular content - is this true? One option is just to load some of the content via <noscript> tags; I just want to make sure Google still reads this content.
Another option is to load some of the content via simple HTML when loading the page. If JavaScript is enabled, we'd hide this "read-only" version via CSS and display the more dynamic, user-friendly version. Would changing the display based on JS being enabled be deemed cloaking? Since non-JS users would see the same thing (and this provides a way for them to see some of the functionality in the widget), it is an overall net gain for those users too.
In the end, I want Google to read the content, but I'm trying to figure out the best way to do so.
Thanks, Nic
Technical SEO | NicB1
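The second option described above (HTML-first content, hidden when the JS widget takes over) can be sketched roughly as below. The element IDs and the `initCommentsWidget` call are hypothetical placeholders; the key point is that every user agent receives the same HTML, which is why this is generally regarded as progressive enhancement rather than cloaking.

```html
<!-- Static, crawlable copy of the comment content, rendered server-side. -->
<div id="comments-static">
  <p>User comment text rendered by the server…</p>
</div>

<!-- Container that the JavaScript widget fills in after load. -->
<div id="comments-widget"></div>

<script>
  // If JavaScript runs, hide the static copy and let the dynamic
  // widget take over; no-JS users and crawlers keep the static HTML.
  document.getElementById('comments-static').style.display = 'none';
  // initCommentsWidget('#comments-widget'); // illustrative widget call
</script>
```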
Good technical parameters, worse load time.
I have recently created a page and added expires headers, non-configured ETags and gzip to the .htaccess code, and just after that, according to Pingdom Tools, my page load time doubled, although my YSlow score went from 78 to 92. I always get a little bit lost with these technical issues. I mean, obviously a site should not produce worse results after adding these parameters, and this increase in page load time should rather be down to bandwidth usage. I suppose I should leave this stuff in the .htaccess. So what is an accurate way to know whether you have made a real improvement to your site, or whether your load time has really gone up? This question is even more relevant with CSS sprites, as I always read that spriting every picture can be a waste of resources. How can you decide when to stop?
Technical SEO | sesertin
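For context, the three directives mentioned usually look something like this minimal .htaccess sketch (the specific MIME types and expiry windows are illustrative, and each block depends on the corresponding Apache module being enabled on the host):

```apacheconf
# Far-future expiry for static assets (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css  "access plus 1 week"
</IfModule>

# Remove ETags if they are not configured consistently
# across servers (Header directive requires mod_headers)
FileETag None
<IfModule mod_headers.c>
  Header unset ETag
</IfModule>

# Compress text responses (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

As for measurement: single test runs vary a lot, so compare medians over several runs (and ideally from the same test location) before concluding that a change really slowed the site down.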