Best practices for lazy loading (content)
-
Hi all,
We are working on a new website and we want to know Google's best practices for lazy loading content.
The best example I've found is bloomberg.com; take a look at their homepage. Thanks, y'all!
-
Hi John! In order to give you an answer that directly relates to what you're trying to do, could you share more about your goals here? Specifically, what sorts of pages are you intending to implement lazy loading on? And, as Sergey asked, is it so the site loads faster for users? For overall user experience?
-
Hey Sergey,
I'm looking for a solution for lazy loading of content, not for pagination within lazy loading. Thanks anyway!
-
Hi John,
First of all - the Google Webmaster Blog has written about infinite content (although not specifically lazy loading) here. Might be worth checking out.
Second, my question to you would be what is your goal with implementing lazy loading on your site? Is it for the site to load faster for users? For overall user experience?
Here is a thread on Reddit talking about this situation; I think /Twoary explains it well. Here's a quote:
"As far as I have experimented with it, it seems like they indeed cannot find scroll-based lazy loading (in Webmaster Tools). Another possibility is onload lazy loading (first load all the content above the fold, then load the content below the fold after the onload event has fired); I have to experiment more with that.
Right now I avoid lazy loading for SEO for articles and such. The fact is that Google only cares about "time to first byte". Maybe soonish they will care about "time until the above-the-fold page is loaded". But they do not penalize for the time it takes for all of the resources to be loaded. Apart from that, Google mostly cares about user experience, which they measure by the actual dwell time of their users.
As for the user experience, lazy loading images doesn't add that much benefit either. The browser downloads images near the top of your page first, so the above-the-fold content isn't downloaded any faster with lazy load. (Possibly even slower, because the browser won't be able to start prefetching lazy-loaded images until the JavaScript executes.)
The only benefit I see right now is reduced bandwidth usage (for your site and for mobile users). However, the disadvantage is that your images probably won't rank as well (even if you use pagination/a sitemap).
OTOH, lazy loading other heavy content such as videos, iframes, and ads may be much more beneficial, because those actively make the page more sluggish."
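The scroll-based approach the quote describes comes down to one decision: which below-the-fold assets are close enough to the viewport to be worth loading now. Here's a minimal sketch of that decision logic, kept free of DOM calls so the idea stands on its own (the function name, the 200px margin, and the sample image data are all illustrative, not from any particular plugin or library):

```javascript
// Decide which images should be loaded, given the current viewport bottom
// (in px from the top of the page) and each image's vertical offset.
// Images within `margin` px below the viewport are loaded early, so they
// are already available by the time the user scrolls to them.
function imagesToLoad(images, viewportBottom, margin = 200) {
  return images.filter(img => !img.loaded && img.offsetTop < viewportBottom + margin);
}

// A hypothetical page with three images at increasing offsets.
const images = [
  { src: "hero.jpg", offsetTop: 0,    loaded: true  }, // above the fold, already loaded
  { src: "mid.jpg",  offsetTop: 900,  loaded: false },
  { src: "foot.jpg", offsetTop: 3000, loaded: false },
];

// With a 768px-tall viewport scrolled to the top, only mid.jpg is close enough.
console.log(imagesToLoad(images, 768).map(i => i.src)); // ["mid.jpg"]
```

In a real page this predicate would run inside a scroll handler (or, more efficiently, an IntersectionObserver callback), and "loading" an image would mean copying a `data-src` attribute into `src`. That last step is exactly what the quote warns about: Googlebot and the browser preloader only see the real `src` once the script has run.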
-
Yes, I'm using WordPress.
-
Are you using a CMS? There are some great lazy loading plugins for various platforms.