Best practices for lazy loading (content)
-
Hi all,
We are working on a new website and we want to know Google's best practices for lazy loading of content.
My best example is bloomberg.com; look at their homepage. Thank y'all!
-
Hi John! In order to get you an answer that directly relates to what you're trying to do, would you be able to give us more information about your goals with this? As in, what sorts of pages, specifically, are you intending to implement lazy loading on? And as Sergey asked, is it for the site to load faster for users? For overall user experience?
-
Hey Sergey,
I'm looking for a solution for lazy loading, not for pagination in lazy loading... thanks anyway.
-
Hi John,
First of all - the Google Webmaster Blog has written about infinite content (although not specifically lazy loading) here. Might be worth checking out.
Second, my question to you would be what is your goal with implementing lazy loading on your site? Is it for the site to load faster for users? For overall user experience?
Here is a thread on Reddit talking about this situation; I think /Twoary explains it well. Here's a quote:
"As far as I have experimented with it, it seems like they can indeed not find scroll-based lazy loading (in webmaster tools). Another possibility is onload lazyloading (first load all the content above the fold, then load the content below the fold after the onload event has fired), I have to experiment more with that.
Right now I avoid lazy loading for SEO for articles and such. The fact is that google only cares about "time to first byte". Maybe soonish they will care about "time until above the fold page is loaded". But they do not penalize for the time it takes for all of the resources to be loaded. Apart from that, google mostly cares about user experience which they measure by actual dwell time of their users.
As for the user experience, lazy loading images doesn't add that much benefit either. The browser downloads images near the top of your page first, so the above the fold content isn't downloaded any faster with lazy load. (Possibly even slower because the browser won't be able to start prefetching lazy loaded images until javascript executes.)
The only benefit I see right now is for reducing bandwidth usage (for your site and for mobile users). However the disadvantage will be that your images probably won't rank as well (even if you use pagination/a sitemap.)
OTOH, lazy loading other heavy content such as videos, iframes and ads may be much more beneficial because those actively make the page more sluggish."
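To make the two approaches from the quote concrete, here is a minimal sketch: native browser lazy loading for below-the-fold images, and an IntersectionObserver that defers heavy embeds (iframes, videos) until they near the viewport. The `data-src` attribute and `lazy-frame` class are illustrative conventions, not a standard, and the URLs are placeholders:

```html
<!-- Native lazy loading: the browser defers offscreen images on its own.
     Above-the-fold images should omit the attribute so they load immediately. -->
<img src="/img/hero.jpg" alt="Hero" width="1200" height="600">
<img src="/img/footer-banner.jpg" alt="Banner" loading="lazy" width="1200" height="300">

<!-- Heavy embeds get a placeholder; the real URL sits in data-src
     (an illustrative convention) until the element nears the viewport. -->
<iframe class="lazy-frame" data-src="https://www.youtube.com/embed/VIDEO_ID"
        title="Video" width="560" height="315"></iframe>

<script>
  // IntersectionObserver fires as the placeholder approaches the viewport;
  // rootMargin starts the load a little before it actually scrolls into view.
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        const el = entry.target;
        el.src = el.dataset.src; // trigger the real network request
        obs.unobserve(el);       // load each embed only once
      }
    }
  }, { rootMargin: "200px" });

  document.querySelectorAll(".lazy-frame").forEach((el) => observer.observe(el));
</script>
```

Both `loading="lazy"` and `IntersectionObserver` are standard browser APIs, but whether Google indexes scroll-loaded content still depends on it being reachable when Googlebot renders the page, which is exactly the concern raised in the quote above.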
-
Are you using a CMS? There are some great plugins for various different platforms.
-
Yes, I'm using WordPress.