Lazy Loading of Blog Posts and Crawl Depths
-
Hi Moz Fans,
We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a wall in terms of the implications of lazy loading and issues with crawl depth.
We introduced lazy loading onto the blog home page to increase site speed initially and it works well with infinite scroll, but we were wondering whether this would cause any issues regarding SEO.
A lot of the resources online seem to conflict and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depths for blogs would be fantastic!
I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
-
This is fantastic - Thank you!
-
Lazy load and infinite scroll are absolutely not the same thing, as far as search crawlers are concerned.
Lazy-loaded content, if it exists in the DOM of the page, will be indexed, but its importance will likely be reduced (any content that requires user interaction to see carries less ranking value).
But because infinite scroll is unmanageable for the crawler (it's not going to stay on one page and keep crawling for hours as every blog post rolls into view), Google's John Mueller has said the crawler will simply stop at the bottom of the initial page load.
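If you want to keep the scroll experience for users, the usual workaround is to layer the infinite scroll on top of ordinary paginated URLs (e.g. /blog/page/2/) that are linked with plain anchor tags, so the crawler can reach every post without scrolling at all. Below is a minimal browser-side sketch of that idea; the element IDs and URL pattern are placeholders, not anything from your actual setup.

```typescript
// Sketch: infinite scroll layered on top of crawlable paginated URLs.
// Assumes the server already renders /blog/page/N/ as plain HTML pages
// linked with ordinary <a> tags, so the crawler never needs to scroll.
// Element IDs and the URL pattern are illustrative placeholders.

let nextPage = 2;

const sentinel = document.querySelector<HTMLElement>("#load-more-sentinel");
const postList = document.querySelector<HTMLElement>("#post-list");

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting || !sentinel || !postList) return;

  const url = `/blog/page/${nextPage}/`;
  const response = await fetch(url);
  if (!response.ok) {
    observer.disconnect(); // no more pages to load
    return;
  }
  const html = await response.text();

  // Pull the post markup out of the already-crawlable paginated page
  // and append it to the current list for scrolling users.
  const doc = new DOMParser().parseFromString(html, "text/html");
  const newPosts = doc.querySelector("#post-list");
  if (newPosts) {
    postList.insertAdjacentHTML("beforeend", newPosts.innerHTML);
    // Keep the address bar in sync so each scroll position maps to a
    // real, crawlable URL.
    history.pushState({ page: nextPage }, "", url);
    nextPage += 1;
  }
});

if (sentinel) observer.observe(sentinel);
```

That way each chunk of posts has its own crawlable URL, and the pushState call keeps what users see consistent with what the crawler fetches.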
This webinar/discussion on crawl and rendering from just last week included Google's John Mueller and a Google engineer, and it will give you exactly the info you're looking for, straight from the horse's mouth, Victoria.
One thing to consider, though: the blog's index page shouldn't be the primary source for the blog's content anyway. The individual permalinked post URLs are what should be crawled and ranking for the individual post content, and the XML sitemap should be the primary source for Google's discovery of those URLs. Linking from authoritative pages will obviously help the posts, but that's going to change every time the blog index page updates anyway.
Also, did you know that you can submit the blog's RSS feed as a sitemap in addition to the XML sitemap? It's the fastest way I've found of getting new blog posts crawled and indexed.
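If it helps to see the sitemap side concretely, here's a minimal Node/TypeScript sketch that builds an XML sitemap from the permalinked post URLs. The domain, post list, and output path are placeholders you'd replace with data from your CMS, and the RSS feed would still be submitted separately in Search Console as an additional sitemap.

```typescript
// Sketch: build a minimal XML sitemap from the blog's permalinked post URLs.
// The post list and output path are placeholders; in practice you'd pull
// them from your CMS or blog platform.
import { writeFileSync } from "node:fs";

interface BlogPost {
  url: string;          // permalinked post URL
  lastModified: string; // ISO 8601 date of the last update
}

const posts: BlogPost[] = [
  { url: "https://www.example.com/blog/lazy-loading-and-seo/", lastModified: "2019-05-01" },
  { url: "https://www.example.com/blog/crawl-depth-basics/", lastModified: "2019-05-08" },
];

const entries = posts
  .map((post) =>
    [
      "  <url>",
      `    <loc>${post.url}</loc>`,
      `    <lastmod>${post.lastModified}</lastmod>`,
      "  </url>",
    ].join("\n")
  )
  .join("\n");

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;

writeFileSync("sitemap.xml", sitemap);
```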
Hope that helps!
Paul
-
I'm afraid I don't have any insight into how Google crawls lazy-loaded pages.
Which works better for your users, pagination or lazy loading? I wouldn't worry about lazy loading and Google. If you're worried about getting pages indexed, then I would make sure you've got a sitemap that works correctly.
-
Great, thank you
Do you have any insight into crawl depth too?
At what point would Google stop crawling the page with lazy loading? Is it best to use pagination as opposed to infinite scroll?
-
With lazy loading, the markup can actually still be seen in the page's source code. That's what Google uses, so you should be fine using it, as it's becoming common practice now.
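A quick way to sanity-check that for your own blog is to fetch the raw HTML, with no JavaScript executed, and look for text from posts further down the page. A small sketch is below; the URL and marker phrases are placeholders, and it assumes Node 18+ (or any runtime with a global fetch).

```typescript
// Sketch: check whether lazy-loaded posts are present in the raw HTML,
// i.e. before any JavaScript runs. The URL and marker strings are
// placeholders; swap in your blog index URL and a phrase from a post
// that only appears further down the page. Assumes Node 18+ for fetch.
const BLOG_INDEX = "https://www.example.com/blog/";
const MARKERS = [
  "A post title from below the fold",
  "Another post title that loads lazily",
];

async function checkRawHtml(): Promise<void> {
  const response = await fetch(BLOG_INDEX);
  const html = await response.text();

  for (const marker of MARKERS) {
    const found = html.includes(marker);
    console.log(`${found ? "FOUND" : "MISSING"} in raw HTML: "${marker}"`);
  }
  // Anything reported MISSING only exists after client-side rendering,
  // so indexing it depends on Google rendering the page rather than on
  // the initial HTML crawl.
}

checkRawHtml().catch(console.error);
```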
-
Yes, it's similar to the BBC site: content loads when the user needs it, so to speak.
It improved the site's loading speed, but do you know at what point Google would stop indexing the content on our site?
How do we ensure that the posts are being crawled, and is pagination the best way to go?
-
I'd have to say I'm not too familiar with the method you are using, but I take it the idea is that elements of the page load as you scroll, like the BBC?
If it decreases the load time of the site, that's good for both direct and indirect SEO. But the key thing is: can Google see the contents of the page or not? Use Google Search Console and fetch the page to see if it contains the content.
Also, Google will not hang around on your site; if it doesn't serve the content within a reasonable amount of time, it will bounce off to the next page or the next site to crawl. It's harsh, but it's a fact.