Does Lazy Loading Create Indexing Issues for Products?
-
I have a store with 5,000+ products in one category and I'm using lazy loading. Does this affect the indexing of these 5,000 products? Google says they index or read a maximum of 1,000 links on one page.
-
Hello Vinay,
Please see Mashable for an example:
http://mashable.com/2013/4/
They have pagination links at the bottom of the page and use lazy loading / infinite scroll.
Adam Sherk has a good post about this:
http://www.adamsherk.com/seo/seo-tips-for-infinite-scrolling/
-
Everett, I got your point: you mean AJAX for users and pagination for spiders. Can you show me one example? That would help me a lot.
Thanks
vinay
-
View the source of the cached page and look toward the bottom. Do all of your products/listings show in the source? If so, you're all good. If not, you may want to add pagination for the spiders, as mentioned above.
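The source check described above can be approximated with a quick script: if product links appear in the raw HTML, a non-JavaScript crawler can see them. A minimal sketch in Python, where the URL pattern and the sample markup are hypothetical placeholders (adjust the regex to your own product-link structure):

```python
import re

def visible_product_links(html, pattern=r'href="(/products/[^"]+)"'):
    """Return product URLs present in the raw HTML source -- i.e. what a
    crawler without JavaScript (or the cached/source view) would see.
    Lazy-loaded items injected later by JS will NOT appear here."""
    return re.findall(pattern, html)

# Hypothetical category-page source: only the first "page" of lazy-loaded
# products ships in the initial HTML; the rest arrives via JS on scroll.
sample = """
<ul id="product-grid">
  <li><a href="/products/widget-1">Widget 1</a></li>
  <li><a href="/products/widget-2">Widget 2</a></li>
</ul>
<script src="/js/lazy-load.js"></script>
"""

print(visible_product_links(sample))
```

If the count printed here is far below the number of products the category actually holds, the lazy-loaded remainder is invisible to a plain crawl and pagination links are worth adding.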
-
Thanks, Everett Sizemore.
I just checked the cache as you suggested, and I found that lazy loading is also working on the cached page. Does that mean everything is OK?
-
Great suggestion, Everett: "If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results."
-
Hello,
Where did you read that Google only indexes or reads a maximum of 1,000 links on a page? I think that is outdated information. Even so, it is best practice not to put that many links on one page, even if Google will crawl more than 1,000.
So to answer your question: yes, if you're loading additional product listings via JavaScript after the user scrolls down, Google may render only part of the page. However, this often does not cause indexation issues for your product pages, because they have other paths into them: sub-categories, related-product links, external links, sitemaps, etc.
If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results. That should answer your question directly from Google.
I usually recommend adding pagination links to the page so that Google, and users without JavaScript enabled, have a path to the remaining product listings. If you like, you can set those paginated category pages to noindex,follow so they don't get indexed, but Google can still crawl them to find deeper products.
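The pagination-plus-noindex,follow idea above can be sketched as a small tag generator. This is only an illustration, assuming a simple `?page=` query parameter on a hypothetical category URL; the exact markup and URL scheme will depend on your platform:

```python
def category_page_tags(page, base="/category/widgets"):
    """Build plain-HTML pagination links that crawlers can follow without
    JavaScript, adding a noindex,follow robots meta tag to pages beyond
    the first so they are crawled (links followed) but not indexed."""
    tags = []
    if page > 1:
        # Deeper pages: crawlable but kept out of the index.
        tags.append('<meta name="robots" content="noindex,follow">')
        tags.append('<a href="%s?page=%d">Previous</a>' % (base, page - 1))
    tags.append('<a href="%s?page=%d">Next</a>' % (base, page + 1))
    return tags

print(category_page_tags(2))
```

Page 1 gets only a "Next" link and stays indexable; every deeper page carries the noindex,follow meta tag plus Previous/Next links, giving spiders a path through all 5,000 products even with lazy loading in place for users.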
-
I have never heard of lazy loading having a negative effect on SEO. Our shop has nearly 100,000 products and we use lazy loading as well, and we recently raised our number of indexed pages from 5.5 million to 6.7 million. That's just one example, but to answer your question from my personal point of view: no.