Does Lazy Loading Create Indexing Issues for Products?
-
I have a store with 5,000+ products in one category and I'm using lazy loading. Does this affect the indexing of these 5,000 products? Google says they index or read a maximum of 1,000 links on one page.
-
Hello Vinay,
Please see Mashable for an example:
http://mashable.com/2013/4/
They have pagination links at the bottom of the page and use lazy loading / infinite scroll.
Adam Sherk has a good post about this:
http://www.adamsherk.com/seo/seo-tips-for-infinite-scrolling/
-
Everett, I got your point: you mean AJAX for users and pagination for spiders. Can you show me an example? That would help me a lot.
Thanks,
Vinay
-
View the source of the cached page and look toward the bottom. Do all of your products/listings show in the source? If so, you're all good. If not, you may want to add pagination for the spiders, as mentioned above.
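To make that source check repeatable, here's a minimal Python sketch of the idea: count the product links present in the raw HTML, which is roughly what a non-rendering crawler sees, and compare against the number of products you expect in the category. The `/product/` URL pattern is a hypothetical placeholder; substitute your own store's link structure.

```python
import re

def count_product_links(html: str, pattern: str = r'href="[^"]*/product/[^"]*"') -> int:
    """Count product links present in raw page source (no JavaScript executed)."""
    return len(re.findall(pattern, html))

# If the count comes back far below the number of products in the
# category, the missing ones are only being added client-side by
# lazy loading and may not be seen by crawlers that don't render JS.
```
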
-
Thanks, Everett Sizemore.
I just checked the cache as you suggested, and I found lazy loading is also working on the cached page. Does that mean everything is OK?
-
Great suggestion, Everett: "If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results."
-
Hello,
Where did you read that Google only indexes or reads a maximum of 1,000 links on a page? I think this is outdated information. However, it is best practice not to have that many links on a page, even if Google does crawl more than 100, or even 1,000, links per page.
So to answer your question: yes, if you're loading additional product listings via JavaScript after the user scrolls down, it could cause Google to render only part of the page. However, this often does not cause "indexation issues" for your product pages, because they have other paths into them: sub-categories, related product links, external links, sitemaps, etc.
If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results. That should answer your question directly from Google.
I usually recommend adding pagination links to the page so that Google, and users without JavaScript enabled, have a path to the rest of the product listings. If you like, you can set those paginated category pages to noindex,follow so they do not get indexed, but Google can still crawl them to find deeper products.
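Here's a minimal sketch of that setup (the `?page=` URL structure is an assumption; adapt it to your platform): plain anchor links give crawlers and non-JavaScript users a path through every page, and pages beyond the first carry a noindex,follow robots meta tag so they can be crawled without being indexed.

```python
def pagination_tags(base_url: str, page: int, total_pages: int) -> list[str]:
    """Build crawlable pagination markup for one paginated category page."""
    tags = []
    if page > 1:
        # Paginated pages stay out of the index but remain crawlable.
        tags.append('<meta name="robots" content="noindex,follow">')
    # Plain <a> links work without JavaScript, so spiders can follow them.
    tags.extend(f'<a href="{base_url}?page={n}">{n}</a>'
                for n in range(1, total_pages + 1) if n != page)
    return tags
```

The first category page gets no robots meta tag at all, so it stays indexable; only pages 2 and beyond are excluded.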
-
I have never heard of any negative effects on SEO from using lazy load. Our shop has nearly 100,000 products and we use lazy load as well. We recently raised the number of indexed pages from 5.5 million to 6.7 million. This is just one example, but to answer your question from my personal point of view: the answer would be no.