GoogleBot Mobile & Depagination
-
I am building a new site for a client and we're discussing their inventory section. What I would like to accomplish is to have all of their products load on scroll (or on swipe on mobile). I have seen suggestions to load all of the content in the background at once and reveal it as the user swipes, lazy-loading the product images. This will work fine for the user, but how will GoogleBot Mobile crawl the page?
Will it simulate swiping? Will it load every product at once, killing page load times because of all of the images it must fetch? What are considered SEO best practices when loading inventory this way?
I worry about this because it's possible for 2,000+ results to be returned, and I don't want GoogleBot to try to load all of those results (and their product thumbnail images) at once. I know the usual advice is to break the products up into categories, etc., but I want the "swipe for more" experience. 99.9% of our users will click a category or filter the results, but if someone wants to swipe through all 2,000 items on the main inventory landing page, they can. I would rather offer that option than "Page 1 of 350".
I like option #4 in this question, but I'm not sure how Google will handle it.
I asked Matt Cutts to answer this; you can upvote the question here if you'd like:
https://www.google.com/moderator/#11/e=adbf4&u=CAIQwYCMnI6opfkj
-
What you ideally want to do is set up the mobile site as a standard, paginated site, then use JavaScript to call each page dynamically, in an order driven by the user's actions.
This has two benefits:
1. SEO and SERPs. The pages will be indexed as they should be. If everything lives on one huge page, you are still limited to the same two or three keywords as always. When you see a good infinite-scroll website, it is not one page; it only looks that way because JavaScript calls additional pages at trigger points that have been set.
2. Graceful fallback when JavaScript is unavailable (or "fallforward," since that is actually the native state). If everything is one page that lazy-loads with JavaScript and the visitor's browser does not support it, they get 2,000 pages' worth of images loading at once, which is otherwise known as a bounce.
You will want to build out the site with no consideration for the infinite scrolling (except in design, e.g. tileable backgrounds for a smooth, non-stop flow), then apply the script after you have a logical site structure using siloed categories. Googlebot, Googlebot Mobile, and users who do not have JavaScript will all have a usable site, and the SERPs will rank pages as they should.
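To make that progressive-enhancement approach concrete, here is a rough browser sketch. It assumes the server already renders real, crawlable pages at URLs like /inventory/page/2, each with a plain "Next" link; the URL scheme, element IDs, and sentinel element are illustrative assumptions, not something from the original answer:

```javascript
// Progressive-enhancement sketch (illustrative names throughout).
// The server renders real, crawlable pages -- /inventory/page/1,
// /inventory/page/2, ... -- each with a plain "Next" link. This
// script runs only in browsers, swapping that link for an infinite
// scroll, so crawlers and no-JS visitors keep the paginated site.

// Pure helper: URL of page n under the assumed scheme.
function pageUrl(n) {
  return '/inventory/page/' + n;
}

if (typeof window !== 'undefined' && 'IntersectionObserver' in window) {
  let nextPage = 2;
  const container = document.querySelector('#product-list');
  const sentinel = document.querySelector('#load-more-sentinel');

  const observer = new IntersectionObserver(async (entries) => {
    if (!entries[0].isIntersecting) return;
    const resp = await fetch(pageUrl(nextPage));
    if (!resp.ok) { observer.disconnect(); return; } // no more pages
    const doc = new DOMParser()
      .parseFromString(await resp.text(), 'text/html');
    // Append only the content portion of the fetched page, never
    // its page-wide header/footer chrome.
    container.insertAdjacentHTML(
      'beforeend',
      doc.querySelector('#product-list').innerHTML
    );
    nextPage += 1;
  });
  observer.observe(sentinel);
}
```

Without JavaScript (or for a crawler), nothing changes: the paginated URLs remain the pages that get indexed.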
Tip: keep any page-wide bars or graphic styles in the header or footer of the page. You will normally call only the content or article portion of each page into the infinite scroll, so the site keeps a non-stop flow.
Hope this helps.
I know you're not using WordPress, but I am assuming you are using some sort of templated PHP script for a 2,000-product store. This WP plugin is pretty easy to understand and is what I first used to grasp the concept: http://www.infinite-scroll.com/ Also, if you want a more Pinterest-like layout, look into the Masonry JavaScript library.
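If you do go the Masonry route, a minimal sketch might look like the following. The class names and tile width are assumptions for illustration, and the pure column-count helper is my own addition, not part of the Masonry library:

```javascript
// Pinterest-style layout sketch using the Masonry library.
// Class names and the 240px tile width are assumptions.

// Pure helper: how many tile columns fit at a given container width.
function columnCount(containerWidth, columnWidth) {
  return Math.max(1, Math.floor(containerWidth / columnWidth));
}

// Browser-only wiring; Masonry is loaded via a <script> tag.
if (typeof window !== 'undefined' && typeof Masonry !== 'undefined') {
  const grid = document.querySelector('.grid');
  const msnry = new Masonry(grid, {
    itemSelector: '.grid-item',
    columnWidth: 240, // assumed tile width in px
    gutter: 10
  });

  // After the infinite scroll inserts a new page's product tiles,
  // tell Masonry to position only the added elements.
  function layoutAppended(newElems) {
    msnry.appended(newElems);
  }
}
```

Calling `appended()` on just the new tiles avoids re-laying-out the whole grid every time another page of products arrives.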