What is your opinion on the use of jQuery for a continuous-scroll type of page layout?
-
So, I'm in two minds about this; let me start with a bit of background info.
Context
We have a new client who is in the final days of their new site design (they already were when they first contacted us). Their design essentially uses 5 pages, each containing several pages' worth of content, separated with jQuery. What this means is that a user can click a menu item from a drop-down in the nav and be taken directly to that section of content, like an internal anchor link, as if it were a separate page; or they can click the top-level nav item and scroll through each "sub-page" without having to click any other links. It's vaguely similar to Google's "How Search Works" page, if each section of that page had its own URL, only without the heavy design elements and slow load time.
In this process, scrolling down to each new "sub-page" changes the URL in the address bar, and each is treated as a new page for the purposes of referencing the page, adding page titles, meta descriptions, backlinks, etc. From my research, this also means search engines don't see the entire page; they see each sub-page as its own separate item, like a normal site.
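For anyone unfamiliar with the technique: here's a rough sketch of what I believe the site is doing (I haven't seen their actual code; the `.sub-page` class, the `data-title` attribute, and the URL scheme are all my own illustrative names). As each section scrolls into view, the address bar is updated with the History API so every "sub-page" gets its own URL:

```javascript
// Hypothetical URL scheme: each section's id becomes its path.
function urlForSection(slug) {
  return '/' + slug;
}

// Wire up only in a browser context.
if (typeof document !== 'undefined' && 'IntersectionObserver' in window) {
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        // replaceState (rather than pushState) avoids flooding the
        // back button with one history entry per scroll step.
        history.replaceState(null, '', urlForSection(entry.target.id));
        document.title =
          entry.target.getAttribute('data-title') || document.title;
      }
    });
  }, { threshold: 0.5 }); // fire when a section is half in view

  document.querySelectorAll('.sub-page').forEach(function (el) {
    observer.observe(el);
  });
}
```

Again, this is just a sketch of the general pattern, not their implementation.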
My Reservations
I'm worried about this for several reasons, the largest being that you're essentially presenting the user with something different from what you present to search engines. The other big one is that I just don't know whether search engines can really render this type of formatting correctly, or whether there's anything I need to look out for here.
Since they're so close to launching their new site, I don't have time to set up a test environment, and I'm not going to gamble with a new corporate website; but they're also going to be very resistant to advice like "start the design over, it's too dangerous".
The Positives
For this client in particular, the design actually works very well. Each of these long pages is essentially about a different service they offer, and the continuous scrolling through the "sub-pages" acts almost as a workflow through the process, covering each step in order. It also looks fantastic, loads quickly, and has a very simple nav, so the overall user experience is great. Since the majority of my focus in SEO is on UX, this is my confusion: part of me thinks that obscuring the other content on these pages and only showing each individual "sub-page" to search engines is an obvious no-no; the other part feels that this kind of user experience, and the reasonable prevalence of AJAX/parallax etc., means search engines should be more capable of understanding what's going on here.
Can anyone possibly shed some light on this with either some further reading or first-hand experience?
-
Hi Michael, thanks for the input.
I completely agree with you about content length but perhaps I'm a little confused here.
If search engines aren't going to see content hidden by jQuery, would that not mean the only content I'm seen to have is what's immediately visible on page load?
-
Google is NOT going to see the content that's rendered by scrolling. In general, more is better in terms of content on a single page (provided it's not crap of course). See this article from Search Engine Land.
For those same reasons, having it on separate pages isn't as good an idea. If you think about how RankBrain is supposed to work, Google is going to be looking for terms on the page that commonly co-occur with the page's primary target search term on other pages on the web about that topic. So, by farming subsections of content out to other pages, you're shooting yourself in the foot, as Google is only going to give you brownie points for covering the subtopics on the very first page.
A better way to do this:
- put all the content on one page
- in the onload() handler or the jQuery document-ready function, hide all but the first page's worth of content
- now you can react to a scroll by calling JavaScript functions to hide the currently shown content and show the next page's worth... all on the same URL