How does Progressive Loading, aka what Facebook does, impact proper search indexation?
-
My client is planning on integrating progressive loading into their main product level pages (those pages most important to conversions and revenue).
I am not skilled on "progressive loading" but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Is there any negative impact from changing this up by adding progressive loading?
If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know.
thanks a ton!!
Janet
-
OK, so it took a long time to get back to this, but here's what the client was actually referring to, and I found a good post on optimizing for the new approach: "SEO Tips for Infinite Scrolling" by Adam Sherk - an excellent read! I wasn't sure exactly what this programmer/client was referring to, but this was it. Thanks all for the help!
-
I think what your developer is talking about, and what Facebook does, is the idea of all of your content being on one page. Progressive loading, or "infinite scroll," is when you scroll to the bottom of a page (like a category page on your blog) and more content loads into the page itself, as opposed to having to click through to page 2 of the results to view more content.
The problem with doing this is that even though the content continues to load on the same URL, it's being pulled in from somewhere else - anything beyond that first set of content is loaded by a JavaScript call. That means search engines can't index the content that loads as the user scrolls down, and users with JavaScript turned off won't be able to view the rest of your content either. This can be a big problem on main product-level pages like the ones your client is considering, since any links to other products beyond that initial page load won't be crawled by search engines.
If you're going to go the infinite scrolling/progressive loading route, make sure that when JavaScript is disabled, there's a crawlable "next page" link to a new, static URL for the next page of results. Basically, make sure that there's a more old-school "previous page/next page" environment with static page URLs that search engines and users without JavaScript can browse, in addition to your progressive loading page.
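To make that concrete, here's a minimal sketch of the progressive-enhancement pattern described above. The `/products?page=N` path, element IDs, and function names are all hypothetical - the point is that a static, crawlable "next page" link exists in the markup, and the script only layers infinite scrolling on top of it:

```javascript
// A minimal sketch of progressive enhancement for infinite scroll.
// Assumes the site already serves static, crawlable paginated URLs
// like /products?page=2 (hypothetical path).

// Pure helper: build the static URL for the next page of results.
function nextPageUrl(basePath, currentPage) {
  return `${basePath}?page=${currentPage + 1}`;
}

// The markup ships a plain, crawlable link that search engines and
// no-JS users can follow:
//   <a id="next-page" href="/products?page=2">Next page</a>
// For users with JS, we intercept the click and append the next
// page's results in place instead of navigating away.
function enableInfiniteScroll(link, resultsSelector) {
  let page = 1;
  link.addEventListener('click', async (event) => {
    event.preventDefault(); // stay on the current URL
    const response = await fetch(nextPageUrl('/products', page));
    const html = await response.text();
    document.querySelector(resultsSelector).insertAdjacentHTML('beforeend', html);
    page += 1;
    link.href = nextPageUrl('/products', page); // keep the fallback link current
  });
}
```

With this setup, a crawler that ignores the script still finds a normal paginated link to follow, which is the whole point of the "old-school previous/next environment" advice.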
Here's a link to a similar question from last year that has more information: http://www.seomoz.org/q/infinite-scrolling-vs-pagination-on-an-ecommerce-site
-
That doesn't make much sense to me.
Just look at how Facebook loads... All the text pops right up, and then the images filter in. It just doesn't make any sense to me how it could 'exclude' anything.
Is there a way you could implement it on a small scale just to test it in the beginning? Maybe a page or two, or just a section of the site to start with... Then you would at least have some data to look at and help you make an informed decision.
I haven't been in this part of the world for very long, but I know that progressive loading isn't something that has popped up much in my research/reading. Even when I looked around (briefly) I didn't find anything that connected to SEO.
-
Thank you Modulusman - I thought so too - but the way the programmer was talking made it seem like it was some major exclusion of content or something.
Thanks for your input!
-
I may be wrong here... but isn't progressive loading mostly for images?
If this is what you're talking about... I'm not sure how it would make much of a difference in how things are indexed. It seems like "once upon a time" things had to be saved a certain way, but I'm not even sure that's the case anymore.
It may help with mobile conversion... depending on whether you're more focused on copy or media.
I know this isn't much, but maybe it will jog something for someone.
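If "progressive loading" here really does mean images, the common pattern is lazy-loading: a lightweight placeholder in `src` and the real file in `data-src`, swapped in as each image scrolls into view. A hypothetical browser-side sketch (markup, paths, and names are all assumptions, not from the original post):

```javascript
// Hypothetical sketch of lazy-loading images with IntersectionObserver.
// Markup assumed:
//   <img src="/img/placeholder.gif" data-src="/img/product.jpg" alt="Product">
// Without JS the placeholder still renders, though crawlers may only
// see the placeholder URL unless the real image is referenced elsewhere.
function lazyLoadImages() {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src; // swap in the real image
        obs.unobserve(entry.target); // each image only needs loading once
      }
    }
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
}
```

Modern browsers also support a native `loading="lazy"` attribute on `<img>`, which achieves much the same thing with no script at all.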
Related Questions
-
[Organization schema] Which Facebook page should be put in "sameAs" if our organization has separate Facebook pages for different countries?
We operate in several countries and have this kind of domain structure:
example.com/us
example.com/gb
example.com/au
For our schemas we've planned to add an Organization schema on our top domain and let all pages point to it. This introduces a problem: we have a separate Facebook page for every country. Should we put one Facebook page in the "sameAs" array? Or all of our Facebook pages? Or should we skip it altogether?
Only one Facebook page:
{
  "@type": "Organization",
  "@id": "https://example.com/org/#organization",
  "name": "Org name",
  "url": "https://example.com/org/",
  "sameAs": [
    "https://www.linkedin.com/company/xxx",
    "https://www.facebook.com/xxx_us"
  ]
}
All Facebook pages:
{
  "@type": "Organization",
  "@id": "https://example.com/org/#organization",
  "name": "Org name",
  "url": "https://example.com/org/",
  "sameAs": [
    "https://www.linkedin.com/company/xxx",
    "https://www.facebook.com/xxx_us",
    "https://www.facebook.com/xxx_gb",
    "https://www.facebook.com/xxx_au"
  ]
}
Bonus question: this reasoning springs from the thought that we should only have one Organization schema. Or can we have multiple sub-organizations?
Technical SEO | | Telsenome0 -
URL Indexed But Not Submitted to Sitemap
Hi guys, In Google's webmaster tool it says that the URL has been indexed but not submitted to the sitemap. Is it necessary that the URL be submitted to the sitemap if it has already been indexed? Appreciate your help with this. Mark
Technical SEO | | marktheshark100 -
Google Not Indexing Submitted Images
Hi Guys! My question isn't too dissimilar to one asked a couple of years ago, regarding Google and image indexing, but having put my web address into a Google image search, I get a return of 15 images, so something isn't right. 5 months ago I submitted our 'new' site to Google webmaster. We have just moved it onto a Shopify platform. They (Shopify) are good at providing places to add titles and Alt tags and likewise we fill them in (so that box ticked!) However I have noticed over the last couple of months that despite 161 images being submitted, only 51 have been indexed. Furthermore and as I said earlier, when you put our site, site:http://www.hartnackandco.com into Google images, it only returns a total of 15 images. Any suggestions and help would be wonderful! Cheers Nick
Technical SEO | | nick_HandCo0 -
Incorrect rel canonical , impacts ?
Incorrect use of canonical code... and why have they used the strange code surrounding it? Hi there SEO guys, I need some help. A site I am working on has used the rel canonical tag incorrectly: they have used the code on the canonical page, not on the duplicate pages. There is also some other strange code with it. I will show and hide the URL. However, I wanted to know if this would stop Google's bots from crawling this page correctly, as it doesn't seem to rank very well either. Here is the code:
Technical SEO | | ibusmedia0 -
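For reference, the usual direction of the tag is the opposite of what this question describes: each duplicate page carries a canonical tag in its `<head>` pointing at the preferred URL (the URL below is hypothetical):

```html
<!-- Placed on each DUPLICATE page, pointing at the preferred version: -->
<link rel="canonical" href="https://www.example.com/product/widget/" />
```

The canonical page itself can carry a self-referencing canonical, but putting the tag only on the canonical page does nothing to consolidate the duplicates.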
Spider Indexed Disallowed URLs
Hi there, In order to reduce the huge amount of duplicate content and titles for a client, we disallowed all spiders from some areas of the site in August via the robots.txt file. This was followed by a huge decrease in errors in our SEOmoz crawl report, which, of course, made us satisfied. In the meantime, we haven't changed anything in the back-end, robots.txt file, FTP, website or anything. But our crawl report came in this November and all of a sudden all the errors were back. We've checked the errors and noticed URLs that are definitely disallowed. The disallowment of these URLs is also verified by our Google Webmaster Tools and other robots.txt checkers, and when we search for a disallowed URL in Google, it says that it's blocked for spiders. Where did these errors come from? Was it the SEOmoz spider that broke our disallowment or something? You can see the drop and the increase in errors in the attached image. Thanks in advance. LAAFj.jpg
Technical SEO | | ooseoo0 -
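For reference, a blanket disallow like the one described is just a robots.txt block of this shape (the paths are hypothetical; the question doesn't name the real directories):

```
User-agent: *
Disallow: /search/
Disallow: /print/
```

Note that robots.txt blocks crawling, not indexing - a disallowed URL can still surface in reports or search results if other pages link to it, which may explain "blocked" URLs reappearing as errors.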
Page not being indexed
Hi all, On our site we have a lot of bookmaker reviews, and we are ranking pretty good for most bookmaker names as keywords, however a single bookmaker seems to have been shunned by Google. For a search "betsafe" in Denmark, this page does not appear among the top 50: http://www.betxpert.com/bookmakere/betsafe All of our other review pages rank in top 10-20 for the bookmaker name as keyword. What to do if Google has "banned" a page? Best regards, Rasmus
Technical SEO | | rasmusbang0 -
Index forum sites
Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered up until now, it was also completely deleted and the credit was not "refunded" - obviously there was some data loss involved with your restructuring. Can you check whether you can still find the last question and answer it quickly? I need the answer 🙂 Here is one more question:
I bought a website that has a huge forum, loads of pages with user-generated content - overall around 500,000 threads with 9 million comments. The complete forum was noindex/nofollow when I bought the site; now I am thinking about the best way to unleash the potential. The current system is vBulletin 3.6.10.
a) Shall I first update vBulletin to version 4 and use the vSEO tool to make the URLs clean and more user- and search-engine-friendly before I switch to index/follow?
b) Would you recommend having the forum in the folder structure or on a subdomain? As far as I know, a subdomain takes less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum; however, it puts my TLD at risk.
c) Would you release all forum pages at once or section by section? I think section by section looks rather unnatural, not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once.
d) Would you index only the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of the thread contain different body content but the same title and possibly the same h1.
Looking forward to hearing from you soon! Best, Fabian
Technical SEO | | fabiank0