How does Progressive Loading, aka what Facebook does, impact proper search indexation?
-
My client is planning to integrate progressive loading into their main product-level pages (the pages most important to conversions and revenue).
I'm not well versed in "progressive loading," but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Is there any negative impact from changing this up and adding progressive loading?
If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know.
thanks a ton!!
Janet
-
OK, so it took me a long time to get back to this, but here's what the client was actually referring to, and I found a good post on optimization around the new approach: SEO Tips for Infinite Scrolling | Adam Sherk - an excellent read! I wasn't sure exactly what this programmer/client was referring to, but this was it. Thanks, all, for the help!
-
I think what your developer is talking about, and what Facebook does, is the idea of all of your content living on one page. Progressive loading, or "infinite scroll," is when you scroll to the bottom of a page (e.g., a category page on your blog) and more content loads onto the page itself, as opposed to having to click through to page 2 of the results to view more content.
The problem with doing this is that even though the content keeps loading on the same URL, it's being pulled in from somewhere else - anything beyond that first set of content is loaded by a JavaScript call. That means search engines can't index the content that loads as the user scrolls down, and users with JavaScript turned off won't be able to view the rest of your content either. This can be a big problem on main product-level pages like the ones your client has in mind, since any links to other products beyond that initial page load won't be crawled by search engines.
If you're going to go the infinite scrolling/progressive loading route, make sure there's a crawlable "next page" link pointing to a static URL for the next page of results, even when JavaScript is disabled. In other words, keep an old-school "previous page/next page" setup with static page URLs that search engines and users without JavaScript can browse, in addition to the progressive loading.
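To make that concrete, here's a rough TypeScript sketch of how the two approaches can coexist. The element IDs, the /products?page=N URL pattern, and the scroll threshold are all assumptions for illustration, not anything from your client's site. The idea is that the server renders each page of products at a real, static URL with a plain "next page" link, and the script only layers the infinite scroll on top of that:

```typescript
// A minimal sketch of progressive loading with a crawlable pagination fallback.
// Assumptions (hypothetical): the server renders /products?page=N as plain HTML
// containing a <ul id="product-list"> of items and an <a id="next-page"> link
// pointing at the following page.

const list = document.querySelector<HTMLUListElement>("#product-list");
let nextLink = document.querySelector<HTMLAnchorElement>("#next-page");
let loading = false;

async function loadNextPage(): Promise<void> {
  if (!list || !nextLink || loading) return;
  loading = true;

  // Fetch the next static page and parse it, rather than a JSON-only endpoint,
  // so the same URL works for crawlers and for users without JavaScript.
  const nextUrl = nextLink.href;
  const response = await fetch(nextUrl);
  const html = await response.text();
  const doc = new DOMParser().parseFromString(html, "text/html");

  // Append the next page's products to the list the user is already scrolling.
  const newItems = Array.from(doc.querySelectorAll("#product-list > li"));
  for (const item of newItems) {
    list.appendChild(item.cloneNode(true));
  }

  // Update the address bar so the visible state always maps to a real URL.
  history.pushState({}, "", nextUrl);

  // Point the fallback link at the page after this one, or drop it on the last page.
  const newNext = doc.querySelector<HTMLAnchorElement>("#next-page");
  if (newNext) {
    nextLink.href = newNext.href;
  } else {
    nextLink.remove();
    nextLink = null;
  }
  loading = false;
}

// Load the next page when the user scrolls near the bottom of the document.
window.addEventListener("scroll", () => {
  const nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 300;
  if (nearBottom) void loadNextPage();
});
```

With JavaScript off (or with a crawler that doesn't execute it), the plain anchor still leads straight to the next static URL, so every product stays reachable through ordinary paginated pages.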
Here's a link to a similar question from last year that has more information: http://www.seomoz.org/q/infinite-scrolling-vs-pagination-on-an-ecommerce-site
-
That doesn't make much sense to me.
Just look at how Facebook loads... All the text pops right up, and then the images filter in. It just doesn't make any sense to me how it could 'exclude' anything.
Is there a way you could implement it on a small scale just to test it in the beginning? Maybe a page or two, or just a section of the site to start with... Then you would at least have some data to look at and help you make an informed decision.
I haven't been in this part of the world for very long, but I know that progressive loading isn't something that has popped up much in my research/reading. Even when I looked around (briefly) I didn't find anything that connected to SEO.
-
Thank you, Modulusman - I thought so too, but the way the programmer was talking made it seem like there was some major exclusion of content or something.
Thanks for your input!
-
I may be wrong here... but isn't progressive loading mostly for images?
If this is what you're talking about... I'm not sure how it would make much of a difference in how things are indexed. It seems like "once upon a time" things had to be saved a certain way, but I'm not even sure that's the case anymore.
It may help with mobile conversion... depending on whether you're more focused on copy or media.
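If that is what they mean, here's roughly what I have in mind by image-only progressive (lazy) loading - just a sketch, and the data-src attribute and selector are made-up names for illustration. The img tags and all the surrounding copy are in the HTML from the first load; only the image file itself is deferred until the user scrolls near it, which is a much smaller indexing question than pulling in whole blocks of content with JavaScript:

```typescript
// A rough sketch of image-only progressive (lazy) loading.
// Assumption (hypothetical markup): images ship with a data-src attribute
// instead of src, e.g. <img data-src="/images/product-123.jpg" alt="...">.

const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    // Swap in the real source only once the image scrolls near the viewport.
    img.src = img.dataset.src ?? "";
    img.removeAttribute("data-src");
    obs.unobserve(img);
  }
});

lazyImages.forEach((img) => observer.observe(img));
```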
I know this isn't much, but maybe it will jog something for someone.