Infinite Scrolling: how to index all pictures
-
I have a page where I want to upload 20 pictures in a slideshow. The idea is that the pictures only load when users scroll down the page (otherwise the initial load is too heavy). I have seen documentation on how to make this work and ensure search engines index all the content. However, I do not see any documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get search engines to index all of those pictures when they only show as users scroll down the page. This is the documentation I am already familiar with, and which does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
Thank you
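P.S. For context, the lazy-load plugins above follow roughly this pattern (a simplified sketch, not their actual code), which is exactly why I worry search engines will never see the pictures:

```typescript
// Simplified sketch of the lazy-load pattern (placeholder attribute and class names).
// Each slide starts with its real URL only in a data-* attribute, e.g.:
//   <img class="lazyload" data-original="/images/slide-01.jpg" alt="Slide 1">
// Nothing is downloaded until the user scrolls the image near the viewport.
function lazyLoadOnScroll(): void {
  const images = document.querySelectorAll<HTMLImageElement>("img.lazyload[data-original]");
  images.forEach((img) => {
    const nearViewport = img.getBoundingClientRect().top < window.innerHeight;
    if (nearViewport && img.dataset.original) {
      img.src = img.dataset.original;       // only now does the picture actually load
      img.removeAttribute("data-original"); // don't process this image again
    }
  });
}

window.addEventListener("scroll", lazyLoadOnScroll);
```

Since the real image URL only ever appears in a data-* attribute until the user scrolls, I do not see how Google would index all 20 pictures.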
-
Hi Pete, I just wanted to confirm, based on what you wrote:
"I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.)."I did following:
- Deleted the words "Home" and "Condo" from the title tag and H1, so the neighborhood name is still in the title tag and H1, but there is no mention of home, condo, real estate, etc.
- All written content has been moved from the "guides" (where the pictures and videos are) to the lower part of the MLS result pages, and I imagine that over a 1-2 month period the MLS result pages will get the SEO credit for this unique written content (despite there being no 301 redirect).
- I interlink from the picture/video pages to the MLS result pages using the anchor text "neighborhood homes for sale".
My hypothesis is that over the next few months, as Google gets a better idea of my website (the site is still only 5 months old and getting more popular), Google will figure out what to rank for "neighborhood homes for sale" search terms.
Makes sense?
-
That's right - zero search value. Maybe I can simply change the title tag, H1, etc.: get rid of the keyword (e.g., "Honolulu") and instead call it something like "Gallery 1". That way I can keep the structure without diluting the ranking potential of the MLS result pages?
-
I generally wouldn't NOINDEX something that's part of your navigation structure, unless it's a deep layer (and you want to cut off anything "below" it). If you're concerned that they don't have search value, I'd consider consolidating somehow, which I thought was the general plan from the original question. I just don't know that you need all of the content or to get too complicated with the consolidation.
-
Interesting, thx. Can I do the following: add "noindex, follow" to those guide pages? That way they won't compete with the MLS result pages, which they currently do. The issue is that all that great unique picture and video content won't be indexed by Google... maybe not a big issue?
-
Yeah, I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.).
At the scale of a large site, it's hard to tell without understanding the data, including where your traffic is coming from. If it's producing value (traffic, links, etc.), great. If not, then you may want to revisit whether those pages are worth having and/or can be combined somehow. I don't think "combined" means everything on both pages gets put onto one mega-page - you could pick and choose at that point.
-
thx, Pete. Guides are more for users who are curious about pictures and videos - not something I care about ranking for. Ex: http://www.honoluluhi5.com/waikiki-condos-real-estate/
The MLS result pages are my life, and I moved a lot of written content to those pages to add unique content. Ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ (you will see the unique content below the map and the MLS thumbnail pictures).
I feel this layout is ideal long-term. I link from each guide (as you can see above) to the corresponding MLS result page. Hope this makes sense.
-
That depends on a lot of factors. Consolidating those to one page has advantages, SEO-wise, but you're losing the benefits of the photo page. I lean toward consolidation, but it really depends on how the pages are structured in the navigation, what sort of content and meta-data they have, etc. I'm not clear on what's left on Page A currently, but the biggest issue is probably dilution from the extra pages. Since there are "guide" pages, though, I'm not sure how they fit your site architecture. To remove 200 of them, you may need to also rethink your internal link structure.
-
thx a lot. "Viewing it as manipulative" - it makes sense. I will certainly refrain from doing so.
I keep saying "last question," but this should be it: moving some written content from Page A to Page B (while keeping Page A, just with less content remaining on it) is OK, and after a while the moved text will be viewed as Page B's original content, so Page B will get the SEO credit? This is done without a 301 redirect, since Page A is still a page with pictures that are original and unique, and I want Google to index all of those pictures - it's just that a bunch of unique written content was moved from Page A to Page B. I have moved written content from about 200 different guide-type pages to 200 MLS result pages, as it makes more sense to have it there. Would it be safer to include the 301 redirect and simply lose the picture indexing, to play it safe?
-
That's a trick that used to occasionally work, but there's no evidence for it in the past couple of years. Google has gotten pretty good at understanding how pages are rendered and is no longer completely dependent on source-code order. In some cases, they may even view it as manipulative.
-
Thx. One last slightly different, but related, question: what is your view on placing written content above other content in the source code while, on the rendered page, that written content displays below the other content? In my case: MLS thumbnail pictures and descriptions (the same as on other realtors' websites) show at the top of the page, and as users scroll down they see a lot of unique, original written content. Search engines are said to favor written content higher on the page, so would it be a good idea to place the written content above the MLS data in the source code, even though on the page it will still display below the MLS data?
-
I don't think the risk of harm, done right, is high, but: (1) it's easy to do wrong, and (2) I suspect the benefits are small at best. I think your time/money is better spent elsewhere.
-
Thank you very much. The idea was to move a lot of great pictures from a "gallery" to a page I want to rank for. The gallery page serves no purpose other than letting users see beautiful pictures and, obviously, letting Google index a lot of unique pictures. I guess I will leave the gallery as is and simply interlink from the gallery to the important page.
Your suggestion can be implemented (my web developers have already built it, just not deployed it). However, it sounds to me, if I read between the lines correctly, like there is a risk Google may misinterpret such an implementation, and this could potentially even hurt my site with duplicate content issues...
-
By assigning a URL to each virtual "page" - done correctly - you allow Google to crawl the images. What Google is suggesting is that you then set up rel=prev/next between those pages. This tells them to treat all of the image URLs as a paginated series (like a multi-page article or search results).
My enterprise SEO friends have mixed feelings about rel=prev/next. The evidence of its effectiveness is limited, but what it's supposed to do is allow the individual pages (the images, in this case) to rank while not looking like duplicate or near-duplicate content. The other option would be to rel=canonical these virtual pages back to the main URL, but then you'd essentially take the additional images out of ranking contention.
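To make the markup side concrete, here's a minimal sketch of what the head of one component page in the series could carry (the /page/N/ URL pattern and page count are placeholders, not a recommendation for your exact structure):

```typescript
// A sketch only: head tags for one component page in a paginated gallery series.
// The /page/N/ URL pattern and page count are placeholders.
function galleryHeadTags(baseUrl: string, page: number, totalPages: number): string {
  const pageUrl = (n: number): string =>
    n === 1 ? baseUrl : `${baseUrl}page/${n}/`;

  const tags: string[] = [
    // Each component page canonicalizes to itself, so its images stay eligible to rank.
    `<link rel="canonical" href="${pageUrl(page)}">`,
  ];
  if (page > 1) {
    tags.push(`<link rel="prev" href="${pageUrl(page - 1)}">`); // previous chunk of images
  }
  if (page < totalPages) {
    tags.push(`<link rel="next" href="${pageUrl(page + 1)}">`); // next chunk of images
  }
  return tags.join("\n");
}

// Example: page 2 of a 4-page series (20 images split 5 per page).
console.log(galleryHeadTags("http://www.example.com/waikiki-gallery/", 2, 4));
```

Point the canonical of every component page at the main URL instead of at itself and you get the consolidation option I mentioned - but then the extra images effectively drop out of contention.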
This infinite scroll + pagination approach is VERY technical and the implementation is well beyond Q&A's scope (it would take fairly in-depth knowledge of your site). Honestly, my gut reaction is that the time spent wouldn't be worth the gain. Most users won't know to scroll, and having 10-20 pictures vs. just a few may not add that much value. The SEO impact would be relatively small, I suspect. I think there may be easier solutions that would achieve 90% of your goals with a lot less complexity.
-
Hi Pete,
There is no mechanism that will allow lots of different pictures in a slideshow to load only when users scroll to a certain part of a page, without slowing page speed, while still having all the pictures indexed by Google. If you can show me one example on the Internet that has a solution to this, I would love to see it. This is what is possible to create (not my website, just an example): http://diveintohtml5.info/examples/history/brandy.html - I can implement such a picture slideshow, which loads when users scroll down my page. Notice how the URL changes for each picture (as you change pictures), but the rest of the content on the page stays the same. Now, the big questions are:
- Will the main (important) URL get the SEO credit for all these other URLs where each picture is located?
- Or, since each picture is on a different URL, will each URL get SEO credit separately, with the main URL gaining nothing from these pictures from an SEO perspective?
- Since the written content is EXACTLY the same across each of these picture URLs, it will look like duplicate content - would it be good to use a canonical to make sure the main URL gets all the SEO credit?
- How would you place 20 unique, copyrighted pictures on a URL and make sure that URL gets the SEO credit, keeping in mind that the pictures can ONLY load after users scroll to a certain point on the page, as the page will otherwise load too slowly?
I highly appreciate your thoughts on this, since experts say there is a solution, but I have yet to see one concrete piece of evidence.
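To make the mechanism concrete: the Dive Into HTML5 demo above uses the History API, so each picture gets its own address without a full page reload. A rough sketch of the kind of thing my developers could build (element IDs, image paths, and the /photos/N/ URL pattern are placeholders, not my actual markup):

```typescript
// Rough sketch of a slideshow where each picture has its own URL via the History API.
const pictures: string[] = [
  "/images/waikiki-01.jpg",
  "/images/waikiki-02.jpg",
  "/images/waikiki-03.jpg",
];
let current = 0;

function showPicture(index: number, pushUrl: boolean = true): void {
  const img = document.getElementById("slideshow-image") as HTMLImageElement | null;
  if (!img) return;
  current = index;
  img.src = pictures[index];
  if (pushUrl) {
    // Give this picture its own address without reloading the rest of the page.
    history.pushState({ index }, "", `/photos/${index + 1}/`);
  }
}

// Keep the browser's back/forward buttons in sync with the slideshow.
window.addEventListener("popstate", (event: PopStateEvent) => {
  const state = event.state as { index: number } | null;
  showPicture(state ? state.index : 0, false);
});

document.getElementById("next-button")?.addEventListener("click", () => {
  showPicture((current + 1) % pictures.length);
});
```

Which brings me back to the questions above about where the SEO credit for those picture URLs ends up.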
-
There should be no real difference, in terms of Google's infinite scroll solution. If you can chunk the content into pages with corresponding URLs, you can put any source code on those pages - text and/or images, along with corresponding alt text, etc. Once you've got one solution implemented, it should work for any kind of HTML. Not sure why images would be different in this case.
There are also ways to create photo galleries that can be crawled, mostly using AJAX. It's complex, but here's one example/discussion:
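At a high level, the load-as-you-scroll layer on top of real component URLs usually looks something like this - a rough sketch only, with placeholder URLs and element IDs, not the specific implementation from that discussion:

```typescript
// Rough sketch of infinite scroll layered on top of real, crawlable component pages.
// The /gallery/page/N/ URL pattern and the "gallery" container ID are placeholders.
let nextPage = 2;
let loading = false;
let done = false;

async function appendNextPage(): Promise<void> {
  if (loading || done) return;
  loading = true;
  const response = await fetch(`/gallery/page/${nextPage}/`);
  if (!response.ok) {
    done = true; // no more pages (or an error) - stop asking
    loading = false;
    return;
  }
  const html = await response.text();
  const doc = new DOMParser().parseFromString(html, "text/html");
  const container = document.getElementById("gallery");
  doc.querySelectorAll("#gallery img").forEach((img) => {
    // Import each image node from the fetched page into the current document.
    container?.appendChild(document.importNode(img, true));
  });
  nextPage += 1;
  loading = false;
}

window.addEventListener("scroll", () => {
  const nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 600;
  if (nearBottom) {
    void appendNextPage();
  }
});
```

The important part is that the same /gallery/page/N/ URLs exist as real, linked pages (via rel=next/prev or plain pagination links), so crawlers never depend on the scroll event.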
-
CORRECTION: URL 1 and URL 2 are the opposite of what I described. In other words, I want to move pictures from 1) to 2). I already moved written content from 1) to 2).
-
On URL 1) http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/ you will see written content at the lower part of the page. This written content was originally on URL 2) http://www.honoluluhi5.com/oahu/honolulu-homes/. I moved it because URL 1) is the page I want to rank for and URL 2) served more as a guide. I want to move the pictures from 2) to 1) as well and then add a 301 redirect. However, this is NOT possible, because if I place the pictures on 1) where users only see them after scrolling down to a certain place on the page, Google is not able to index all of those pictures. The only way to index those pictures is to have them load when users land on the page, which would slow down the page and be a terrible user experience.
I am told there is a solution to get these pictures indexed, but so far no one has been able to present a concrete solution.
-
thank you, Pete.
- All images are my own and unique (ex: http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/)
- Infinite scrolling is what I plan to use, otherwise loading will be too slow. The issue: when users scroll and the pictures load, how do I set it up so that those images are indexed by Google? For written content it is easy to get the content indexed by Google with infinite scrolling. However, with images there seems to be no solution. In other words: if a URL has 10 images that only show after users scroll down to the lower part of the page, then those 10 images will not be indexed by Google and the page will not get the SEO credit. Any solution to this? These sources deal with the infinite scrolling and indexing issue, but do not apply to images:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
-
Keep in mind that just adding 20 images/videos to this page isn't going to automatically increase its quality. Images give Google only limited content to crawl, and unless they're unique images that you own, they'll potentially be duplicated across the web. If adding those 20 images slows down the page a lot, that could actually harm your SEO and usability.
-
Unfortunately, it depends entirely on your implementation, but the short answer is that it matters whether the images are loaded all at once and only displayed by scrolling, or loaded as you scroll. The latter is essentially what "infinite scrolling" is - it's generally not actually infinite, but scrolling will trigger load events until there's nothing left to load.
The key is that the content has to be crawlable somehow and can't only be triggered by the event, or Google won't see it. So, if you're going to load as you go, the infinite scrolling posts should apply. If the images are pre-loaded, then you shouldn't have a problem, but I'd have to understand the implementation better.
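To illustrate the distinction, here's a sketch - placeholder markup and class names only, not a prescription for your site - of a load-on-scroll setup that still keeps the image URLs in the crawlable source:

```typescript
// A sketch of the "loaded as you scroll, but still crawlable" approach.
// Assumed (placeholder) markup for each picture:
//   <img class="lazy" data-src="/images/waikiki-01.jpg" alt="Waikiki beachfront condos">
//   <noscript><img src="/images/waikiki-01.jpg" alt="Waikiki beachfront condos"></noscript>
// The <noscript> copy keeps the image URL and alt text in the page source,
// so the image isn't visible only to users who trigger the scroll event.
function loadVisibleImages(): void {
  const lazyImages = document.querySelectorAll<HTMLImageElement>("img.lazy[data-src]");
  lazyImages.forEach((img) => {
    const rect = img.getBoundingClientRect();
    const nearViewport = rect.top < window.innerHeight + 400; // 400px look-ahead
    if (nearViewport && img.dataset.src) {
      img.src = img.dataset.src;       // start the actual download now
      img.removeAttribute("data-src"); // don't process this image again
    }
  });
}

window.addEventListener("scroll", loadVisibleImages);
window.addEventListener("load", loadVisibleImages);
```

Whether that's enough in your case depends on the details, which is why I'd want to see how it's actually built.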
-
I missed your point here. The page does not naturally suit infinite scrolling, in your opinion?
-
It's not an infinitely scrolling website. I'm going to drown myself now.
-
Travis: a slightly different, but related, question: the written content you see at the lower part of the URL I want to rank for used to be on the other URL, and I recently moved it (no 301 redirect, since I still have the pictures and video on the other URL). Will Google over time accept the unique content on the URL I want to rank for and credit that URL fully, OR will Google notice that the content was originally on the unimportant URL, so that the URL that now has the content risks not getting any credit for it?
-
thx, Travis. The idea is not about being fancy: I do not want infinite scrolling. It comes down to me wanting to move a lot of great pictures and a video to this page that I want to rank for:
http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
…and here are the pictures and video: http://www.honoluluhi5.com/waikiki-condos-real-estate/ - The latter page means nothing to me, except as nice pictures and video for the user. However, if I placed that content under the written content on the first URL, it would add extra "juice" of quality content to that page and I would rank that much better long-term. However, those pictures would tremendously slow the loading, and that is the issue...
-
I would say don't use infinite scrolling - not yet. Designers don't understand; they want everything to be fancy. Google isn't terribly ready for fancy yet.
At this point, I think infinite scroll is a horrible thing that needs to be shot in the face.
"Hey guys, let's load the entire site - all of the bells and whistles at once!"
That can really mess with page load speed. So what about time to first byte? It doesn't matter if the first byte arrives at the speed of light if you're loading 450 MB.
If the Webmaster Central Blog didn't answer your question, you're pretty well on your own.