Infinite Scrolling: how to index all pictures
-
I have a page where I want to upload 20 pictures in a slideshow. The idea is that the pictures will only load as users scroll down the page (otherwise the page is too heavy to load). I have seen documentation on how to make infinite scrolling work so that search engines index all the content. However, I do not see any documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get a search engine to index all such pictures when they only show as users scroll down the page. This is the documentation I am already familiar with, which does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
Thank you.
-
Hi Pete, I just wanted to confirm, based on what you wrote:
"I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.)."
I did the following:
- Deleted the words "Home" and "Condo" from the title tag and H1, so the neighborhood name is still in the title tag and H1, but there is no mention of home, condo, real estate, etc.
- Moved all written content from the "guides" (where the pictures and videos are) to the lower part of the MLS result pages. I imagine that over a 1-2 month period the MLS result pages will get the SEO credit for this unique written content (despite no 301 redirect).
- Interlinked from the picture/video pages to the MLS result pages with "neighborhood homes for sale" anchor text.
My hypothesis is that over the next few months, as Google gets a better idea of my website (as the site gets more popular - it is still only 5 months old), Google will know to rank it for "neighborhood homes for sale" search terms.
Makes sense?
-
That's right. Zero search value. Maybe I can simply change the title tag, H1, etc.: get rid of the keyword (e.g., "Honolulu") and instead call it "Gallery 1". In this way I can keep the structure without diluting the ranking potential of the MLS result pages?
-
I generally wouldn't NOINDEX something that's part of your navigation structure, unless it's a deep layer (and you want to cut off anything "below" it). If you're concerned that they don't have search value, I'd consider consolidating somehow, which I thought was the general plan from the original question. I just don't know that you need all of the content or to get too complicated with the consolidation.
-
Interesting, thanks. Can I do the following: add "noindex, follow" to those guide pages? In this way they won't compete with the MLS result pages, which they currently do. The issue is that all that great, unique picture and video content won't be indexed by Google... maybe not a big issue?
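For reference, the tag in question is a one-line addition to each guide page's head (a sketch; exact placement depends on your CMS):

```html
<!-- In the <head> of each guide page: keeps the page out of the index,
     but lets crawlers follow its links to the MLS result pages -->
<meta name="robots" content="noindex, follow">
```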
-
Yeah, I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.).
At the scale of a large site, it's hard to tell without understanding the data, including where your traffic is coming from. If it's producing value (traffic, links, etc.), great. If not, then you may want to revisit whether those pages are worth having and/or can be combined somehow. I don't think "combined" means everything on both pages gets put onto one mega-page - you could pick and choose at that point.
-
thx, Pete. Guides are more for users who are curious about pictures and videos - not something I care about ranking for. Ex: http://www.honoluluhi5.com/waikiki-condos-real-estate/
MLS result pages are my life, and I moved a lot of written content to them to add unique content. Ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ (you will see the unique content below the map and the MLS picture thumbnails).
I feel this layout is ideal long-term. I link from each guide (as you can see above) to the corresponding MLS result page. Hope this makes sense.
-
That depends on a lot of factors. Consolidating those to one page has advantages, SEO-wise, but you're losing the benefits of the photo page. I lean toward consolidation, but it really depends on how the pages are structured in the navigation, what sort of content and meta-data they have, etc. I'm not clear on what's left on Page A currently, but the biggest issue is probably dilution from the extra pages. Since there are "guide" pages, though, I'm not sure how they fit your site architecture. To remove 200 of them, you may need to also rethink your internal link structure.
-
thx a lot. "Viewing it as manipulative" - that makes sense. I will certainly refrain from doing so.
I keep saying "last question," but this should be it: moving some written content from Page A to Page B (while keeping Page A, just with less content remaining on it) is OK, and after a while the content will be viewed as Page B's original content and Page B will get the SEO credit? This is done without a 301 redirect, since Page A still has pictures that are original and unique, and I want Google to index all those pictures - just that a bunch of unique written content was moved from Page A to Page B. I have moved written content from about 200 different guide-type pages to 200 MLS result pages, as it makes more sense to have it there. Would it be safer to include the 301 redirect and simply lose the picture indexing, to play it safe?
-
That's a trick that used to occasionally work, but there's no evidence for it in the past couple of years. Google has gotten pretty good at understanding how pages are rendered and is no longer completely dependent on source-code order. In some cases, they may even view it as manipulative.
-
thx. One last slightly different, but related, question: what is your view on placing written content above other content in the source code, while on the rendered page the written content displays below the other content? In my case: MLS picture thumbnails and descriptions (the same as on other realtors' websites) show at the top of the page, and as users scroll down they see a lot of unique, original written content. Search engines like written content higher on the page, so would it be a good idea to place the written content above the MLS data in the source code, even though on the page it will still display below the MLS data?
-
I don't think the risk of harm, done right, is high, but: (1) it's easy to do wrong, and (2) I suspect the benefits are small at best. I think your time/money is better spent elsewhere.
-
Thank you very much. The idea was to move a lot of great pictures from a "gallery" to a page I want to rank for. The gallery page serves no purpose except for users to see beautiful pictures and, obviously, for Google to index a lot of unique pictures. I guess I will leave the gallery as is and simply interlink from the gallery to the important page.
Implementation of your suggestion can be done (my web developers have already built it, just not deployed it). However, it sounds to me, if I read between the lines correctly, that there is a risk Google may misinterpret such an implementation, and this could potentially even hurt my site with duplicate content issues...
-
By assigning a URL to each virtual "page," you allow Google to crawl the images, if done correctly. What Google is suggesting is that you then set up rel=prev/next between those pages. This tells them to treat all of the image URLs as a paginated series (like a multi-page article or search results).
My enterprise SEO friends have mixed feelings about rel=prev/next. The evidence of its effectiveness is limited, but what it's supposed to do is allow the individual pages (images, in this case) to rank while not looking like duplicate or near-duplicate content. The other option would be to rel=canonical these virtual pages, but then you'd essentially take the additional images out of ranking contention.
This infinite scroll + pagination approach is VERY technical and the implementation is well beyond Q&A's scope (it would take fairly in-depth knowledge of your site). Honestly, my gut reaction is that the time spent wouldn't be worth the gain. Most users won't know to scroll, and having 10-20 pictures vs. just a few may not add that much value. The SEO impact would be relatively small, I suspect. I think there may be easier solutions that would achieve 90% of your goals with a lot less complexity.
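As a rough sketch of what that pagination markup could look like (the URLs here are hypothetical, not a prescription), each component page of the gallery series would carry prev/next hints in its head:

```html
<!-- Hypothetical page 2 of a 4-page gallery series -->
<link rel="prev" href="http://example.com/gallery?page=2">
<!-- wait, prev points at page 1; next at page 3 -->
```

Correcting that, the head of page 2 would be:

```html
<!-- Hypothetical page 2 of a 4-page gallery series -->
<link rel="prev" href="http://example.com/gallery?page=1">
<link rel="next" href="http://example.com/gallery?page=3">
<!-- Self-referencing canonical (NOT a canonical to page 1), so the
     individual image pages stay eligible to rank -->
<link rel="canonical" href="http://example.com/gallery?page=2">
```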
-
Hi Pete,
There is no mechanism that will allow lots of different pictures in a slideshow to load only when users scroll to a certain part of a page, without slowing page speed, and with all pictures being indexed by Google. If you can show me one example on the Internet that has a solution to this, I would love to see it. This is what is possible to create (not my website, just an example): http://diveintohtml5.info/examples/history/brandy.html - I can implement such a picture slideshow - which loads when users scroll down on my page - and notice how the URL changes for each picture (as you change pictures), while the rest of the content on the page stays the same. Now, the big questions:
- Will the main (important) URL get the SEO credit for all these other URLs where each picture is located?
- Or, since each picture is on a different URL, will each URL get SEO credit separately, so the main URL gains nothing from these pictures from an SEO perspective?
- Since the written content is EXACTLY the same across each of these picture URLs, it will look like duplicate content, so would it be good to use a canonical to make sure the main URL gets all the SEO credit?
- How would you place 20 unique, copyrighted pictures on a URL and make sure that URL gets the SEO credit, keeping in mind the pictures can ONLY load after users scroll to a certain point on the page, as the page would otherwise load too slowly?
I highly appreciate your thoughts on this, since experts say there is a solution, but I have yet to see one concrete piece of evidence.
-
There should be no real difference, in terms of Google's infinite scroll solution. If you can chunk the content into pages with corresponding URLs, you can put any source code on those pages - text and/or images, along with corresponding alt text, etc. Once you've got one solution implemented, it should work for any kind of HTML. Not sure why images would be different in this case.
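The "chunk the content into pages with corresponding URLs" idea above can be sketched in a few lines of JavaScript - note that the URL pattern and page size here are my own assumptions for illustration, not something Google prescribes:

```javascript
// Sketch: split one long image gallery into crawlable "component pages",
// each with its own URL plus prev/next pointers for pagination markup.
// The URL pattern (basePath + ?page=N) is hypothetical.
function paginateGallery(imageUrls, perPage, basePath) {
  const pages = [];
  for (let i = 0; i < imageUrls.length; i += perPage) {
    const n = pages.length + 1;
    pages.push({
      url: `${basePath}?page=${n}`,
      images: imageUrls.slice(i, i + perPage),
      prev: n > 1 ? `${basePath}?page=${n - 1}` : null,
      next: i + perPage < imageUrls.length ? `${basePath}?page=${n + 1}` : null,
    });
  }
  return pages;
}

// 20 slideshow images become 4 pages of 5; each page is a real URL
// that can carry its own rel=prev/next links.
const photos = Array.from({ length: 20 }, (_, i) => `/img/photo-${i + 1}.jpg`);
const pages = paginateGallery(photos, 5, '/waikiki-gallery');
console.log(pages.length);  // 4
console.log(pages[1].prev); // /waikiki-gallery?page=1
```

On the server side, each of those URLs would render just its own chunk of images, so a crawler can reach every picture without executing any scroll events.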
There are also ways to create photo galleries that can be crawled, mostly using AJAX. It's complex, but here's one example/discussion:
-
CORRECTION: URL 1 and URL 2 are the opposite of what I described. In other words, I want to move pictures from 1) to 2). I already moved written content from 1) to 2).
-
On this URL 1) http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/ you will see written content at the lower part of the page. This written content was originally on this URL 2) http://www.honoluluhi5.com/oahu/honolulu-homes/. I moved it because URL 1) is the page I want to rank for, and 2) served more as a guide. I want to move the pictures from 2) to 1) as well, and then add a 301 redirect. However, this is NOT possible, because if I place the pictures on 1) where users only see them after scrolling down to a certain place on the page, Google is not able to index all those pictures. The only way to index those pictures is to have them load when users land on the page, which would slow down the page and be a terrible user experience.
I am told there is a solution to get these pictures indexed, but so far no one has been able to present a concrete solution.
-
thank you, Pete.
- All images are my own and unique (ex: http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/)
- Infinite scrolling is what I plan to use, as loading will otherwise be too slow. Issue: when the user scrolls and the pictures load, how do I set it up so those images are indexed by Google? For written content, it is easy to get the content indexed by Google with infinite scrolling. However, with images there seems to be no solution. In other words: if a URL has 10 images that only show after users scroll down to the lower part of the page, then those 10 images will not be indexed by Google and the page will not get the SEO credit. Any solution to this? These sources deal with the infinite scrolling and indexing issues, but do not apply to images:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
-
Keep in mind that just adding 20 images/videos to this page isn't going to automatically increase its quality. Images give Google limited content to crawl, and unless they're unique images that you own, they'll potentially be duplicated across the web. If adding those 20 images slows down the page a lot, that could actually harm your SEO and usability.
-
Unfortunately, it depends entirely on your implementation, but the short answer is that it depends on whether the images are loaded all at once and only displayed by scrolling, or loaded as you scroll. The latter is essentially what "infinite scrolling" is - it's generally not actually infinite, but scrolling will trigger load events until there's nothing left to load.
The key is that the content has to be crawlable somehow and can't only be triggered by the event, or Google won't see it. So, if you're going to load as you go, the infinite scrolling posts should apply. If the images are pre-loaded, then you shouldn't have a problem, but I'd have to understand the implementation better.
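One pattern from that era that keeps lazy-loaded images crawlable (the Lazy Load plugin linked earlier documents a variant of it) is to pair the script-driven image with a noscript fallback, so the real src and alt text are always in the HTML. A sketch with hypothetical paths:

```html
<!-- JS swaps data-src into src as the user scrolls; the <noscript>
     copy keeps the image crawlable without running any script -->
<img class="lazy" data-src="/photos/waikiki-01.jpg" alt="Waikiki beachfront condos">
<noscript>
  <img src="/photos/waikiki-01.jpg" alt="Waikiki beachfront condos">
</noscript>
```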
-
I think I missed your point here. The page isn't naturally suited to infinite scrolling, in your opinion?
-
It's not an infinitely scrolling website. I'm going to drown myself now.
-
Travis: a slightly different, but related, question: the written content you see at the lower part of the URL I want to rank for used to be on the other URL, and I recently moved it (no 301 redirect, since I still have the pictures and video on the other URL). Will Google over time accept the unique content on the URL I want to rank for and credit that URL fully, OR will Google notice the content was originally on the unimportant URL, so that I risk the URL that now has the content will not get any credit for it?
-
thx, Travis. The idea is not about being fancy: I do not want infinite scrolling. It comes down to me wanting to move a lot of great pictures and a video to this page that I want to rank for:
http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
…and here are the pictures and video: http://www.honoluluhi5.com/waikiki-condos-real-estate/
The latter page means nothing to me, except as nice pictures and a video for the user. However, if I placed them under the written content on the first URL, it would add extra "juice" of quality content to that page, and long-term I would rank that much better. However, those pictures would tremendously slow loading, and that is the issue…
-
I would say don't use infinite scrolling - not yet. Designers don't understand; they want everything to be fancy, and Google isn't terribly ready for fancy yet.
At this point, I think infinite scroll is a horrible thing that needs to be shot in the face.
"Hey guys, let's load the entire site - all of the bells and whistles at once!"
That can really mess with page load speed. So what about time to first byte? It doesn't matter if the first byte appears at the speed of light if you're loading 450 MB.
If the Webmaster Central Blog didn't answer your question, you're pretty well on your own.