Infinite Scrolling: how to index all pictures
-
I have a page where I want to upload 20 pictures that are in a slideshow. The idea is that the pictures will only load when users scroll down the page (otherwise the page is too heavy to load). I see documentation on how to make infinite scrolling work and ensure search engines index all the content. However, I do not see any documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get search engines to index all such pictures when they only show as users scroll down the page. This is the documentation I am already familiar with, and it does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
Thank you
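For reference, this is roughly the lazy-load pattern those plugins use - just a sketch assuming the Lazy Load plugin's data-original markup, with placeholder file names and alt text; as I understand it, the noscript fallback is what gives a crawler that doesn't run JavaScript a plain img tag to index:

```html
<!-- Real image URL is held in data-original; a tiny placeholder loads first -->
<img class="lazy" data-original="/photos/slide-01.jpg" src="/img/placeholder.gif"
     alt="Waikiki beachfront condo view" width="1200" height="800">

<!-- Fallback for clients that don't run JavaScript (including some crawlers) -->
<noscript>
  <img src="/photos/slide-01.jpg" alt="Waikiki beachfront condo view"
       width="1200" height="800">
</noscript>

<script src="/js/jquery.min.js"></script>
<script src="/js/jquery.lazyload.min.js"></script>
<script>
  // Swap in the real image once it scrolls within 200px of the viewport
  $(function () {
    $("img.lazy").lazyload({ threshold: 200 });
  });
</script>
```

My question is whether a setup like this is really enough for Google to index all 20 slideshow images.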
-
Hi Pete, I just wanted to confirm, based on what you wrote:
"I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.)."I did following:
- Deleted the words "Home" and "Condo" from the title tag and H1, so the neighborhood name is still in the title tag and H1, but there is no mention of home, condo, real estate, etc.
- All written content has been moved from the "guides" (where the pictures and videos are) to the lower part of the MLS result pages, and I imagine that over a 1-2 month period the MLS result pages will get the SEO credit for this unique written content (despite no 301 redirect)
- I interlink from the picture/video pages to the MLS result pages with the anchor text "neighborhood homes for sale"
My hypothesis is that over the next few months, as Google gets a better idea of my website (the site is still only 5 months old and getting more popular), Google will know to rank it for "neighborhood homes for sale" search terms.
Makes sense?
-
That's right - zero search value. Maybe I can simply change the title tag, H1, etc.: get rid of the keyword (e.g., "Honolulu") and instead call it something like "Gallery 1". That way I can keep the structure without diluting the ranking potential of the MLS result pages?
-
I generally wouldn't NOINDEX something that's part of your navigation structure, unless it's a deep layer (and you want to cut off anything "below" it). If you're concerned that they don't have search value, I'd consider consolidating somehow, which I thought was the general plan from the original question. I just don't know that you need all of the content or to get too complicated with the consolidation.
-
Interesting, thx. Can I do the following: add "noindex, follow" to those guide pages? That way they won't compete with the MLS result pages, which they currently do. The issue is that all that great, unique picture and video content won't be indexed by Google… maybe not a big issue?
-
Yeah, I don't think the picture- and video-heavy pages are going to rank all that well by themselves. It's just a question of whether those additional pages are diluting your MLS listing pages (by using similar regional keywords, etc.).
At the scale of a large site, it's hard to tell without understanding the data, including where your traffic is coming from. If it's producing value (traffic, links, etc.), great. If not, then you may want to revisit whether those pages are worth having and/or can be combined somehow. I don't think "combined" means everything on both pages gets put onto one mega-page - you could pick and choose at that point.
-
thx, Pete. The guides are more for users who are curious about pictures and videos - not something I care about ranking for. Ex: http://www.honoluluhi5.com/waikiki-condos-real-estate/
The MLS result pages are my life, and I moved a lot of written content to them to add unique content. Ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ (you will see the unique content below the map and the MLS thumbnail pictures).
I feel this layout is ideal long-term. I link from the guide (as you can see above) to the corresponding MLS result page. Hope this makes sense.
-
That depends on a lot of factors. Consolidating those to one page has advantages, SEO-wise, but you're losing the benefits of the photo page. I lean toward consolidation, but it really depends on how the pages are structured in the navigation, what sort of content and meta-data they have, etc. I'm not clear on what's left on Page A currently, but the biggest issue is probably dilution from the extra pages. Since there are "guide" pages, though, I'm not sure how they fit your site architecture. To remove 200 of them, you may need to also rethink your internal link structure.
-
thx a lot. "Viewing it as manipulative" - it makes sense. I will certainly refrain from doing so.
I keep saying "last question," but this should be it: moving some written content from Page A to Page B (yet keeping Page A, just with less content remaining on it) is OK, and after a while it will be viewed as Page B's original content, with Page B getting the SEO credit? This is done without a 301 redirect, since Page A is still a page with pictures that are original and unique, and I want Google to index all those pictures - just that a chunk of unique written content was moved from Page A to Page B. I have moved written content from about 200 different guide-type pages to 200 MLS result pages, as it makes more sense to have it there. Would it be safer to include the 301 redirect and simply lose the picture indexing, to play it safe?
-
That's a trick that used to occasionally work, but there's been no evidence for it in the past couple of years. Google has gotten pretty good at understanding how pages are rendered and is no longer completely dependent on source-code order. In some cases, they may even view it as manipulative.
-
thx. One last slightly different, but related, question: What is your view on placing written content above other content in the source code, while on the rendered page the written content displays below the other content? In my case: MLS thumbnail pictures and descriptions (the same as on other realtors' websites) show at the top of the page, and as users scroll down they see a lot of my unique, original written content. Search engines like written content higher on the page, so would it be a good idea to place the written content above the MLS data in the source code, even though on the page it will still display below the MLS data? (A rough sketch of what I mean is below.)
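Here is a minimal sketch of the kind of source-order swap I am asking about - the class names are just placeholders, and it assumes a flexbox layout where the CSS order property controls the visual position:

```html
<style>
  .page { display: flex; flex-direction: column; }
  /* Written content comes first in the source but is displayed second */
  .written-content { order: 2; }
  .mls-results     { order: 1; }
</style>
<div class="page">
  <section class="written-content">Unique neighborhood write-up…</section>
  <section class="mls-results">MLS thumbnails and listing descriptions…</section>
</div>
```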
-
I don't think the risk of harm, done right, is high, but: (1) it's easy to do wrong, and (2) I suspect the benefits are small at best. I think your time/money is better spent elsewhere.
-
thank you very much. The idea was to move a lot of great pictures from a "gallery" to a page I want to rank for. The gallery page serves no purpose other than letting users see beautiful pictures and, obviously, letting Google index a lot of unique pictures. I guess I will leave the gallery as is and simply interlink from the gallery to the important page.
Your suggestion can be implemented (my web developers have already built it, just not deployed it). However, it sounds to me, if I read between the lines correctly, that there is a risk Google may misinterpret such an implementation, and this could potentially even hurt my site with duplicate content issues…
-
By assigning a URL to each virtual "page", done correctly, you allow Google to crawl the images. What Google is suggesting is that you then set up rel=prev/next between those pages. This tells them to treat all of the image URLs as a paginated series (like a multi-page article or search results).
My enterprise SEO friends have mixed feelings about rel=prev/next. The evidence of its effectiveness is limited, but what it's supposed to do is allow the individual pages (images, in this case) to rank while not looking like duplicate or near-duplicate content. The other option would be to rel=canonical these virtual pages, but then you'd essentially take the additional images out of ranking contention.
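As a rough illustration (the URLs here are hypothetical), each component URL in the series would declare its neighbors in the head, something like this:

```html
<!-- Hypothetical component page 2 of the gallery series -->
<head>
  <title>Waikiki Photo Gallery - Page 2</title>
  <link rel="prev" href="http://www.example.com/waikiki-gallery/page-1/">
  <link rel="next" href="http://www.example.com/waikiki-gallery/page-3/">
  <!-- A rel="canonical" pointing every component page at page 1 would instead
       consolidate the series and take the extra image pages out of contention -->
</head>
```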
This infinite scroll + pagination approach is VERY technical and the implementation is well beyond Q&A's scope (it would take fairly in-depth knowledge of your site). Honestly, my gut reaction is that the time spent wouldn't be worth the gain. Most users won't know to scroll, and having 10-20 pictures vs. just a few may not add that much value. The SEO impact would be relatively small, I suspect. I think there may be easier solutions that would achieve 90% of your goals with a lot less complexity.
-
Hi Pete,
There is no mechanism that will allow lots of different pictures in a slideshow to load only when users scroll to a certain part of a page, without slowing page speed, while still having all the pictures indexed by Google. If you can show me one example on the Internet that has a solution to this, I would love to see it. This is what is possible to create (not my website, just an example): http://diveintohtml5.info/examples/history/brandy.html - I can implement such a picture slideshow, which loads when users scroll down on my page, and notice how the URL changes for each picture (as you change pictures) while the rest of the content on the page stays the same (a rough sketch of this pattern follows the questions below). Now, the big questions are:
- Will the main (important) URL get the SEO credit for all these other URLs where each picture is located?
- Or, since each picture is on a different URL, will each URL get SEO credit separately, so the main URL gains nothing from these pictures from an SEO perspective?
- Since the written content is EXACTLY the same across each of these picture URLs, it will look like duplicate content - would it be good to use a canonical to make sure the main URL gets all the SEO credit?
- How would you place 20 unique, copyrighted pictures on a URL and make sure that URL gets the SEO credit, keeping in mind the pictures can ONLY load after users scroll to a certain point on the page, as the page will otherwise load too slowly?
Highly appreciate your thoughts on this, since experts say there is a solution, but I have yet to see one concrete piece of evidence.
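For clarity, this is roughly the History API pattern the brandy.html demo uses, and what my developers would build - just a sketch with made-up file names, where each "Next" click swaps the photo in place and pushes a new, crawlable URL:

```html
<div id="gallery">
  <img id="photo" src="/photos/waikiki-01.jpg" alt="Waikiki condo lanai view">
  <a id="next" href="/gallery/waikiki-02">Next photo</a>
</div>
<script>
  // Swap the photo and update the address bar with the History API,
  // so every picture still has its own URL a crawler can request directly.
  document.getElementById("next").addEventListener("click", function (e) {
    if (!window.history || !history.pushState) return; // older browsers: normal page load
    e.preventDefault();
    var href = this.getAttribute("href");               // e.g. /gallery/waikiki-02
    var slug = href.split("/").pop();                   // e.g. waikiki-02
    document.getElementById("photo").src = "/photos/" + slug + ".jpg";
    history.pushState({ slug: slug }, "", href);
  });
</script>
```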
-
There should be no real difference, in terms of Google's infinite scroll solution. If you can chunk the content into pages with corresponding URLs, you can put any source code on those pages - text and/or images, along with corresponding alt text, etc. Once you've got one solution implemented, it should work for any kind of HTML. Not sure why images would be different in this case.
There are also ways to create photo galleries that can be crawled, mostly using AJAX. It's complex, but here's one example/discussion:
-
CORRECTION: URL 1 and URL 2 are the opposite of what I described. In other words, I want to move pictures from 1) to 2). I already moved written content from 1) to 2).
-
On URL 1) http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/ you will see written content at the lower part of the page. This written content was originally on URL 2) http://www.honoluluhi5.com/oahu/honolulu-homes/. I moved it because URL 1) is the page I want to rank for, and 2) served more as a guide. I want to move the pictures from 2) to 1) as well and then add a 301 redirect. However, this is NOT possible, because if I place the pictures on 1) where users only see them after scrolling down to a certain point on the page, Google is not able to index all those pictures. The only way to index those pictures is to have them load when users land on the page, which would slow down the page and be a terrible user experience.
I am told there is a solution to get these pictures indexed, but so far no one has been able to present a concrete solution.
-
thank you, Pete.
- All images are my own and unique (ex: http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/)
- Infinite scrolling is what I plan to use, otherwise loading will be too slow. The issue: when a user scrolls and the pictures load, how do I set it up so those images are indexed by Google? For written content it is easy to get the content indexed by Google with infinite scrolling. However, with images there seems to be no solution. In other words: if a URL has 10 images that only show after users scroll down to the lower part of a given page, then those 10 images will not be indexed by Google and the page will not get the SEO credit. Any solution to this? These sources deal with the infinite scrolling and indexing issue, but do not apply to images:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
-
Keep in mind that just adding 20 images/videos to this page isn't going to automatically increase its quality. Images carry limited text that Google can crawl, and unless they're unique images that you own, they'll potentially be duplicated across the web. If adding those 20 images slows down the page a lot, that could actually harm your SEO and usability.
-
Unfortunately, the short answer is that it depends entirely on your implementation - specifically, whether the images are loaded all at once and merely revealed by scrolling, or loaded as you scroll. The latter is essentially what "infinite scrolling" is - it's generally not actually infinite, but scrolling will cause load events until there's nothing left to load.
The key is that the content has to be crawlable somehow and can't only be triggered by the event, or Google won't see it. So, if you're going to load as you go, the infinite scrolling posts should apply. If the images are pre-loaded, then you shouldn't have a problem, but I'd have to understand the implementation better.
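To illustrate the distinction with a simplified sketch (placeholder paths, not your actual markup): in the first case every image is already in the HTML and scrolling just reveals it, so a crawler sees it all; in the second, the markup doesn't exist until a scroll event fires, so a crawler that never scrolls never sees it:

```html
<!-- Case 1: pre-loaded - all images are in the source, scrolling just reveals them -->
<div class="slideshow">
  <img src="/photos/pic-01.jpg" alt="Honolulu skyline">
  <img src="/photos/pic-02.jpg" alt="Diamond Head at dusk">
</div>

<!-- Case 2: loaded as you scroll - the image markup is injected by a scroll handler -->
<div id="slideshow"></div>
<script>
  window.addEventListener("scroll", function onScroll() {
    var box = document.getElementById("slideshow");
    if (window.pageYOffset + window.innerHeight > box.offsetTop) {
      box.innerHTML = '<img src="/photos/pic-01.jpg" alt="Honolulu skyline">';
      window.removeEventListener("scroll", onScroll);
    }
  });
</script>
```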
-
I missed your point here. The page does not naturally suit infinite scrolling, in your opinion?
-
It's not an infinitely scrolling website. I'm going to drown myself now.
-
Travis: a slightly different, but related, question: The written content you see at the lower part of the URL I want to rank for used to be on the other URL, and I recently moved it (no 301 redirect, since I still have the pictures and video on the other URL). Will Google over time accept the unique content on the URL I want to rank for and credit that URL fully, OR will Google notice that the content was originally on the less important URL, so that I risk the URL that now has the content not getting any credit for it?
-
thx, Travis. The idea is not about being fancy: I do not want infinite scrolling. It comes down to me wanting to move a lot of great pictures and a video to this page that I want to rank for:
http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
…and here are the pictures and video: http://www.honoluluhi5.com/waikiki-condos-real-estate/ The latter page means nothing to me, except as nice pictures and video for the user. However, if I placed that content under the written content on the 1st URL, it would add extra "juice" of quality content to that page and I would rank that much better long-term. However, those pictures would tremendously slow loading, and that is the issue…
-
I would say don't use infinite scrolling - not yet. Designers don't understand; they want everything to be fancy. Google isn't terribly ready for fancy yet.
At this point, I think infinite scroll is a horrible thing that needs to be shot in the face.
"Hey guys, let's load the entire site - all of the bells and whistles at once!"
That can really mess with page load speed. And what about time to first byte? It doesn't matter if the first byte arrives at the speed of light if you're loading 450 MB.
If the Webmaster Central Blog didn't answer your question, you're pretty well on your own.