How does Infinite Scrolling work with unique URLs as users scroll down? And is this SEO friendly?
-
I was on a site today, and as I scrolled down past the top post and viewed the posts below it, I noticed that each of those lower posts had its own unique URL. I had not seen this before and was curious whether this method of infinite scrolling is SEO friendly. Will Google's spiders scroll down and index these lower posts? The URLs of these lower posts, by the way, were the same URLs I would have seen if I had clicked on each post directly. Google's preferred method for infinite scrolling seems to be something different: https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html
I welcome all insight. Thanks!
Christian
-
Thanks again!!
-
Yes! You asked, "So if I understand correctly, Google will index just the first post?" There's no way to guarantee what Google will or won't do, but that is probably what will happen.
-
Each of the lower posts does have its own URL. As you noted above, that unique URL shows up as the user scrolls lower, but there are links to these URLs from the main nav too.
-
Google will probably only count the content of the first post (or however much content displays at initial page load) when ranking and indexing that infinite-scroll page, yes. So if you want the rest of that content in the index, I'd give it its own URLs. However, Google is getting better at JavaScript and is always unpredictable, so it's not beyond the realm of possibility that it will index more of the infinite-scroll page's content than initially loads. Don't be too surprised if you see that, but I wouldn't count on it.
-
Thanks, Ruth! I greatly appreciate your help.
So if I understand correctly, Google will index just the first post? Since the lower posts all have their own unique URLs, I assume Google will just index those as it crawls (and of course it's always wise to have a sitemap, e.g. something like the sketch below).
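For reference, generating a sitemap that lists each post's unique URL can be very simple. Here's a rough TypeScript sketch; the base URL and post paths are made up for illustration:

```typescript
// Sketch: build a sitemap listing each post's unique URL so Google can
// discover the lower posts even if it never triggers the infinite scroll.
// The base URL and paths below are hypothetical.
const postPaths = [
  "/posts/first-post",
  "/posts/second-post",
  "/posts/third-post",
];

function buildSitemap(baseUrl: string, paths: string[]): string {
  const urlEntries = paths
    .map((path) => `  <url><loc>${baseUrl}${path}</loc></url>`)
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urlEntries,
    "</urlset>",
  ].join("\n");
}

console.log(buildSitemap("https://www.example.com", postPaths));
```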
-
Hi Christian,
What you're seeing is exactly what Google recommends for infinite scroll in the resource you linked to. It breaks the page up into component resources (separate URLs), each of which can be accessed on its own. Google's examples use dynamic parameters to break the page up (e.g. ?page=2), but if your infinite- or long-scrolling page isn't paginated content, there's no reason each component can't have its own URL that surfaces as you scroll down.
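To make that concrete, here's a minimal front-end sketch of how the URL could be swapped as each component scrolls into view. The data-url attribute and the paths are assumptions for illustration, not something prescribed by the Google resource:

```typescript
// Sketch: update the address bar as each post scrolls into view.
// Assumes each post is marked up as <section data-url="/posts/some-post">
// (the attribute name and paths are hypothetical).
const posts = document.querySelectorAll<HTMLElement>("section[data-url]");

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const url = (entry.target as HTMLElement).dataset.url;
      // replaceState swaps the URL without piling up a history
      // entry for every scroll step.
      if (url && url !== location.pathname) {
        history.replaceState(null, "", url);
      }
    }
  },
  { threshold: 0.5 } // fire once at least half the post is visible
);

posts.forEach((post) => observer.observe(post));
```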
I actually really like this method as a compromise between the "one long page with all the information on it" approach to web design and the "landing pages for people looking for specific bits of information" approach to SEO. For example, I often have SaaS clients who want all the information about what their product does on one long page. This is great for people who want to research the whole product at once, but it makes it hard for me to optimize for keywords pertaining to individual features of the product. The solution is to have separate landing pages that talk about specific features, all linked together in one "product" page that scrolls using the methodology outlined in the Google resource you linked to. Plus, people who are looking for just one feature arrive on a page that's about that feature, instead of having to scroll to find what they're looking for.
With infinite scroll, Google is usually only going to crawl and index what is available to the user before more of the page loads. So if you want Google to crawl and index all of the content on your infinite-scroll page, this is the way to do it. It's also better for users who don't have JavaScript enabled, since each chunk of content still lives at a plain URL. I hope that makes sense - let me know if you have more questions!
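As a rough sketch of that progressive-enhancement idea, the "load more" behavior can be layered on top of an ordinary paginated link, so crawlers and no-JavaScript users still get plain URLs. The selectors and markup here are assumptions, not Google's prescribed code:

```typescript
// Sketch: enhance a plain <a class="next-page" href="/posts?page=2">Next</a>
// link into infinite scroll. Without JavaScript the link still works as
// normal pagination, which is what keeps each chunk crawlable on its own URL.
async function loadNextPage(link: HTMLAnchorElement): Promise<void> {
  const response = await fetch(link.href);
  const html = await response.text();
  const nextDoc = new DOMParser().parseFromString(html, "text/html");

  // Move the next page's posts into the current list
  // (the #post-list container is a made-up id for illustration).
  const container = document.querySelector("#post-list");
  const incoming = nextDoc.querySelector("#post-list");
  if (container && incoming) {
    container.append(...incoming.children);
  }

  // Re-point the link at the following page, or drop it at the end.
  const nextLink = nextDoc.querySelector<HTMLAnchorElement>("a.next-page");
  if (nextLink) {
    link.href = nextLink.href;
  } else {
    link.remove();
  }
}

const next = document.querySelector<HTMLAnchorElement>("a.next-page");
if (next) {
  next.addEventListener("click", (event) => {
    event.preventDefault();
    void loadNextPage(next);
  });
}
```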
-
Check pymnts.com for an example.
-
I'm sorry, but I haven't understood the question. What do you mean by "unique URLs"? Can you post a link to the website you're describing?