How to solve JavaScript paginated content for SEO
-
On our blog listings page, we limit the number of posts visible on the page to 10. However, all of the posts are loaded in the HTML of the page, and pagination links are added at the bottom.
Example page: https://tulanehealthcare.com/about/newsroom/
When a user clicks to the next page, the script simply filters the content on the same page for the next group of postings and displays those to the user. Nothing in the HTML or the URL changes; this is all done via JavaScript.
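For illustration, the pattern described above might look something like this minimal TypeScript sketch (the markup classes, `data-page` attribute, and page size here are hypothetical, not the actual site code):

```typescript
// All <li class="post"> items are present in the HTML from the start;
// this script only toggles which ten are visible. Neither the HTML
// nor the URL changes when the user pages through.
const PAGE_SIZE = 10;

function showPage(page: number): void {
  const posts = Array.from(document.querySelectorAll<HTMLElement>("li.post"));
  posts.forEach((post, i) => {
    const onThisPage = Math.floor(i / PAGE_SIZE) === page - 1;
    post.style.display = onThisPage ? "" : "none";
  });
}

// Pagination links at the bottom, e.g. <a class="page-link" data-page="2">
document.querySelectorAll<HTMLAnchorElement>("a.page-link").forEach(link => {
  link.addEventListener("click", event => {
    event.preventDefault(); // stay on the same URL
    showPage(Number(link.dataset.page));
  });
});

showPage(1);
```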
So the question is: does Google consider this hidden content, since all listings are in the HTML but only a handful of them are visible on the page at any time?
Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination?
If this is indeed a problem, we have two possible solutions:
- Not building the HTML for the later pages until the user clicks 'next'.
- Adding parameters to the URL to show that the content has changed (see the sketch below).
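As a rough sketch of the second option (hypothetical: the `page` parameter name is made up, and `showPage` is the filtering function from the sketch above), the History API can give each filtered view its own URL without a full page load:

```typescript
// Reuse the filtering function from the earlier sketch.
declare function showPage(page: number): void;

// Give each page state a distinct, shareable URL, e.g. /about/newsroom/?page=2.
function goToPage(page: number): void {
  showPage(page);
  const url = new URL(window.location.href);
  url.searchParams.set("page", String(page));
  history.pushState({ page }, "", url);
}

// Read the parameter on load (and on back/forward) so deep links work.
function currentPage(): number {
  const param = new URL(window.location.href).searchParams.get("page");
  return Math.max(1, Number(param) || 1);
}

window.addEventListener("popstate", () => showPage(currentPage()));
showPage(currentPage());
```

Note that `pushState` alone doesn't create separate crawlable documents; the server would still need to return the right listings when one of those URLs is requested directly.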
Any other solutions that would be better for SEO?
-
Thanks for the thorough response. I was leaning toward leaving it alone for the time being, and this helps affirm my decision. I don't think we'd see much benefit from tampering with it to make it more Googlebot-friendly.
-
It will be strongly devalued, and the links may not be noticed or seen at all. Googlebot can leverage headless browsers (something similar to Selenium or Windmill in Python, with targeting handled via XPath). The catch is that this takes far longer than basic source-code scraping: rendering the modified source with a headless browser can take 5-10 seconds instead of less than 1 second.
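To illustrate the gap (purely as a sketch of the two scraping styles, not of Googlebot's actual pipeline), here is what the difference looks like in TypeScript, assuming Node 18+ for the global `fetch` and the `selenium-webdriver` package:

```typescript
import { Builder, By } from "selenium-webdriver";
import { Options } from "selenium-webdriver/chrome";

const url = "https://tulanehealthcare.com/about/newsroom/";

async function compare(): Promise<void> {
  // 1) Basic source-code scrape: one HTTP request, typically well under a second.
  const raw = await (await fetch(url)).text();
  console.log("raw HTML length:", raw.length);

  // 2) Headless-browser scrape: boot a browser, run the page's JavaScript,
  //    then read the modified DOM -- often 5-10 seconds per page.
  const driver = await new Builder()
    .forBrowser("chrome")
    .setChromeOptions(new Options().addArguments("--headless=new"))
    .build();
  try {
    await driver.get(url);
    // Target the rendered listing links via XPath, as mentioned above.
    const links = await driver.findElements(
      By.xpath("//a[contains(@href, 'newsroom')]")
    );
    console.log("rendered listing links found:", links.length);
  } finally {
    await driver.quit();
  }
}

compare().catch(console.error);
```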
Since Google's mission is to 'index the web', you have to assume they wouldn't take this colossal efficiency hit all the time, or for everyone. Looking at the results of many sites and their different builds, that's exactly what I see. Just because Google can, that doesn't mean Google will on every crawl and every website.
Some very large websites rely on such technologies, but they're usually household-name sites with a unique value proposition and strong cultural trust signals for their audience. If you're not a titan of industry, you're likely not one of the favoured few who get such special treatment from Googlebot so regularly.
This is an interesting post to read:
https://medium.com/@baphemot/whats-server-side-rendering-and-do-i-need-it-cb42dc059b38
... you may also have the option of building the HTML on the server side and serving it at different URLs. To me this sounds like a case where SSR might be the best option. That way you can still use your existing technologies (which are FAST) to render the modified HTML, but do the rendering on the server and then serve the resulting static HTML to users. That's personally where I would start looking, as it keeps the best of both worlds.
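A minimal sketch of what that could look like, assuming a Node/Express server and a made-up `getPosts()` data helper (this is illustrative, not the site's actual stack): each page of listings gets its own crawlable URL, and the HTML is complete before it leaves the server.

```typescript
import express from "express";

interface Post { title: string; url: string; }

// Stand-in for a real CMS or database query.
async function getPosts(): Promise<Post[]> {
  return Array.from({ length: 35 }, (_, i) => ({
    title: `Post ${i + 1}`,
    url: `/about/newsroom/post-${i + 1}/`,
  }));
}

const PAGE_SIZE = 10;
const app = express();

// Each page of listings lives at its own URL, e.g. /about/newsroom/page/2,
// so crawlers and users can link to it directly.
app.get("/about/newsroom/page/:n", async (req, res) => {
  const page = Math.max(1, parseInt(req.params.n, 10) || 1);
  const posts = await getPosts();
  const slice = posts.slice((page - 1) * PAGE_SIZE, page * PAGE_SIZE);

  // The listing HTML is fully rendered server-side; no client-side
  // JavaScript is required for the links to be visible to a crawler.
  const items = slice
    .map(p => `<li><a href="${p.url}">${p.title}</a></li>`)
    .join("\n");
  res.send(`<!doctype html><html><body><ul>${items}</ul></body></html>`);
});

app.listen(3000);
```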
Implementation could be costly though!
I don't think you'd get accused of cloaking, but that doesn't change the fact that part of your site's architecture will be invisible to Google 90% of the time, which is not very good for SEO (at all).
Another option: instead of building the post listings on page-load (which will cause stutter between pages), load all of them at once in the source code and use JavaScript to handle the visual navigation (from page to page) only. Let JS handle the visual effect, but keep all listings in the HTML from the get-go. That can work fine too, but maybe SSR would be better for you (I don't know).
...
After looking at your source code, it seems you have already done this. The only real problem would be if the links themselves were created through the JS, which they are not (they are all present in your unmodified source code). Yes, things which begin hidden are slightly devalued (but not completely). This might impact you slightly, but to be honest I don't think separating the listings out and making the pages load entirely separately would be much better. It would help architectural internal indexation slightly, but would likely hamper content-loading speeds significantly.
Maybe think about the SSR option. You might get the best of both worlds: keep the JS intact whilst also allowing deep-linking of paginated content (which is currently impossible; you can't link to page 2 of the results).
Let me know if you have previously thought about SSR.