How to handle JavaScript-paginated content for SEO
-
On our blog listings page, we limit the number of posts visible on the page to 10. However, all of the posts are loaded in the page's HTML, and pagination links are added at the bottom.
Example page: https://tulanehealthcare.com/about/newsroom/
When a user clicks to the next page, the script simply filters the content on the same page down to the next group of postings and displays those to the user. Nothing in the HTML or the URL changes. This is all done via JavaScript.
So the question is: does Google consider this hidden content, since all of the listings are in the HTML but only a handful of them are visible on the page at a time?
Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination?
If this is indeed a problem, we have two possible solutions:
- only building the HTML for the next pages once the user clicks 'next', or
- adding parameters to the URL to show that the content has changed (roughly sketched below).
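Something like this is roughly what we had in mind for the second option (the function and parameter names here are purely illustrative, not our actual code):

```typescript
// Illustrative sketch only: keep the existing JS filtering, but also write the page
// number into the URL so each "page" of listings has its own address.
// `filterListings` stands in for the site's existing filtering code (assumed, not shown).
function goToPage(page: number): void {
  const url = new URL(window.location.href);
  url.searchParams.set('page', String(page));       // e.g. /about/newsroom/?page=2
  history.pushState({ page }, '', url.toString());  // URL changes without a reload
  // ...then call the existing filtering code, e.g. filterListings(page);
}

// Handle back/forward and deep links like ?page=3.
window.addEventListener('popstate', () => {
  const page = Number(new URL(window.location.href).searchParams.get('page') ?? '1');
  console.log('would re-filter listings for page', page); // placeholder for filterListings(page)
});
```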
Any other solutions that would be better for SEO?
-
Thanks for the thorough response. I was leaning toward leaving it alone for the time being, and this helps affirm my decision. I don't think we're going to see much benefit from tampering with it to make it more Googlebot-friendly.
-
It will be strongly devalued, and the links may or may not be noticed or seen at all. Googlebot can leverage headless browsers (something similar to Selenium or Windmill in Python, with targeting handled via XPath), but this takes far longer than basic source-code scraping. Scraping the rendered (modified) source with a headless browser can take 5-10 seconds instead of less than 1 second.
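To make the cost difference concrete, here's a rough sketch of the two kinds of fetch (Puppeteer stands in for whatever headless setup a crawler might use; the timings are illustrative, and it assumes Node 18+ for the global fetch):

```typescript
import puppeteer from 'puppeteer';

// Illustrative comparison: a plain source-code fetch vs. a full headless render.
const url = 'https://tulanehealthcare.com/about/newsroom/';

async function compare(): Promise<void> {
  // 1) Basic source-code scrape: one HTTP request, typically well under a second.
  let start = Date.now();
  const rawHtml = await (await fetch(url)).text();
  console.log(`raw fetch: ${rawHtml.length} bytes in ${Date.now() - start}ms`);

  // 2) Headless render: launch a browser, let the JS run, then read the modified DOM.
  start = Date.now();
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for scripts to settle
  const renderedHtml = await page.content();           // the post-JS ("modified") source
  await browser.close();
  console.log(`rendered: ${renderedHtml.length} bytes in ${Date.now() - start}ms`);
}

compare().catch(console.error);
```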
Since Google's mission is to 'index the web', you have to assume they wouldn't take this colossal efficiency hit all the time, or for everyone. Looking at the results of many sites and their different builds, that's exactly what I see. Just because Google can, that doesn't mean Google will on every crawl of every website.
Some very large websites rely on such technologies, but usually they're household-name sites with a unique value proposition and strong trust signals for their audience. If you're not a titan of industry, then you're likely not one of the favoured few who gets such special treatment from Googlebot so regularly.
This is an interesting post to read:
https://medium.com/@baphemot/whats-server-side-rendering-and-do-i-need-it-cb42dc059b38
... you may also have the option of building the HTML on the server side and then serving it at different URLs to the user. To me it sounds like a case where SSR might be the best option. That way you can still use your existing technologies (which are fast) to render the modified HTML, but render it on the server side and then serve the static, post-render HTML to users. That's personally what I would start looking at, as it keeps the best of both worlds.
Implementation could be costly though!
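For what it's worth, here's a minimal sketch of the SSR idea (assuming an Express server; the route, markup and the getPosts helper are placeholders for your real stack, not a drop-in implementation):

```typescript
import express from 'express';

// Hypothetical stand-in for the real data source (CMS, database, etc.).
interface Post { title: string; url: string; }
function getPosts(page: number, pageSize = 10): Post[] {
  const all: Post[] = []; // ...load the full list of posts here
  return all.slice((page - 1) * pageSize, page * pageSize);
}

const app = express();

// Each "page" of listings gets its own URL and is delivered as plain, crawlable HTML.
app.get('/about/newsroom/page/:page', (req, res) => {
  const page = Math.max(1, Number(req.params.page) || 1);
  const items = getPosts(page)
    .map(p => `<li><a href="${p.url}">${p.title}</a></li>`)
    .join('');
  res.send(`<ul>${items}</ul><a href="/about/newsroom/page/${page + 1}">Next</a>`);
});

app.listen(3000);
```

Your client-side JS could still take over after the first load if you want to keep the fast page-to-page navigation.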
I don't think you'd get accused of cloaking, but that doesn't change the fact that part of your site's architecture will be invisible to Google 90% of the time, which is not very good for SEO (at all).
Another option: instead of building all the post listings via JavaScript on page-load (which will cause stutter between pages), load all of them at once in the source code and use the JavaScript only to handle the visual navigation from page to page. Let JS handle the visual effect, but keep all the listings in the HTML right from the get-go (sketched below). That can work fine too, but maybe SSR would be better for you (I don't know).
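Roughly what I mean, as a sketch (the selector and page size are guesses about your markup):

```typescript
// Sketch only: every listing is already in the server-delivered HTML; this script
// merely hides and shows slices of them, so nothing is created client-side.
const PAGE_SIZE = 10;                                      // assumed page size
const listings = Array.from(
  document.querySelectorAll<HTMLElement>('.blog-listing')  // assumed class name
);

function showPage(page: number): void {
  listings.forEach((item, i) => {
    const visible = Math.floor(i / PAGE_SIZE) === page - 1;
    item.style.display = visible ? '' : 'none';            // hidden, but still in the source
  });
}

showPage(1);
```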
...
After looking at your source code, it seems you have already done this. The only real problem would be if the links themselves were 'created' through the JS, which they are not (they are all present in your unmodified source code). Yes, things which begin hidden are slightly devalued (but not completely). This might impact you slightly, but to be honest I don't think separating them out and making the pages load entirely separately would be much better. It would help architectural internal indexation slightly, but would likely hamper content-loading speeds significantly.
Maybe think about the SSR option. You might get the best of both worlds: you could keep the JS intact whilst also allowing deep-linking of paginated content (which is currently impossible; you can't link to page 2 of the results).
Let me know if you have previously thought about SSR.