How to handle JavaScript-paginated content for SEO
-
On our blog listings page, we limit the number of posts visible on the page to 10. However, all of the posts are loaded in the HTML of the page, and page links are added at the bottom.
Example page: https://tulanehealthcare.com/about/newsroom/
When a user clicks to the next page, the script simply filters the content on the same page to show the next group of postings. Nothing in the HTML or the URL changes; this is all done via JavaScript.
So the question is: does Google consider this hidden content, since all of the listings are in the HTML but only a handful of them are visible on the page?
Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination?
If this is indeed a problem, we have two possible solutions:
- not building the HTML for the next pages until the user clicks 'next'
- adding parameters to the URL to reflect that the content has changed (see the sketch below)
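For context, roughly what we had in mind for the second option, as a minimal sketch (the selector, class name, and page size here are illustrative assumptions, not our actual code):

```javascript
// Minimal sketch of option 2 -- selectors, class names, and page size
// are assumptions for illustration, not taken from the real site.
const PAGE_SIZE = 10;
const posts = document.querySelectorAll('.post-listing');

function showPage(page) {
  posts.forEach((post, i) => {
    // Show only the listings that belong to the requested page.
    post.style.display = Math.floor(i / PAGE_SIZE) + 1 === page ? '' : 'none';
  });
}

function goToPage(page) {
  showPage(page);
  // Give each page state its own shareable, crawlable URL.
  history.pushState({ page }, '', '?page=' + page);
}

// On load, honour a ?page= parameter so deep links land on the right page.
showPage(Number(new URLSearchParams(location.search).get('page')) || 1);

// Keep the browser's back/forward buttons working between page states.
window.addEventListener('popstate', function (e) {
  showPage((e.state && e.state.page) || 1);
});
```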
Any other solutions that would be better for SEO?
-
Thanks for the thorough response. I was leaning toward leaving it alone for the time being, and this helps affirm my decision. I don't think we would see much benefit from tampering with it just to make it more Googlebot-friendly.
-
It will be strongly devalued, and the links may or may not be noticed at all. Googlebot can leverage headless browsers (something similar to Selenium or Windmill in Python, with targeting handled via XPath). The catch is that this takes far longer than basic source-code scraping: scraping the rendered source with a headless browser can take 5-10 seconds instead of less than 1 second.
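To put rough numbers on that difference, here is a sketch in Node (Puppeteer here is just an assumed stand-in; whatever renderer Google actually uses is proprietary):

```javascript
// Sketch comparing the two crawl strategies. Puppeteer is an assumed
// stand-in for a headless renderer; timings are illustrative.
const puppeteer = require('puppeteer');

// Basic source-code scrape: one HTTP request, usually well under a second.
async function plainScrape(url) {
  const res = await fetch(url); // fetch is built into Node 18+
  return res.text();
}

// Headless-browser scrape: boot a browser, run the page's JavaScript,
// wait for the network to settle -- easily 5-10 seconds per page.
async function renderedScrape(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // the JS-modified DOM, not the raw source
  await browser.close();
  return html;
}
```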
Since Google's mission is to 'index the web', you have to assume they won't take this colossal efficiency hit all the time, or for everyone. Certainly, looking at the results of many sites and their different builds, that's exactly what I see. Just because Google can, that doesn't mean Google will on every crawl of every website.
Some very large websites rely on such technologies, but usually they're household-name sites with strong trust signals for their audience. If you're not a titan of industry, you're likely not one of the favoured few who get such special treatment from Googlebot so regularly.
This is an interesting post to read:
https://medium.com/@baphemot/whats-server-side-rendering-and-do-i-need-it-cb42dc059b38
... you may also have the option of building the HTML on the server side and then serving it at different URLs. To me it sounds like a case where SSR (server-side rendering) might be the best option. That way you can still use your existing technologies (which are FAST) to build the modified HTML, but do the rendering on the server and serve the resulting static HTML to users. That's personally what I would start looking at, as it keeps the best of both worlds.
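To make that concrete, a bare-bones sketch with Express (the route, markup, and data source are all assumptions; your actual stack and templates would differ):

```javascript
// Bare-bones SSR sketch: each page of listings gets its own URL and is
// rendered to finished HTML on the server, so crawlers need no JS to see
// the content. The route and data source are assumptions.
const express = require('express');
const app = express();

const PAGE_SIZE = 10;
// Stub data source standing in for the real listings.
const posts = Array.from({ length: 100 }, (_, i) => ({
  title: 'Post ' + (i + 1),
  url: '/posts/' + (i + 1),
}));

app.get('/about/newsroom/page/:n', (req, res) => {
  const page = parseInt(req.params.n, 10) || 1;
  const slice = posts.slice((page - 1) * PAGE_SIZE, page * PAGE_SIZE);
  const items = slice
    .map((p) => '<li><a href="' + p.url + '">' + p.title + '</a></li>')
    .join('');
  // Plain anchor links between pages keep the pagination crawlable.
  res.send('<ul>' + items + '</ul>' +
           '<a href="/about/newsroom/page/' + (page + 1) + '">Next</a>');
});

app.listen(3000);
```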
Implementation could be costly though!
I don't think you'd get accused of cloaking, but that doesn't change the fact that part of your site's architecture will be invisible to Google 90% of the time, which is not good for SEO at all.
Another option: instead of building the post listings with JavaScript on page load (which can cause stutter between pages), load all of them at once in the source code and use JavaScript to handle only the visual navigation from page to page (much like the show/hide sketch in the question). Let JS handle the visual effect, but keep all listings in the HTML from the get-go. That can work fine too, but maybe SSR would be better for you (I don't know).
...
After looking at your source code, it seems you have already done this. The only real problem would be if the links themselves were created through the JS, which they are not (they are all present in your unmodified source code). Yes, things which begin hidden are slightly devalued (but not completely). This might impact you slightly, but to be honest I don't think separating the listings out into entirely separate page loads would be much better. It would help internal indexation of your architecture slightly, but would likely hamper content-loading speeds significantly.
Maybe think about the SSR option. You might get the best of both worlds: keep the JS intact whilst also allowing deep-linking of paginated content (which is currently impossible; you can't link to page 2 of the results).
Let me know if you have previously thought about SSR