Google Pagination Changes
-
With Google recently coming out and saying they're basically ignoring paginated pages, I'm reconsidering the link structure of our new, soon-to-launch ecommerce site (we're moving from an old site to a new one with an identical URL structure, minus a few 404s).
Currently our new site shows 20 products per page, but with this change any products on pages 2, 3 and so on will suffer, because Google treats each paginated page as an entirely separate page rather than an extension of the first.
The way I see it, I have one option: show every product in each category on page 1.
I have Lazy Load installed on our new website, so it only loads what fits on the user's screen and loads more products as they scroll down - but how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page load score because it doesn't know Lazy Load is in place? Or will it know and account for it?
Is there anything I'm missing?
-
It's likely that they will be valued a bit less, but the effects shouldn't be drastic. Even if you just had one massive page with all products on it, the ones at the top would likely get more juice anyway.
If it's a crazy big concern, think about a custom method to sort your products
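For example, something along these lines (the data and fields are made up) would push your strongest performers onto page 1, where they get the most visibility:

```python
# Rough sketch of a custom sort: rank by whatever matters to the business
# so the best products land on page 1. All data here is hypothetical.
products = [
    {"name": "Blue Widget 3000",  "sales_30d": 420, "margin": 12.5},
    {"name": "Red Widget 2000",   "sales_30d": 97,  "margin": 30.0},
    {"name": "Green Widget 1000", "sales_30d": 250, "margin": 8.0},
]

PER_PAGE = 20

ranked = sorted(products, key=lambda p: (p["sales_30d"], p["margin"]), reverse=True)
page_one = ranked[:PER_PAGE]   # the products that get the most link equity and visibility
```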
-
Thank you very much for taking the time to respond so eloquently.
> If all the products are visible in the base, non-modified source code (right click the page, then click "view source" - is the data there?) then there is a high likelihood that Google will see and crawl it.
I can confirm that each product does in fact appear in the source data, so as you say, Google will crawl it which is somewhat of a relief.
Does this then mean that, regardless of which page the products appear on, Google will simply ignore the pagination and treat each product the same?
The thing I am trying to avoid is products on pages 2, 3 and so on being valued less.
-
This is a great, technical SEO query!
What you have to understand is that whilst Google 'can' crawl JS, they often don't. They don't do it for just anyone, and even then they don't do it all of the time. Google's main mission is to 'index the web' - and on that front their index of the web's pages, whilst vast, is still far from complete.
Crawling JavaScript requires a headless browser (if you were using Python to script such a thing, you'd be using the Selenium or Windmill modules). A browser must open (even if it does so invisibly) and 'run' the JavaScript, which creates more HTML - and that HTML can only be crawled **after** the script has executed.
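For illustration, here's a minimal sketch of that headless-browser step in Python with Selenium - the URL is made up, and it assumes Chrome is available. The only point is that the full HTML exists only after the browser has run the scripts:

```python
# Minimal headless-crawl sketch (hypothetical URL; assumes Selenium + Chrome installed).
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")      # run the browser invisibly
driver = webdriver.Chrome(options=options)

driver.get("https://www.example-shop.com/widgets?page=1")
rendered_html = driver.page_source          # HTML available only after the scripts have run
driver.quit()

print(len(rendered_html))                   # the rendered DOM, ready to be parsed/crawled
```

That's the extra machinery (and time) every JS-dependent page asks of a crawler.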
On average this takes around 10x longer than scraping the basic, non-modified source code. Ask yourself: would Google take a 10x efficiency hit, on an already incomplete mission, for 'everyone' on the web? The answer is no (I see evidence of this every day across many client accounts).
Let's answer your question. If all the products are visible in the base, non-modified source code (right click the page, then click "view source" - is the data there?) then there is a high likelihood that Google will see and crawl it.
If the data (code) only exists when you right click and inspect element - and not in "view source" - then it only exists in the 'modified' source code (not the base source). In that scenario, Google would be extremely unlikely to crawl it (or at least to crawl it consistently). If it's a very important page on a very important site (Coca-Cola, M&S, Barclays, Santander) then Google may go further.
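If you want to approximate the "view source" check in code, a rough sketch (the requests library, URL and product name are all assumptions for illustration):

```python
# Fetch the base, non-modified source code - roughly what "view source" shows.
import requests

url = "https://www.example-shop.com/widgets?page=1"   # hypothetical category page
product_name = "Blue Widget 3000"                     # hypothetical product

base_html = requests.get(url, timeout=10).text

if product_name in base_html:
    print("Product is in the base source code - crawlable without JS rendering")
else:
    print("Product only appears in the modified (rendered) source - it relies on Google rendering the page")
```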
For most of us, the best solution is to get the data we want crawled into the non-modified source code. This can be achieved by using JS only for visual changes (not for building the page structure) or by adopting SSR (Server Side Rendering).
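As a rough sketch of the SSR idea (Flask here purely for illustration, with made-up product data - not a recommendation for any particular stack): the server renders the product list into the HTML, so every product name is already in the base source code a crawler downloads.

```python
# Minimal server-side rendering sketch: products are baked into the base HTML.
from flask import Flask, render_template_string

app = Flask(__name__)

PRODUCTS = ["Blue Widget 3000", "Red Widget 2000", "Green Widget 1000"]  # hypothetical data

TEMPLATE = """
<ul class="product-grid">
  {% for product in products %}
    <li>{{ product }}</li>
  {% endfor %}
</ul>
"""

@app.route("/widgets")
def widgets():
    # The list is rendered on the server, so it appears in "view source",
    # not only after client-side JavaScript runs.
    return render_template_string(TEMPLATE, products=PRODUCTS)
```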
Hope that helps