Google Pagination Changes
-
With Google recently coming out and saying they're basically ignoring paginated pages, I'm reconsidering the link structure of our new, soon-to-launch ecommerce site (we're moving from an old site to a new one with an identical URL structure, minus a few 404s).
Currently our new site shows 20 products per page, but with this change by Google, any products on pages 2, 3 and so on will suffer because Google treats each paginated page as an entirely separate page rather than an extension of the first.
The way I see it, I have one option: show every product in each category on page 1.
I have Lazy Load installed on our new website, so it only loads what the user can see on screen and loads more products as they scroll down - but how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page-load score because it doesn't know the Lazy Load is in place? Or will it know and account for it?
Is there anything I'm missing?
-
It's likely that they will be valued a bit less, but the effects shouldn't be drastic. Even if you just had one massive page with all products on it, the ones at the top would likely get more juice anyway.
If it's a crazy big concern, think about a custom method to sort your products (see the sketch below).
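Something along these lines - a toy sketch, where the "sales" and "margin" fields are hypothetical stand-ins for whatever actually matters to your business:

```python
# A toy sketch of the sorting idea - surface priority products first so they
# sit at the top of page 1, where they pick up the most equity.
# The "sales" and "margin" fields are hypothetical; rank by whatever matters.
products = [
    {"name": "Product A", "sales": 120, "margin": 0.4},
    {"name": "Product B", "sales": 300, "margin": 0.2},
    {"name": "Product C", "sales": 90,  "margin": 0.6},
]

# Rank by a blend of sales volume and margin, highest first.
ranked = sorted(products, key=lambda p: p["sales"] * p["margin"], reverse=True)
print([p["name"] for p in ranked])  # -> ['Product B', 'Product C', 'Product A']
```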
-
Thank you very much for taking the time to respond so eloquently.
> If all the products are visible in the base, non-modified source code (right-click the page, then click "view source" - is the data there?), then there is a high likelihood that Google will see and crawl it.

I can confirm that each product does in fact appear in the source data, so as you say, Google will crawl it, which is somewhat of a relief.
Does this then mean that, regardless of which page the products appear on, Google will simply ignore this factor and treat each product the same?
The thing I am trying to avoid is products on pages 2, 3 and so on being valued less.
-
This is a great, technical SEO query!
What you have to understand is that whilst Google 'can' crawl JS, they often don't. They don't do it for just anyone, and even then they don't do it all of the time. Google's main mission is to 'index the web' - and on that count, their index of the web's pages, whilst vast, is still far from complete.
Crawling JavaScript necessitates the use of a headless browser (if you were using Python to script such a thing, you'd be using the Selenium or Windmill modules). A browser must open (even if it does so invisibly) and 'run' the JavaScript, which creates more HTML - which can then be crawled only **after** the script execution.
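To make that concrete, here's a minimal sketch of that rendering step (Python with Selenium, since that's the module mentioned above; the URL is a placeholder):

```python
# A minimal sketch of the headless-browser step (Selenium 4+, Chrome).
# The URL is a placeholder - swap in one of your own category pages.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")  # run Chrome with no visible window

driver = webdriver.Chrome(options=options)
driver.get("https://www.example.com/category?page=1")  # placeholder URL

# page_source now holds the *rendered* HTML - the markup that exists only
# after the JavaScript has run. A plain fetch never gets this far.
rendered_html = driver.page_source
driver.quit()
```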
On average, this rendering step takes around 10x longer than scraping the basic, non-modified source code. Ask yourself: would Google take a 10x efficiency hit on an already incomplete mission - for 'everyone' on the web? The answer is no (I see evidence of this every day across many client accounts).
Let's answer your question. If all the products are visible in the base, non-modified source code (right-click the page, then click "view source" - is the data there?), then there is a high likelihood that Google will see and crawl it.
If the data (code) only exists via right-click, inspect element - and not in "view source" - then the data only exists in the 'modified' source code (not the base source). In that scenario, Google would be extremely unlikely to crawl it (or at least to crawl it consistently). If it's a very important page on a very important site (Coca-Cola, M&S, Barclays, Santander) then Google may go further.
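As an aside, you can run that same "view source" test from a script - a rough sketch using Python's requests library, where the URL and product name are placeholders for your own:

```python
# A rough script version of the "view source" test: fetch the raw, base
# source code and check whether a known product name appears in it.
# The URL and "Acme Widget" are placeholders.
import requests

url = "https://www.example.com/category?page=1"
base_html = requests.get(url, timeout=10).text

if "Acme Widget" in base_html:
    print("In the base source - Google can crawl it cheaply.")
else:
    print("Only present after JS runs - far less likely to be crawled.")
```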
For most of us, the best possible solution is to get the data we want crawled into the non-modified source code. This can be achieved by using JS only for visual changes (but not for the structure) or by adopting SSR (Server-Side Rendering).
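For illustration only, a minimal SSR sketch (Python/Flask, with a hypothetical product list) - the point being that every product lands in the base HTML a plain fetch returns, so it passes the "view source" test even with image lazy-loading layered on top:

```python
# A minimal SSR sketch (Flask, hypothetical product data): the full product
# list is rendered into the base HTML on the server, so crawlers see every
# product without executing any JavaScript.
from flask import Flask, render_template_string

app = Flask(__name__)

TEMPLATE = """
<ul>
{% for p in products %}
  <li><a href="{{ p.url }}">{{ p.name }}</a></li>
{% endfor %}
</ul>
"""

@app.route("/category")
def category():
    # Placeholder catalogue - in practice this comes from your database.
    products = [{"name": f"Product {i}", "url": f"/product/{i}"}
                for i in range(1, 301)]
    return render_template_string(TEMPLATE, products=products)
```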
Hope that helps