Determining When to Break a Page Into Multiple Pages?
-
Suppose you have a page on your site that is a couple of thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing so, such as each page being more tightly focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although keeping it all on one page would concentrate the link juice.
Suppose you also have a lot of comments. Is it better to move comments to a second page at a certain point? Sometimes the comments are not as focused on the topic of the page as the main text is.
-
I want to address this question from a couple of perspectives:
USERS: As Dana said, users prefer single long pages. Long pages with lots of content, lots of subtopics, and lots of images are impressive when a person lands on them. They immediately show the depth and richness of your content, and visitors can quickly scan your subheadings to see what you have to offer. These pages more readily produce likes, tweets, links, etc. than content broken across several pages.
SEO: I have experimented with both long pages and multiple short pages. I get more traffic from long pages because of the diversity of words they contain, which brings in a lot more long-tail traffic. And if visitors are liking, tweeting, and linking, you may earn even more search traffic on top of that.
MONETIZATION: This is the downside if you are showing ads. You get fewer impressions per visitor, and if there is a limit on the number of ads you can display per page, your ad density will be lower and your income lower with it. However, if your traffic is higher thanks to the increased long-tail reach and better rankings, the additional visitors may make up for the lost impressions per visitor.
-
A few years ago there was a benefit to breaking a document up into smaller chunks - say, one page for every h2 (second-level heading). The idea was that rather than having one big document, you could have lots of small ones and rank for each of your h2 topics, and it seemed to work pretty well. Today, I'm finding that the content that does best from an SEO perspective is my longest content, and that the big piece does far better than the sum of its parts. So I would no longer recommend chunking up your articles unless they're simply too long to read. Some of my best articles run 2,000-3,000 words, and I also see a nice correlation between the number of comments and my best posts, so I leave all the comments on the same page, making it super long.
For some examples of very long content that is doing great from an SEO perspective, check out the group interviews on my site (wordstream.com). Those articles average more than 10 minutes on the page and generate tons of traffic for my site. Google these, for example: "PPC bid management guide," "importance of A/B testing," etc.
-
Google did some user testing on this topic to find out whether users preferred longer pages or paginated pages. According to their research, users preferred longer pages, because there is always latency when moving from one page to the next. Here's the video where a Googler cites that research: http://youtu.be/njn8uXTWiGg
If you want to have it both ways, you could always break your content into pages but put a "View All" option at the top. Personally, I am one of those folks who doesn't mind scrolling down through comments, and if given the choice to continue on to a second page of comments, I probably wouldn't.
From an SEO standpoint, provided the pagination is handled properly (see the markup sketch below), I don't think there's an advantage one way or the other, unless you take into account that your bounce rate could potentially go up with paginated pages. Even if it did, though, I doubt that would significantly hurt you from an overall SEO viewpoint.
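To illustrate, here is a minimal sketch of what that "paginated series plus View All" markup could look like, reflecting the guidance Google was giving at the time (component pages canonicalized to the view-all version, with rel="prev"/"next" declaring the series). The URLs are hypothetical placeholders, and this is one common setup rather than the only valid one.

```html
<!-- Minimal sketch of a paginated article with a "View All" option.
     All URLs here are hypothetical placeholders. -->

<!-- In the <head> of a component page, e.g. /article?page=2 -->
<link rel="canonical" href="http://www.example.com/article/view-all"> <!-- prefer the single-page version -->
<link rel="prev" href="http://www.example.com/article?page=1">        <!-- previous page in the series -->
<link rel="next" href="http://www.example.com/article?page=3">        <!-- next page in the series -->

<!-- In the visible body of every component page, near the top -->
<a href="http://www.example.com/article/view-all">View All</a>
```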
Related Questions
-
Indexed pages
Just started a site audit and trying to determine the number of pages on a client site, and whether more pages are being indexed than actually exist. I've used four tools and got four very different answers:
- Google Search Console: 237 indexed pages
- Google search using the site: command: 468 results
- Moz site crawl: 1013 unique URLs
- Screaming Frog: 183 page titles, 187 URIs (note: this is a free licence, but that only cuts off at 500)
Can anyone shed any light on why they differ so much? And where does the truth lie?
Technical SEO | muzzmoz1
Why are Google-indexed pages decreasing?
Hi, my website had around 400 pages indexed, but since February I've noticed a huge decrease in the indexed number, and it is continuing to drop. Can anyone help me find out the reason, and where I can get a solution for it? Will it affect my page rankings?
Technical SEO | SierraPCB0
SEO value of InDesign pages?
Hi there, my company is exploring creating an online magazine built with Adobe's InDesign toolset. If we proceeded with this, could we make these pages as "spiderable" as normal HTML/CSS webpages? Or are we limited to them being less spiderable, or not spiderable at all?
Technical SEO | TheaterMania1
Should all pagination pages be included in sitemaps
How important is it for a sitemap to include all of the individual URLs for paginated content? Assuming the rel=next and rel=prev tags are set up, would it be OK to include just page 1 in the sitemap?
Technical SEO | Saijo.George0
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high-quality content that we plan to index and use to drive our traffic, but we also have many pages that are mostly just images, what is the best way to go about getting those images indexed? We want to noindex all the image-only pages because they are thin content. Can you noindex, follow a page but still index the images on that page? Please explain how to go about this.
Technical SEO | WebServiceConsulting.com0
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in particular:
- Google continuously crawls websites and stores each page it finds (let's call this the "page directory").
- Google's "page directory" is a cache, so it isn't the "live" version of the page.
- Google has separate storage called "the index," which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords.
- When someone searches a keyword, that keyword is looked up in the "index" and all relevant pages in the "page directory" are returned.
- These returned pages are given ranks based on the algorithm.
The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as on the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by better understanding how the search process works.
Technical SEO | reidsteven750
Product Pages Outranking Category Pages
Hi, we are noticing an issue where some product pages are outranking our relevant category pages for certain keywords. For a made-up example, a "heavy duty widgets" product page might rank for the keyword phrase Heavy Duty Widgets instead of our Heavy Duty Widgets category page appearing in the SERPs. We've noticed this happening primarily in cases where the name of the product page contains at least a partial match for the keyword phrase we want the category page to rank for. However, we've also found isolated cases where the keyword points to a completely irrelevant page instead of the relevant category page. Has anyone encountered a similar issue before, or have any ideas as to what may cause this to happen? Let me know if more clarification of the question is needed. Thanks!
Technical SEO | ShawnHerrick0
Home Page /index.htm and .com Duplicate Page Content/Title
I have been whittling away at the duplicate content on my clients' sites, thanks to SEOmoz's Pro report, and have been getting pushback from the account manager at register.com (the site was built there, and the owner doesn't want to move it). He says these are the exact same page and he can't access one to redirect it to the other. Any suggestions? The SEOmoz report says there is duplicate content on both of these URLs:
Durango Mountain Biking | Durango Mountain Resort - Cascade Village http://www.cascadevillagehotel.com/index.htm
Durango Mountain Biking | Durango Mountain Resort - Cascade Village http://www.cascadevillagehotel.com/
Your help is greatly appreciated! Sheryl
Technical SEO | TOMMarketingLtd.0