Optimization for pages with lists of data
-
I am looking for some ideas on what best practices are for pages that contain lists similar to this page:
http://www.backcountrysecrets.com/outdoor-sport/15/places-to-swim-and-swimming-holes.aspx
Is it better to break the list up into separate pages of 25 listings each, or keep everything on the same page?
-
The general rule is to have no more than 100 links per page, so from an SEO standpoint you could probably increase the number of listings on this page and still be OK.
However, you also want to consider user experience. Will the page look tidier or be more functional for the visitor at 25 listings per page, or at 50, or at 100?
We recently had a similar issue on a site I was working on at http://babynamesdiary.com/baby-names-and-meanings/, which uses a database of baby name ideas. We opted for 25 names per page, but also included a list of name origins at the bottom (which works like category pages) to help both the visitors to the site and our SEO.
What I find is that the best method is a balance of visitor experience and good SEO (but always put visitor experience first; the rest should naturally follow).
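As a rough sketch of the pagination approach discussed above, the helper below splits a listing into pages of 25 and emits the `rel="prev"`/`rel="next"` link tags that help search engines understand a paginated series. The URL scheme (`/page/N`), page size, and sample data are all assumptions for illustration, not something from the original site:

```python
# Hypothetical sketch: paginate a list of places and emit rel="prev"/"next"
# link tags. The /page/N URL scheme and the 25-per-page size are assumptions.
from math import ceil

PAGE_SIZE = 25  # the per-page count discussed above

def paginate(items, page, base_url, page_size=PAGE_SIZE):
    """Return the items for `page` (1-based) plus prev/next link tags."""
    total_pages = max(1, ceil(len(items) / page_size))
    page = max(1, min(page, total_pages))
    start = (page - 1) * page_size
    chunk = items[start:start + page_size]

    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{base_url}/page/{page - 1}">')
    if page < total_pages:
        links.append(f'<link rel="next" href="{base_url}/page/{page + 1}">')
    return chunk, links

places = [f"Swimming hole {i}" for i in range(1, 61)]  # 60 listings -> 3 pages
chunk, links = paginate(places, 2, "http://example.com/places-to-swim")
print(len(chunk), links)
```

The same helper works for any page size, so it is easy to A/B-test 25 versus 50 listings per page without restructuring the URLs.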
Related Questions
-
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain because we didn't want to change too much too quickly by moving to https; only our shopping cart uses the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks!
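The canonical-tag setup described in this question can be sketched as a small helper that maps any https URL to its http counterpart and emits the tag. This only illustrates the mechanics the questioner describes; whether http or https should be the canonical version is a separate decision (today https is generally preferred), and the example path is invented:

```python
# Hypothetical sketch of the canonical-tag approach described above:
# map every https URL to its http counterpart and emit the tag.
# (Illustration of the questioner's setup only, not a recommendation.)
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    """Emit a <link rel="canonical"> pointing at the http version of `url`."""
    parts = urlsplit(url)
    canonical = urlunsplit(("http",) + tuple(parts[1:]))
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_tag("https://www.aquatell.com/pages/about"))
```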
On-Page Optimization | Aquatell1
-
Google Data Highlighter
Hello Mozers! Anyone out there have any experience using the Google data highlighter tool in WMT? I'm just curious if it is something I should be utilizing or should just try to get the same results using microdata markup. I'm not a fan of "tools" per se...I'd rather get my hands dirty. Just looking for any thoughts or experiences in using it.
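For the hands-dirty route mentioned in this question, one alternative to both the Data Highlighter and inline microdata is generating a JSON-LD structured-data block yourself. A minimal sketch, with all field values invented for illustration:

```python
# Hypothetical sketch: hand-rolling structured data as JSON-LD instead of
# using the Data Highlighter tool. All field values are made up.
import json

def event_jsonld(name, start_date, location):
    """Build a schema.org Event as an embeddable JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Event",
        "name": name,
        "startDate": start_date,
        "location": {"@type": "Place", "name": location},
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

print(event_jsonld("Sample Meetup", "2024-06-01", "Seattle"))
```

Unlike Data Highlighter annotations, markup like this lives in your own templates, so it works for any search engine and survives site redesigns.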
On-Page Optimization | adamxj21
-
How do I create multiple page URLs that are optimized for location and keywords that may be overlapping or the same?
Hi guys, I am attempting to create unique URLs for several different pages on a website. Let's say hypothetically that this is a website for a chain of ice cream shops in Missouri, with 15 locations in Springfield. I would ideally like to optimize each Springfield page for the main keyword (ice cream) as well as the geo-specific location (Springfield), but we obviously can't have duplicate URLs for these 15 locations. We also have several secondary keywords (think frozen yogurt or waffle cone) that we could use, although it would most likely be more powerful to use the primary keyword. Any suggestions for how to go about this most effectively? Thanks!
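One common way to keep keyword + city URLs unique across many same-city locations is to append a distinguishing detail such as the street or neighborhood. A minimal sketch, with all shop names and the URL pattern invented for illustration:

```python
# Hypothetical sketch: build unique keyword + city URLs for several shops
# in the same city by appending a distinguishing detail (street or
# neighborhood). All names are invented for illustration.
import re

def slugify(text):
    """Lowercase, replace runs of punctuation/whitespace with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def location_url(keyword, city, state, distinguisher):
    return "/" + "-".join(
        slugify(part) for part in (keyword, city, state, distinguisher)
    )

shops = ["Battlefield Mall", "Downtown", "Glenstone Ave"]
for shop in shops:
    print(location_url("ice cream shop", "Springfield", "MO", shop))
# e.g. /ice-cream-shop-springfield-mo-battlefield-mall
```

Each location page then targets the primary keyword and the city while remaining a distinct, human-readable URL.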
On-Page Optimization | GreenStone0
-
Is it better to have an hreflang go to the home page in a different language if there's no corresponding page
If I have some pages in English, but not in Spanish on my website: Should my hreflang go to the home page on the Spanish site? Or should I not have an "es-MX" hreflang for that page? Ideally I would have all the pages translated, but this has not all been done yet.
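A common way to handle the situation in this question is to emit an hreflang tag only when a real translation exists, rather than pointing es-MX at the home page. A minimal sketch under that assumption, with all URLs invented:

```python
# Hypothetical sketch: emit hreflang tags only for translations that exist,
# rather than pointing es-MX at the home page. All URLs are invented.
translations = {
    "/about": {"en-US": "https://example.com/about",
               "es-MX": "https://example.com/es/acerca"},
    "/blog/post-1": {"en-US": "https://example.com/blog/post-1"},  # not translated yet
}

def hreflang_tags(page):
    """Return alternate link tags for every language version that exists."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}">'
        for lang, url in sorted(translations.get(page, {}).items())
    ]

for tag in hreflang_tags("/blog/post-1"):
    print(tag)  # only the en-US tag; no es-MX entry is emitted
```

As pages get translated, adding them to the mapping automatically starts emitting the matching es-MX tags.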
On-Page Optimization | RoxBrock0
-
Why is my contact us page ranking higher than my home page?
Hello, No matter what keyword I put into Google (when I'm not signed in and have cleared my browsing history), the contact us page ranks higher than the home page. I'm not sure why this is: the home page has a higher page authority, more links, and more social media shares, and the website is an established one. When I check Google Analytics, my home page gets more people landing on it than the contact us page. It looks like people are ignoring the contact us page in the results and scrolling down until they find the home page. I'd appreciate any help or advice you might have. Thank you.
On-Page Optimization | mblsolutions2
-
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages, however according to our Google Webmaster Tools account Google has around 26,000 pages for us in its index. I have run through half a dozen sitemap generators and they all only discover the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages, without success. It has been over six months since we made any structural changes (at which point we set up 301s to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week. I'm fairly certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, however this only returns the first 1,000 results, which all check out fine. I was wondering if anybody knew of any methods or tools that we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and aren't going to cause us a problem? Thanks guys!
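Whatever partial list of indexed URLs you can assemble (site: samples, server-log hits from Googlebot, a crawl export), the comparison itself is a simple set difference against the sitemap. A minimal sketch with invented sample data; gathering the indexed list is the hard part, since, as noted above, site: only returns about 1,000 results:

```python
# Hypothetical sketch: diff the URLs in your sitemap against whatever list
# of indexed URLs you can assemble. File contents here are invented.
known = {
    "http://example.com/",
    "http://example.com/products",
}
indexed = {
    "http://example.com/",
    "http://example.com/products",
    "http://example.com/products?sort=price",  # likely duplicate variant
}

extras = sorted(indexed - known)  # indexed but not in the sitemap
print(extras)
```

Patterns in the extras (query strings, trailing slashes, old directories) usually reveal where the duplicate or stale pages are being generated.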
On-Page Optimization | ChrisHolgate0
-
Removing OLD pages
Dear all, I was removing tons of old pages from one directory (about 400 pages) and set up a custom 404 page, and all seemed fine: when I go to one of the removed pages I get a 404 and am shown my custom 404 page. The problem is that Google Webmaster Tools lists all these pages as 404s and never cleans up the list (a year on now), so I assume something is wrong. Question: what is the best or most natural way to remove old pages from a directory? Note: I previously tried adding the NOINDEX/NOFOLLOW meta tag to these pages and got Soft 404s from Google. Thank you
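One commonly suggested alternative for pages removed on purpose is to return 410 Gone rather than 404, since 410 signals permanent removal more strongly. A minimal sketch of that routing decision, with invented paths, as one possible approach rather than a definitive fix for the situation above:

```python
# Hypothetical sketch: serve 410 Gone for deliberately removed pages so
# search engines treat them as permanently gone, and plain 404 otherwise.
# The removed-path list is invented for illustration.
removed = {"/old/listing-1", "/old/listing-2"}

def status_for(path, page_exists):
    """Pick the HTTP status for a request: 200, 410 (Gone), or 404."""
    if path in removed:
        return 410  # Gone: a stronger "stop crawling this" signal than 404
    return 200 if page_exists else 404

print(status_for("/old/listing-1", False))  # 410
```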
On-Page Optimization | SharewarePros0
-
On-page keyword usage
SEOmoz gave me all zeros for keyword usage. Why? The site is www.grass2greens.com and the keywords are "Asheville Landscaping Edible." The site includes these words in the page title and throughout the body text. I'm not really sure, but one cause for these low keyword usage ratings might be redirects or some meta tag issues. Any ideas?
On-Page Optimization | dcaudio0