Pagination and links per page issue.
-
Hi all,
I have a listings based website that just doesn't seem to want to pass rank to the inner pages.
See here for an example:
http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK
I know that there are far too many links on this page and I am working on reducing the number by altering my grid classes to output fewer links.
The page also displays a number of links to other page numbers for these results. My script appends the string " - Page 2" to the end of the title, description and URL when the user clicks through to page two of these results.
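To make the scheme concrete, here is a minimal sketch of that suffixing logic. The `/Page-2` URL pattern and the function name are illustrative assumptions, not the site's actual implementation:

```python
def paginate_meta(base_title, base_desc, base_url, page):
    # Append a " - Page N" suffix to title, description and URL for
    # result pages beyond the first (URL pattern is a guess).
    if page <= 1:
        return base_title, base_desc, base_url
    suffix = f" - Page {page}"
    return base_title + suffix, base_desc + suffix, f"{base_url}/Page-{page}"

title, desc, url = paginate_meta(
    "Hotels For Sale in the UK",
    "Browse hotels for sale across the UK.",
    "http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK",
    2,
)
```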
My question is:
Would an excessive number (200+) of links on a page result in less PR being passed through each link (and look spammy)?
And would using rel=canonical on page numbers greater than 1 result in better trust/rankings?
Thanks in advance.
-
I believe 100 links per page is the commonly cited limit, and yes, the more links on a page, the less PR is passed through each one.
But 100 links on the home page means you can have 100 child pages with 100 links on each, which puts 10,000 links only two clicks from the home page.
As for rel=canonical: is page 2 unique? Then yes, treat it just as you would any other page.
I assume you are aware of flat link structure; if not, I think this page, though old, is a must-read: http://www.webworkshop.net/pagerank.html
It's a long read, but very informative.
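That fan-out arithmetic can be sketched in a few lines. This is just the back-of-the-envelope model from the paragraph above (every page carrying the same number of links), not a real crawl-depth calculation:

```python
def pages_within(clicks, links_per_page):
    # Pages reachable within a given number of clicks if every page
    # carries links_per_page links: n + n^2 + ... + n^clicks.
    return sum(links_per_page ** d for d in range(1, clicks + 1))

# 100 links per page: 100 child pages plus 100*100 = 10,000 grandchild
# pages, all within two clicks of the home page.
print(pages_within(2, 100))  # 10100
```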
-
Rel=canonical doesn't tell engines not to crawl the page (a meta robots noindex tag is what blocks indexing); rather, it tells the engine to index the first page of your paged results in place of the duplicates. This helps reduce duplicate-content penalties and consolidates your PA onto a single URL. I could be wrong, but I imagine you would also prefer organic users to land on the first page of results rather than, say, page 6.
The result is that engines will still crawl your paginated pages and find your listings (and index those), but only index the first page of the series.
I hope that helps to clarify things!
Andrew
-
I'm sorry, what I meant was that you should make the pagination pages canonical to themselves, e.g. page 2's canonical tag points at page 2...
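In other words, each paginated page gets a self-referencing canonical tag. A minimal sketch of generating one (the `/Page-N` URL pattern is an assumption about the site's structure):

```python
def canonical_tag(base_url, page):
    # Build a self-referencing rel=canonical tag for a paginated
    # listing page; page 1 canonicalises to the base URL itself.
    url = base_url if page <= 1 else f"{base_url}/Page-{page}"
    return f'<link rel="canonical" href="{url}" />'

print(canonical_tag("http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK", 2))
# <link rel="canonical" href="http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK/Page-2" />
```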
-
But would rel=canonical make listings on page 2 and above unindexable?
Listings are added chronologically on my site and I still want crawlers to be able to reach adverts created years ago. These listings could be on page 100; surely rel=canonical tells engines not to crawl the page, as it is not the canonical version?
-
200 links on a page isn't that bad. Once you get to 250+ I would rethink the architecture.
Yes, you should use rel=canonical on your pagination pages.
A good way to pass ranking between deep pages like this is to have a section at the bottom that offers similar listings in the area. This way you are giving the bots multiple ways to find each listing, rather than just one page/category. Do it like this: http://www.estatesgazette.com/propertylink/advert/kensingtonrooms_hotel-_131_137_cromwell_road_london_sw7_4du-3264453.htm. They have a "More Properties from this Advertiser" section.