Pager + SEO - Is it possible?
-
Hi, I am having this issue.
I know that pagers are not SEO-friendly, but I want to know the best thing to do in this situation.
For example, I work at a news company, and I have a lot of news articles that are very long, so I use a pager.
Well here I have the problem.
Suppose the URL is www.mysite.com/news/id/here-comes-the-title
When you enter that URL you are viewing the first page, which has these meta tags:
title
keywords
description
Now, the problem comes when the user goes to page 2 of the news article.
What should I do?
1- Change the URL to one of:
www.mysite.com/news/id/here-comes-the-title-PAGE2
www.mysite.com/news/PAGE2-id/here-comes-the-title
www.mysite.com/news/id/PAGE2/here-comes-the-title
2- On pages 2, 3, 4, 5 ... add a meta robots noindex?
With option 2 I think I am losing the opportunity to index the body of my article.
Is this correct?
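For reference, the tag in option 2 would look something like this on pages 2 and up (a sketch; whether to keep "follow" is a choice, not something the thread prescribes):

```html
<!-- Pages 2..n of the article, if option 2 is chosen -->
<meta name="robots" content="noindex, follow">
```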
Thanks
-
Thanks
-
Looks good!
-
Yes, I have it, please check it again and correct me if I am wrong. Thanks
http://www.espectador.com/suenatremendo-pagina-6
I also have the other thing:
76 to 90 of 405 available items | total pages: 27 | current page 6 |
-
On your pager anchors, I don't see the rel="next" and rel="prev" attributes.
On the link you sent above, the pager control at the bottom of the page has the markup below, but you need to add a rel="prev" attribute to it:
[anterior]
Similarly, the next-page link needs a rel="next" attribute added.
The title looks OK. Personally, I'd make one small adjustment and not attach the hyphen to the last word, so I'd change this: Suena Tremendo-Pagina-6 to this: Suena Tremendo - Pagina-6
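To illustrate, page 6 of the article above might carry markup like this once the attributes are added (the "siguiente" anchor text and the exact page-5/page-7 URLs are assumptions based on the URL pattern in the link):

```html
<!-- In the <head> of page 6 -->
<link rel="prev" href="http://www.espectador.com/suenatremendo-pagina-5">
<link rel="next" href="http://www.espectador.com/suenatremendo-pagina-7">

<!-- Pager anchors at the bottom of the page -->
<a rel="prev" href="http://www.espectador.com/suenatremendo-pagina-5">anterior</a>
<a rel="next" href="http://www.espectador.com/suenatremendo-pagina-7">siguiente</a>
```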
-
Great!
Now I have that working. I will remove the noindex from that page.
Can you check it here http://www.espectador.com/suenatremendo-pagina-6
Thanks
-
From the GWMT blog: there's no need to mark pages 2 to n of the series with noindex unless you're sure that you don't want those pages to appear in search results.
Regarding the page titles: rel="next" and rel="prev" do not solve the duplicate-title issue, so you would need to alter the titles on the subsequent pages. I would just make each page title unique by appending something like this to the current title (assuming you were on page 2): 'Page 2' or 'Page 2 of 50'.
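For example, with the 27-page article discussed earlier in this thread, the titles could become (page count taken from the pager text above; exact wording is up to you):

```html
<!-- Page 1 -->
<title>Suena Tremendo</title>
<!-- Page 6, made unique by appending the page number -->
<title>Suena Tremendo - Page 6 of 27</title>
```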
-
And what about the title and the meta robots?
-
I would recommend using the rel="next" and rel="prev" attributes in your pager control: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
Regarding the URL, I'd just put '/{page#}' after the main article URL.
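For instance, using the example URL from the question and a '/{page#}' suffix, page 2's head could contain something like this (an illustrative sketch, not a required structure):

```html
<!-- <head> of www.mysite.com/news/id/here-comes-the-title/2 -->
<link rel="prev" href="http://www.mysite.com/news/id/here-comes-the-title">
<link rel="next" href="http://www.mysite.com/news/id/here-comes-the-title/3">
```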
Related Questions
-
SEO + Structured Data for Metered Paywall
I have a site that will have 90% of its content behind a metered paywall, so all content is accessible in a metered way. Users who aren't logged in can access 3 articles (of any kind) in a 30-day period; if they try to access more in that period, they hit a paywall. I was reading this article on how to handle structured data with Google for content behind a paywall: https://www.searchenginejournal.com/paywalls-seo-strategy/311359/ However, the content is not ALWAYS behind a paywall, since it is metered: if a new user comes to the site, they can see the article (regardless of what it is). Is there a different way to handle content that will be SOMETIMES behind a paywall because of a metered strategy? Theoretically I want 100% of the content indexed and accessible in SERPs; it will just be accessible depending on the user's history (cookies) with the site. I hope that makes sense.
Technical SEO | triveraseo
-
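One hedged approach for the metered case is Google's paywalled-content structured data, which signals that the full text served to the crawler may be hidden from some users; the headline and CSS class below are placeholders, not values from this site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article headline",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywalled-content"
  }
}
</script>
```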
How does changing sitemaps affect SEO
Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k. A situation has come up where I need to change that size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether or not changing the size of my sitemaps (but not the pages included in them) will affect my rankings negatively or my SEO efforts in general. If anyone has any insights or has experienced this with their site, please let me know!
Technical SEO | Jason-Reid
-
SEO best practice : HTTP to HTTPS
What's the best practice to switch from an all HTTP site to an all HTTPS site ?
Technical SEO | Crocodesign
No changes to the site structure, just a full site switch to SSL.
Right now, the site is reachable with HTTP and with HTTPS:
http://crocodesign.be --> https://crocodesign.be
http://www.crocodesign.be --> https://crocodesign.be
https://www.crocodesign.be --> https://crocodesign.be
CMS: WordPress 3.9
Server type: Apache
Preferred method: .htaccess
-
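A common .htaccess sketch for this kind of switch, assuming Apache with mod_rewrite enabled and the non-www HTTPS host as the canonical one (test on a staging copy before deploying):

```apache
RewriteEngine On
# Send all HTTP traffic, and any www hostname, to https://crocodesign.be with a 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://crocodesign.be/$1 [R=301,L]
```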
Domain name SEO
I would like to hear your opinion about which of robotics.kawasaki.com and www.kawasakirobotics.com is more effective for SEO for the keywords "robotics" and "kawasaki". We have been using the kawasaki.com domain name for more than 15 years.
Technical SEO | Iwashima
-
SEO credit for subdomain blogs?
One of my clients is currently running a webstore through Volusion. They would like to add a blog to their website, but since Volusion doesn't currently support blogs on the same domain, we would have to create a WordPress blog and link it to a subdomain. (http://support.volusion.com/article/linking-your-blog-your-volusion-store) Using this method, will their primary website receive any SEO credit for the content being created on the blog, or will it only count toward the subdomain? Thanks!
Technical SEO | CMSSolutions98
-
Client with Very Very Bad Onsite SEO
So one of my clients has a really, really bad website from a technical perspective. I am talking over 75k violations and warnings. Granted, the tagging is done well, but any other SEO violation you can think of is occurring. In any case, they are building a new website, and I am on a retainer for a couple of hours a week to do some link building. I feel like I am not getting anywhere. What is your advice? Should I keep on keeping on, or advise the client to put SEO on hold until the technical issues are resolved? I feel like all of this link building isn't having the value it could have with a site like this.
Technical SEO | runnerkik
-
Entry based content and SEO
My e-commerce team is implementing functionality that allows us to display different content based on what channel, and even what keyword, the customers used to reach our page. This is of course a move that we believe will strengthen our conversion rates, but how will it affect our organic search listings? Do you have any examples of how this could affect us, and are there any technology pitfalls that we absolutely need to know about?
Technical SEO | GEMoney_No