Am I using pagination markup correctly?
-
Hey Mozzers!
I am receiving duplicate title tag errors from Search Console on paginated pages (blog.com/chlorine, blog.com/chlorine-2, blog.com/chlorine-3). I do not currently have a view-all page. If I were to create one, would I add all the content from chlorine-2 and chlorine-3 to the blog.com/chlorine page, then use rel=canonical on chlorine-2 and chlorine-3 pointing to blog.com/chlorine?
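If I'm understanding canonicalization right, the tag on each paginated page would look something like this (these are just my example URLs, so please correct me if I have it backwards):

```html
<!-- In the <head> of blog.com/chlorine-2 and blog.com/chlorine-3,
     pointing search engines at the consolidated view-all page: -->
<link rel="canonical" href="https://blog.com/chlorine">
```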
If I move forward without the view-all page, I could implement the rel=next/prev HTML markup, but can I do this without dev help? I am currently using the Yoast SEO plugin and do not see the option. Would I use the text editor to add the markup directly before the content?
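Here's my rough sketch of what I think the markup would be for my three pages, placed in each page's <head> (again, my example URLs; correct me if this is wrong):

```html
<!-- blog.com/chlorine (first page): next only, no prev -->
<link rel="next" href="https://blog.com/chlorine-2">

<!-- blog.com/chlorine-2 (middle page): both prev and next -->
<link rel="prev" href="https://blog.com/chlorine">
<link rel="next" href="https://blog.com/chlorine-3">

<!-- blog.com/chlorine-3 (last page): prev only, no next -->
<link rel="prev" href="https://blog.com/chlorine-2">
```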
I think I have a grasp on this, but this will be my first time implementing and I want to double check first! Thanks!
-
Here is the view-all page I created. Each of the anchor text links takes you to a paginated page (6 in total). All 6 of these pages have a canonical tag back to the view-all page.
Did I set that up correctly? Thanks for your insight!
-
Sure, please share an example URL so I can advise you on this.
-
Sorry Nitin, I just saw this message! If you'd still like to review, I can send the URL along. Thanks!
-
Could you please share the URL here? Would like to have a look at the way you've implemented this.
-
Update
I did not initially have a view-all page; the first page simply had a sentence and "read more" anchor text linking to each chlorine article (/chlorine-2, /chlorine-3). I went ahead and used page one (blog.com/chlorine) to create a full article, taking a paragraph or more from each page (blog.com/chlorine-2, blog.com/chlorine-3), each with a link to the full article.
I then added rel=canonical to blog.com/chlorine-2 and blog.com/chlorine-3 pointing back to blog.com/chlorine. I think I've gotten it all squared away. Please correct me if I'm wrong.
Thanks Moz!
-
Hi,
The best way to handle this kind of scenario is to implement pagination wisely. Even if you don't have a view-all page for your content, maybe you're doing lazy loading or something similar to show the entire content to your users, but the bots' HTML snapshot misses all of that content. Hence, it's better to support pagination and expose the pages to bots using proper pagination markup.
All you need to implement is a doubly-linked-list kind of structure: the first page's next pointer points to the second page, and it has no prev pointer; the second page's prev pointer points to the first page, and its next pointer points to the third page; similarly, the last page has no next pointer, but its prev pointer points to the second-to-last page. You can refer to this guide for implementation, and this site can be referred to for understanding exactly how to implement it.
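As a quick sketch, a generic middle page in a series carries both pointers in its <head>; the URL pattern below is just an example:

```html
<!-- Middle page of a series, e.g. example.com/category?page=3 -->
<link rel="prev" href="https://example.com/category?page=2">
<link rel="next" href="https://example.com/category?page=4">
```

The first page simply omits the rel="prev" tag, and the last page omits rel="next", which gives you exactly the doubly-linked structure described above.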
Yes, it's fairly trivial to implement, though you might need a dev's help if Yoast doesn't support it. Hope this helps!
Feel free to write here if you need any further assistance on this. Cheers!