Is it necessary to have unique H1s for pages in a pagination series (e.g. a blog)?
-
A content issue we're experiencing is duplicate H1s across pages in a pagination series (e.g. a blog). Does each separate page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this?
Any insight would be appreciated. Thanks!
-
Read what EGOL wrote. It depends upon the nature of your blog pagination.
There are a few reasons you could have pagination within the blog area of your site:
1. Your articles have 'next' buttons and different parts of the article are split across multiple URLs, so the content across the paginated pages is distinct.
2. Your post feeds are paginated, purely so people can browse to pages of 'older posts' and see what you wrote way back in your archives.
3. Your blog posts exist on a single URL, but when users comment on your posts, individual posts gain paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).
In the case of #2 or #3 it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In the case of #1 you should make the effort!
-
This is very true for multi-section articles (which span multiple addresses), and less true of articles which live at a single address but break into multiple addresses through UGC comment-based pagination.
-
I wouldn't worry about it as search bots "should" understand that these pages are part of a paginated series.
However, I would recommend you ensure that "rel=next/prev" is properly implemented (despite Google announcing that they don't support it). Once the pagination is properly implemented & understood, bots will see the pages as a continuation of a series, and therefore will not see duplicate H1s as a problem.
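For reference, a minimal sketch of how rel="next"/"prev" link tags are typically placed in the <head> of a paginated page (the example.com domain and the /blog/page/ URL pattern are illustrative, not taken from this thread):
<!-- On https://example.com/blog/page/2/ -->
<link rel="prev" href="https://example.com/blog/" />
<link rel="next" href="https://example.com/blog/page/3/" />
The first and last pages in the series would carry only the rel="next" or only the rel="prev" tag respectively.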
-
In some instances, not using unique <h1> and unique <title> tags is a huge opportunity loss.
Let's say you have a fantastic article about Widgets and you break it up over several pages. The sections of your article are:
- wooden widgets
- metal widgets
- plastic widgets
- stone widgets
... if you make custom <h1> and <title> tags for these pages (and post them on unique URLs) you are going to get your article into a lot more SERPs and haul in a lot more traffic.
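To make that concrete, here is a hypothetical sketch of custom tags for two of those section pages (the URLs and title wording are invented for illustration):
<!-- https://example.com/widgets/wooden-widgets/ -->
<title>Wooden Widgets: Types, Uses and Buying Tips</title>
<h1>Wooden Widgets</h1>
<!-- https://example.com/widgets/metal-widgets/ -->
<title>Metal Widgets: Types, Uses and Buying Tips</title>
<h1>Metal Widgets</h1>
Each page can then rank for its own section-specific queries rather than competing under one generic heading.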
-
Best practice is a unique H1 - only one H1 to describe a page.
-
Don't worry about it. You're not trying to rank your /blog/2 or /blog/17 for any specific terms. Those pages are pretty much for site visitors, not the search engines.
As an example, Moz has the same h1 tag on all their blog pages.
All of the following URLs have "The Moz Blog" as the h1 tag:
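As an illustrative sketch of that pattern (hypothetical URLs, not the actual Moz list), every page in an archive series shares one heading:
<!-- https://example.com/blog/ , https://example.com/blog/2/ , https://example.com/blog/17/ all render: -->
<h1>The Blog</h1>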
Related Questions
-
GSC Performance completely dropped off, but Google Analytics is steady. Why can't GSC track my site anymore?
Hey everyone! I'm having a weird issue that I've never experienced before. For one of my clients, GSC has a complete drop-off in the Performance section. All of the data shows that everything fell flat, or almost completely flat. But in Google Analytics, we have steady results. No huge drop-off in traffic, etc. Do any of you know why GSC would all of a sudden be unable to crawl our site? Or track this data? Let me know what you think! Thanks!
Algorithm Updates | TaylorAtVelox
-
Do you think profanity in the content can harm a site's rankings?
In my early 20's I authored an ebook that provides men with natural ways to improve their ahem... "bedroom performance". I'm now in my mid 30s, and while it's not such an enthralling topic, the thing makes me 80 or so bucks a day on good days, and it actually works. I update the blog from time to time and build links to it on occasion from good sources. I've carried my SEO knowledge to a more "reputable" business, but this project is still interesting to me, because it's fully mine. I am more interested in getting it to rank and convert than anything, but following the same techniques that are working to grow the other business, this one continues to tank. Disavow bad links, prune thin content.. no difference. However, one thing I just noticed now are my search queries in the reports. When I first started blogging on this, I was real loose with my tongue, and spoke quite frankly (and dirty to various degrees). I'm much more refined and professional in how I write now. However, the queries I'm ranking for... a lot of d words, c words (in the sex sense)... sounds almost pornographic. Think Google may be seeing this, and putting me lower in rankings or in some sort of lower level category because of it? Heard anything about google penalizing for profanity? I guess in this time of authority and trust, that can hurt both of those... but I wonder if anyone's heard any actual confirmation of this or has any experience with this? Thanks!
Algorithm Updates | DavidCapital
-
Are Meta-descriptions important for blogs?
I am tasked with optimizing an existing site's SEO. I have added meta descriptions to all the menu pages; however, the site has a blog section with over 700 posts. How important are meta descriptions when it comes to a website's blog? Do I need to take the time to go through 700+ blog posts and create unique meta descriptions for each one?
Algorithm Updates | rburnett
-
Product pages - should the meta description match our product description?
Hi, I am currently adding new products to my website and was wondering, should I use our product description (which is keyword optimised) in the meta description for SEO purposes? Or would this be picked up by Google as duplicate content? Thanks in advance.
Algorithm Updates | markjoyce
-
More pages or fewer pages for best SEO practices?
Hi all, I would like to know the community's opinion on this. Will a website with more pages or fewer pages rank better? Websites with more pages have the advantage of more landing pages for targeted keywords. Fewer pages have the advantage of concentrating PageRank across a limited set of pages, which might result in better rankings. I know this is highly dependent; I mean to get answers for an ideal website. Thanks,
Algorithm Updates | vtmoz
-
How long for Google to de-index old pages on my site?
I launched my redesigned website 4 days ago. I submitted a new sitemap, as well as submitted it to index in Search Console (Google Webmasters). I see that when I google my site, my new Open Graph settings are coming up correctly. Still, a lot of my old site pages are definitely still indexed within Google. How long will it take for Google to drop off or "de-index" my old pages? Due to the way I restructured my website, a lot of the items are no longer available on my site. This is on purpose. I'm a graphic designer, and with the new change, I removed many old portfolio items, as well as any references to web design, since I will no longer be offering that service. My site is the following: http://studio35design.com
Algorithm Updates | rubennunez
-
US domain pages showing up in Google UK SERP
Hi, Our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago other domains were added: US (.us), IE (.ie), EU (.eu) & AU (.com.au). Last year in July, we noticed that a few .us domain URLs were showing up in UK SERPs, and we realized the sitemap for the .us site was incorrectly referring to the UK (.com) site, so we corrected that and the .us domain URLs stopped appearing in the SERP. Not sure if this actually fixed the issue or it was just coincidental. However, in the last couple of weeks, more than 3 .us domain URLs are showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; they are far below the UK ones. Has anyone noticed similar behaviour and/or could anyone please help me troubleshoot this issue? Thanks in advance, R
Algorithm Updates | RaksG
-
Should my canonical tags point to the category page or the filter result page?
Hi Moz, I'm working on an ecommerce site with categories, filter options, and sort options – teacherexpress.scholastic.com. Should I have canonical tags from all filter and sort options point to the category page, like gap.com and llbean.com, or have all sort options point to the filtered page URL, like kohls.com? I was under the impression that to use a canonical tag, the pages have to have the same content, meaning that Gap and L.L. Bean would be using canonical tags incorrectly. Using a filter changes the content, whereas using a sort option just changes the order. What would be the best way to deal with duplicate content for this site? Thanks for reading!
Algorithm Updates | DA2013
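For illustration, a hypothetical sketch of the sort-option case described above (URLs are invented, not the actual site): a canonical tag on a sorted URL pointing back to the page it merely reorders would look like this:
<!-- On https://example.com/widgets?sort=price-asc -->
<link rel="canonical" href="https://example.com/widgets" />
Whether filtered pages (which change the content set) should do the same is exactly the judgement call the question raises.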