Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Is it necessary to have unique H1's for pages in a pagination series (i.e. blog)?
-
A content issue that we're experiencing includes duplicate H1 issues within pages in a pagination series (i.e. blog). Does each separate page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this?
Any insight would be appreciated. Thanks!
-
Read what EGOL wrote. It depends upon the nature of your blog pagination. There are a few reasons you could have pagination within the blog area of your site:
1. Your articles have "next" buttons and different parts of the article are split across multiple URLs, so the content across the paginated elements is distinct.
2. Your post feeds are paginated, purely so people can browse to pages of 'older posts' and see what you wrote way back in your archives.
3. Your blog posts exist on a single URL, but when users comment on your posts, your individual posts gain paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).
In the case of #2 or #3 it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In the case of #1 you should make the effort!
-
-
This is very true for multi-section articles (which span multiple URLs), and less true of articles which live at a single URL but break out into additional addresses through UGC comment-based pagination.
-
I wouldn't worry about it as search bots "should" understand that these pages are part of a paginated series.
However, I would recommend you ensure that "rel=next/prev" is properly implemented (even though Google announced in 2019 that it no longer uses these tags as an indexing signal). Once the pagination is properly implemented & understood, bots will see the pages as a continuation of a series, and therefore will not see duplicate H1s as a problem.
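As a hedged illustration of that implementation, here is a small Python sketch that builds the rel="prev"/rel="next" <link> tags for one page of a paginated series. The URL pattern and function name are my own for illustration, not something prescribed in this thread:

```python
# Hypothetical sketch: building rel="prev"/rel="next" link tags for a
# paginated blog archive. The /page/N URL pattern is an assumption.

def pagination_links(base_url, page, last_page):
    """Return the <link> tags to put in the <head> of one page in a series."""
    def url(n):
        # Page 1 lives at the base URL itself in this made-up scheme.
        return base_url if n == 1 else f"{base_url}/page/{n}"

    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{url(page - 1)}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{url(page + 1)}">')
    return tags

for p in (1, 2, 17):
    print(p, pagination_links("https://example.com/blog", p, 17))
```

The first page emits only a "next" tag and the last page only a "prev" tag, which is the shape crawlers that still read these hints expect.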
-
In some instances, not using unique <h1> and unique <title> tags is a huge opportunity loss.
Let's say you have a fantastic article about Widgets and you break it up over several pages. The sections of your article are:
- wooden widgets
- metal widgets
- plastic widgets
- stone widgets
... if you make custom <h1> and <title> tags for these pages (and post them on unique URLs) you are going to get your article into a lot more SERPs and haul in a lot more traffic.
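A minimal Python sketch of that idea; the slugs, URL layout, and title pattern below are illustrative assumptions, not something from the answer itself:

```python
# Hypothetical: give each section of a multi-page article its own URL,
# <title>, and <h1>, so each page can rank for its own widget variant.
SECTIONS = ["Wooden Widgets", "Metal Widgets", "Plastic Widgets", "Stone Widgets"]

def section_head(section, article="A Guide to Widgets"):
    """Return the URL slug, <title>, and <h1> for one section page."""
    slug = section.lower().replace(" ", "-")
    return {
        "url": f"/widgets/{slug}/",
        "title": f"{section} | {article}",
        "h1": section,
    }

for s in SECTIONS:
    print(section_head(s))
```

Each section page ends up with a distinct, descriptive title and heading instead of four pages all titled the same.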
-
Best practice is a unique H1 - only one H1 to describe a page.
-
Don't worry about it. You're not trying to rank your /blog/2 or /blog/17 for any specific terms. Those pages are pretty much for site visitors, not the search engines.
As an example, Moz has the same h1 tag on all their blog pages.
Paginated URLs such as /blog/2 and /blog/17 all have "The Moz Blog" as the h1 tag.
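One quick way to verify that yourself is to collect the <h1> text from each paginated URL. A standard-library-only Python sketch follows; the HTML is inlined here for illustration, and in practice you would fetch each page's HTML first:

```python
# Collect <h1> text from HTML using only the standard library.
from html.parser import HTMLParser

class H1Collector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.h1s = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
            self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.h1s[-1] += data

def h1s(html):
    parser = H1Collector()
    parser.feed(html)
    return [t.strip() for t in parser.h1s]

# Stand-in pages; swap in fetched HTML for a real audit.
pages = {
    "/blog/": "<h1>The Moz Blog</h1>",
    "/blog/2": "<h1>The Moz Blog</h1>",
}
print({url: h1s(html) for url, html in pages.items()})
```

If every page in the series returns the same single heading, you are looking at exactly the duplicate-H1 situation discussed above.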
Related Questions
-
Key webpage fluctuating between page 2 and page 6 of Google SERP
Hi, We have found that one of our key webpages has been fluctuating between page 2 and page 6 of Google SERP for around 2 weeks. Some days it will be on page 6 in the morning and then page 2 in the afternoon. We have recently updated some copy on the page and wondered if this could be the cause. Has anyone else experienced this? If so how long was it before the page settled? https://www.mrisoftware.com/uk/products/property-management-software/ Thanks.
-
Are Meta-descriptions important for blogs?
I am tasked with optimizing an existing site's SEO. I have added meta descriptions to all the menu pages; however, they have a blog section with over 700 posts. How important are meta descriptions when it comes to a website's blog? Do I need to take the time to go through 700+ blog posts and create unique meta descriptions for each one?
-
Can 'Jump link'/'Anchor tag' URLs rank in Google for keywords?
E.g. www.website.com/page/#keyword-anchor-text Where the part after the # is a section of the page you can jump to, and the title of that section is a secondary keyword you want the page to rank for?
-
Dates appear before home page description in the SERPs- HUGE drop in rankings
We have been on the first page of Google for a number of years for search terms including 'SEO Agency', 'SEO Agency London' etc. A few months ago we made some changes to the design of the home page (added a blog feed), and made changes to the website sitemap. Two days ago (two months after the last site changes were made) we dropped substantially in the SERPs for all home page keywords. Where we are found, a date appears before the description in the SERPs, dating February 2012 (which is when we launched the original website). The site has been through a revamp since then, yet it still shows 2012. This has been followed by a few additional strange things, including the sitelinks that Google is choosing to show (which include author bio pages showing in homepage sitelinks), and googling our brand name no longer brings up sitelinks in the SERPs. The problem only affects the home page. All other pages are performing as standard. When Penguin 4.0 came out we saw a noted improvement in our SERP performance, and our backlinks are good and quality, largely from PR efforts. Of course, I would be interested in additional pairs of eyes on the backlinks to see if anyone thinks that I have missed anything! We have 3 of our senior SEOs working on trying to figure out what is going on and how to resolve it, but I would be very interested if anyone has any thoughts?
-
US domain pages showing up in Google UK SERP
Hi, Our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago other domains were added - US (.us), IE (.ie), EU (.eu) & AU (.com.au). Last year in July, we noticed that a few .us domain URLs were showing up in UK SERPs, and we realized the sitemap for the .us site was incorrectly referring to UK (.com), so we corrected that and the .us domain URLs stopped appearing in the SERP. Not sure if this actually fixed the issue or was coincidental. However, in the last couple of weeks more than 3 .us domain URLs are showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double checked the PA for US pages; they are far below the UK ones. Has anyone noticed similar behaviour &/or could anyone please help me troubleshoot this issue? Thanks in advance, R
-
How long does it take for a new website to start showing in the SERPs?
I launched my website about 6 weeks ago. It was indexed fairly quickly, but it is not showing up in the Google SERPs. I did do the on-page SEO and followed the best practices for my website. I have also been checking webmaster tools and it tells me that there are no errors with my site. I also ran it through the seomoz on-page SEO analyzer and again no real big issues. According to seomoz I had 1 duplicate content issue with my blog posts, which I corrected. I understand it takes some time, but any ideas of how much time? And f.y.i. it's a Canadian website, so it should be a lot easier to rank as well. Could my site be caught in the Google 'sandbox effect'? Any thoughts on this would be greatly appreciated.
-
Vanity URLs and HTTP codes
We have a vanity URL that, as recommended, is using a 301 HTTP code; however, it has been discovered the destination URL needs to be updated, which creates a problem since most browsers and search engines cache 301 redirects. Is there a good way to figure out when a vanity URL should be a 301 vs 302/307? If all vanity URLs should use 301, what is the proper way of updating the destination URL? Is it a good rule of thumb that if the vanity URL is only going to be temporary and down the road could have a new destination URL, use 302, and all others 301? Cheers,
-
Stop google indexing CDN pages
Just when I thought I'd seen it all, google hits me with another nasty surprise! I have a CDN to deliver images, js and css to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of that I can have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beat it? (Of course the next thing is Roger might look at google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the google forums is that others have asked this question many times and nobody at google has bothered to answer, over the past 5 months, and nobody who did try, gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.