Is it necessary to have unique H1's for pages in a pagination series (i.e. blog)?
-
A content issue that we're experiencing includes duplicate H1 issues within pages in a pagination series (i.e. blog). Does each separate page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this?
Any insight would be appreciated. Thanks!
-
Read what EGOL wrote. It depends on the nature of your blog pagination. There are a few reasons you could have pagination within the blog area of your site:
-
1. Your articles have 'next' buttons and different parts of the article are split across multiple URLs. The content across the paginated pages is distinct.
-
2. Your post feeds are paginated, purely so people can browse to pages of 'older posts' and see what you wrote way back in your archives.
-
3. Your blog posts exist on a single URL, but when users comment on your posts, each post gains paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).
In cases #2 and #3 it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In case #1 you should make the effort!
-
-
This is very true for multi-section articles (which span multiple addresses), and less true of articles which live at a single address but break down into multiple URLs only through UGC comment-based pagination.
-
I wouldn't worry about it, as search bots "should" understand that these pages are part of a paginated series.
However, I would recommend you ensure that rel="next"/rel="prev" is properly implemented (despite Google announcing that they no longer use it as an indexing signal). Once the pagination is properly implemented and understood, bots will see the pages as a continuation of a series, and therefore will not treat duplicate H1s as a problem.
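As a minimal sketch of that markup (the URLs here are hypothetical, and note rel="next"/rel="prev" remains valid HTML even though Google no longer uses it as an indexing signal), each page in a paginated feed points at its immediate neighbours in the `<head>`:

```html
<!-- Hypothetical example: the <head> of example.com/blog/page/3 -->
<link rel="prev" href="https://example.com/blog/page/2">
<link rel="next" href="https://example.com/blog/page/4">
<!-- The first page of the series omits rel="prev"; the last omits rel="next". -->
```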
-
In some instances, not using a unique <h1> and a unique <title> is a huge opportunity loss.

Let's say you have a fantastic article about Widgets and you break it up over several pages. The sections of your article are:

- wooden widgets
- metal widgets
- plastic widgets
- stone widgets

... if you make custom <h1> and <title> tags for these pages (and post them on unique URLs) you are going to get your article into a lot more SERPs and haul in a lot more traffic.
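To illustrate the approach above, here is a rough sketch of what two of those section pages might carry (the URLs and title wording are hypothetical, not from the original answer):

```html
<!-- Hypothetical: example.com/widgets/wooden -->
<title>Wooden Widgets: Types and Uses | Widget Guide</title>
<h1>Wooden Widgets</h1>

<!-- Hypothetical: example.com/widgets/metal -->
<title>Metal Widgets: Types and Uses | Widget Guide</title>
<h1>Metal Widgets</h1>
```

Each section page then has a chance to rank for its own query ("wooden widgets", "metal widgets", etc.) rather than all pages competing under one generic heading.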
-
Best practice is a unique H1, and only one H1 per page, describing that page.
-
Don't worry about it. You're not trying to rank your /blog/2 or /blog/17 for any specific terms. Those pages are pretty much for site visitors, not the search engines.
As an example, Moz uses the same H1, "The Moz Blog", across all of its paginated blog pages.