Is it necessary to have unique H1s for pages in a pagination series (e.g. a blog)?
-
A content issue we're experiencing is duplicate H1s across pages in a pagination series (e.g. a blog). Does each page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this?
Any insight would be appreciated. Thanks!
-
Read what EGOL wrote. It depends on the nature of your blog pagination.
There are a few reasons you could have pagination within the blog area of your site:
-
1. Your articles have "next" buttons and different parts of the article are split across multiple URLs. The content across the paginated elements is distinct.
-
2. Your post feeds are paginated, purely so people can browse to pages of "older posts" and see what you wrote way back in your archives.
-
3. Your blog posts exist on a single URL, but when users comment on your posts, your individual posts gain paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).
In the case of #2 or #3, it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In the case of #1, you should make the effort!
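To make #2 concrete, here's a minimal sketch (the URL and wording are hypothetical) of an archive page where reusing the H1 is fine, with only the <title> hinting at the page number:

```html
<!-- Hypothetical /blog/page/2/ — an "older posts" archive page -->
<title>The Example Blog - Page 2</title>
<h1>The Example Blog</h1>
```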
-
This is very true for multi-section articles (which span multiple URLs), and less true of articles which live at a single URL but break out into multiple addresses through UGC comment-based pagination.
-
I wouldn't worry about it, as search bots "should" understand that these pages are part of a paginated series.
However, I would recommend you ensure that rel=next/prev is properly implemented (despite Google announcing that they no longer support it). Once the pagination is properly implemented and understood, bots will see the pages as a continuation of a series, and therefore won't treat duplicate H1s as a problem.
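For reference, here's a minimal sketch of that markup, assuming a hypothetical /blog/page/2/ URL in a three-page series; both tags belong in the page's <head>:

```html
<!-- In the <head> of the hypothetical https://www.example.com/blog/page/2/ -->
<link rel="prev" href="https://www.example.com/blog/" />
<link rel="next" href="https://www.example.com/blog/page/3/" />
```

The first page of the series would carry only the rel=next tag, and the last page only rel=prev.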
-
In some instances, not using a unique <h1> and a unique <title> is a huge opportunity loss.
Let's say you have a fantastic article about Widgets and you break it up over several pages. The sections of your article are:
- wooden widgets
- metal widgets
- plastic widgets
- stone widgets
... if you make custom <h1> and <title> tags for these pages (and post them on unique URLs) you are going to get your article into a lot more SERPs and haul in a lot more traffic.
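For illustration only (the URLs and title wording below are hypothetical), the split might look like this:

```html
<!-- https://www.example.com/widgets/wooden/ -->
<title>Wooden Widgets: The Complete Widget Guide | Example.com</title>
<h1>Wooden Widgets</h1>

<!-- https://www.example.com/widgets/metal/ -->
<title>Metal Widgets: The Complete Widget Guide | Example.com</title>
<h1>Metal Widgets</h1>
```

Each section can then rank for its own queries ("wooden widgets", "metal widgets") instead of all four sections competing under a single generic title.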
-
Best practice is a unique H1, and only one H1, to describe each page.
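A quick sketch of that structure (headings invented for illustration), with a single H1 describing the page and H2s for its sections:

```html
<h1>A Guide to Garden Sheds</h1>
<h2>Choosing a size</h2>
<h2>Wood vs. metal</h2>
```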
-
Don't worry about it. You're not trying to rank your /blog/2 or /blog/17 pages for any specific terms. Those pages are pretty much for site visitors, not the search engines.
As an example, Moz uses the same H1, "The Moz Blog", across all of its paginated blog pages.