Multiple H1 tags are OK according to developer. I have my doubts. Please advise...
-
Hi,
My very well known and widely respected developer is using multiple H1 tags, I see - they like using them in their code, and they argue that multiple H1s conform to HTML5 standards. They are resisting a recode to one H1 tag per page.
However, I know this is clearly an issue in Bing, so I don't want to risk it with Google. Any thoughts on whether it's best to avoid multiple H1 tags for Google? (Any evidence and reasoning would be great - I can then put that to my developer...)
Many thanks for your help, Luke
-
I understand. Good reminder.
-
Hi AWC - this is tangential to the topic, but important for Q+A and Moz community participation in general.
Please, in the future, work to be as generous and empathetic in replies as possible. This community is meant to be a haven from many of the nastier corners of the web and while your comment was not excessively insulting, it wasn't kind either. Contributions both big and small are welcome here, as are opinions.
If we're going to maintain the amazing community here, we have to be mindful about the impacts of negativity. Thanks for understanding.
-
I think Ryan's point about HTML5 is good to keep in mind, but the problem is that we don't have any great guidance on what Google thinks about HTML5 right now, at least at this level of detail. They're waiting for the standard to evolve into common practice, just like the rest of us. I suspect, though, that if HTML5 is changing the rules, they may relax how harshly they judge multiple H1s.
-
To be fair, how do you know that they're "spammy", "abusive", or "irrelevant"? I've seen people just use them badly - for example, for CSS styling. Is it a best practice? No. Would I do it? No. Will it have major SEO implications in 2012? Probably not.
I've seen instances where an H1 was used badly, but not in a deliberately spammy or even irrelevant way. Developers often treat tags as much more interchangeable than they should.
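To give a purely hypothetical example of the kind of misuse I mean - reaching for an H1 just to get big, bold text when CSS on an ordinary element would do the same job:
    <!-- Misuse: h1 chosen only for its default large, bold rendering -->
    <h1>Call us today for a free quote!</h1>

    <!-- Better: keep headings for document structure and style promotional text with CSS -->
    <p style="font-size: 2em; font-weight: bold;">Call us today for a free quote!</p>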
-
Nobody said it would tank a site, nor was it asked if it would tank a site. Until the H1 goes the way of meta keywords, it will have some relevance and, in my opinion, should be used properly.
Of the 200-plus algorithm elements, there are undoubtedly plenty of others that are "not a big deal," but that doesn't mean we shouldn't use them correctly.
Whew, there sure has been a lot of time spent on something that's "not a big deal."
-
"No, Google just beat the value out of the H1 to the point it's on life support."
I agree. That's why having 18 of them on one of your pages probably isn't going to tank your site.
I am not advocating more than one H1 tag... just sayin' that I don't think that this is a big deal.
-
I agree. I've been in touch about the developer's work now - it's simply not good practice yet. I've heard that Bing is more definite in its advice on H1s than Google.
-
No, Google just beat the value out of the H1 to the point it's on life support.
Sorry EGOL, but if the innocents had no regard for SEO, they wouldn't be putting a tag on it.
-
Keep in mind that some people innocently use H1 tags for formatting text. These folks are building websites because they have a message to share, without any regard to SEO. And some of these websites pull an enormous amount of traffic because they are built by content-area experts who write with enthusiasm and verve.
I don't think that Google is pulling out a stick to beat these people.
-
Hi Luke,
As you can tell, it touched a nerve. I was looking for the Moz link to a thread regarding this same issue, where Alan (one of the gurus) said multiple H1's can affect engines differently and, if I remember correctly, made reference to a negative response from Bing.
Until H1's achieve the lofty status of meta keywords, I will continue to treat them with some importance and approach them with best practice.
I'll listen for the rumbling coming from your direction.
Good luck.
-
Thanks for the feedback, AWCthreads - 'tis a good question - ho hum - he's just not using them right. I've had this problem with people putting in hidden tags too. They're just not taking Google, etc., into account. Almost screamed as I counted through them yesterday, hee hee.
-
Thanks Jennifer. Yup, doing all that too. I'm paying him on contract, and part of the prob is that if he's using H1s this much, it could end up in a lot of expensive re-programming. I'm gonna stamp my feet, I think. I often wonder whether anyone's tested the impact of such heavy use of H1s. We need an SEOmoz testing lab ;-).
Thanks for your input too, AWCthreads. Some good points there...
-
Hi Jennifer,
If you're going to weigh in, you've got to bring more substance than a regurgitation of Rand's posts on the value of the H1 and how SEO time is best spent.
When my staff runs an SEOmoz on-page optimization report and gets flagged for having 2 H1's on the page (which happened several times today), I didn't say, "Worrying about the H1's on the page is not that big of a deal." Nor did I say, "Make sure the site is crawlable and all those other high-priority things." I described a bit of the history of the H1, its purpose, and best practice considering its value in optimization - which is to say 1 is best, 2 is acceptable, and more than 2 is neither necessary nor best practice. I also added that if it wasn't of some importance, Rand certainly wouldn't have it as an element in his research tools.
Having 18 H1's on a page doesn't just "seem" excessive. It is what it is, which is asinine. That rumbling coming from down the hall is not thunder from above, but me having a visit with a developer and anyone else who thinks 18 H1's is acceptable or merely "seemingly" excessive.
-
18 H1s definitely seems excessive; however, in the grand scheme of things, this would be a much lesser priority in my book than many other things. I mean, if this is the biggest problem, then you're doing quite well. If you're wondering if the developer is doing the right things overall, that might be a different question. I just don't think that worrying about the H1s on the page is that big of a deal. I'd make sure the site is crawlable and all those other high-priority things before I spent too much time on this.
-
I was doing real well until I read this: "I've noticed the developer's used about 18 per page." Multiple H1's are one thing, but excessive, spammy, abusive, irrelevant H1's are another.
Why in the world is he even bothering with an H1 tag if he's got 18 of them? Ask him, "What are you telling the bots with your H1's - 18 different things or the same thing 18 different times?" No wonder the value of the tag has declined so much since its inception. That volume of H1's is what Cutts is referring to in his 2009 video.
Our CMS auto-generates a header H1 tag when mCommerce optimization is enabled. So when I put an H1 on the page for categories and products, the page has multiple H1's. I'd like to have one but will live with two.
-
That definitely sounds like too many H1 tags.
On my pages I have two: one for the site name and the second for the page title. The site name H1 is in my <header> section, while the other is in my content <section>. I wouldn't advise using more than one per section.
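Stripped down, that layout looks roughly like this (a minimal sketch for illustration, not my actual markup):
    <!DOCTYPE html>
    <html>
    <body>
      <header>
        <h1>Site Name</h1>          <!-- first H1: the site name -->
      </header>
      <section>
        <h1>Page Title</h1>         <!-- second H1: the page title -->
        <p>Page content goes here.</p>
      </section>
    </body>
    </html>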
-
Thanks David, Ryan, EGOL, Nakul - really useful feedback
I think I'm erring on the side of caution really, quite simply because any risk is too much risk. I'll read up on HTML5 some more, Ryan, as it sounds like it's changing things a great deal. I've noticed the developer's used about 18 per page, for all headings, which seems quite strange, and possibly incorrect even in HTML5. I mean, blog post headings to tweet headings to... just about every heading.
-
I would look at the pages and ask myself the question: does this page really have more than one "primary" heading? Can you do one primary heading and then sub-headings? If all such options are exhausted and the only way to address the structure and layout of the page is by having multiple H1 tags... do it. But I would do it as a last resort, or when it's absolutely necessary and it makes sense from a user perspective.
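As a rough sketch of what I mean by one primary heading plus sub-headings (the heading text is just placeholder):
    <body>
      <h1>Primary topic of the page</h1>      <!-- the single primary heading -->
      <h2>Blog posts</h2>                     <!-- sub-headings for each page section -->
      <h3>Individual post title</h3>
      <h2>Tweets</h2>
      <h3>Individual tweet heading</h3>
    </body>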
-
I have multiple H1 tags on some of my pages and don't see any problem. Just sharing my observations.
If this is your site and you have concerns about multiple H1s... maybe the developer needs to know that he is being paid by the hour and you are being paid on the basis of results. So if he wants any more hours, he'd better not be messing with your results.
-
I had a conversation about this very topic recently, here is the advice I got:
Headings get totally different treatment in HTML5; we have to throw away everything we knew about this from HTML4/XHTML.
In earlier versions of HTML we only had headings (h1-h6); there were no other sectioning elements at all. That is why we had to be very careful about our usage of the h1 tag, and there was always controversy regarding its usage.
In HTML5 the sectioning is much more powerful. We have a whole bunch of new elements for sectioning, and the algorithm used to generate the outline is far more complex and flexible. In short, it no longer matters how many h1 tags we have on a page.
We must still adhere to a structured approach and be careful to generate the right outline (one that reflects the proper structure of the document), and this is what this theme does.
To conclude and clarify: in HTML5 it doesn't matter if there are multiple h1 tags on a page; what matters is how they are used in conjunction with the other sectioning elements, and that the outline produced represents the correct structure of the document.
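For example, here is a simplified sketch of the kind of structure they described (element names and headings are just illustrative) - multiple h1 tags, each scoped by its own sectioning element:
    <body>
      <header>
        <h1>Site name</h1>               <!-- top level of the outline -->
      </header>
      <article>
        <h1>Blog post title</h1>         <!-- outlined one level below the site heading -->
        <section>
          <h1>Post sub-topic</h1>        <!-- outlined one level below the post title -->
        </section>
      </article>
    </body>
Because the outline algorithm ranks each h1 by the nesting of its sectioning ancestors (body, article, section), the page above is meant to read the same as a traditional h1/h2/h3 hierarchy.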
-
Best practice is to use only one H1 tag per page. You can see a video from Matt Cutts here mentioning that you can have more if done correctly: http://www.youtube.com/watch?v=GIn5qJKU8VM