Don't use an h1 and just use h2s?
-
We just overhauled our site, and as I was auditing the overhaul I noticed that there were no h1s on any of the pages. I asked the company that does our programming why, and he responded that h1s are spammed so much that he doesn't want to put them in; instead he put in h2s. I can't find anything to back this up. I can find that h1s get over-optimized, but nothing that says to skip them altogether. I think he's crazy. Anyone have anything to back him up?
-
I think that basic on-page SEO needs to be followed, meaning you should have one h1 on the page, placed above the fold. That signals to Google the importance of that phrase. The rest should be h2s and h3s, used sparingly; this gives Google something to compare the h1 to. I know some sites make the mistake of putting all their keywords on the main page in h1 tags. That doesn't work and hurts the site in the rankings. I'm not sure what your programmer's thinking is. Maybe he has knowledge that I don't, which is very possible, but from my experience and constant reading of SEO best practices, an h1 implemented correctly helps.
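To make it concrete, here's a rough sketch of the structure I mean (the headings are just placeholder examples, not anyone's real keywords):

<body>
  <!-- One h1 per page, describing the page's main topic -->
  <h1>Handmade Leather Handbags</h1>

  <!-- h2s divide the page into sections; h3s nest beneath them -->
  <h2>Our Materials</h2>
  <p>...</p>

  <h2>Care Guide</h2>
  <h3>Cleaning</h3>
  <p>...</p>
</body>

One h1 that matches the page's topic, a handful of h2s and h3s for structure, and no keyword stuffing in any of them.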
-
I feel the same way. Thanks for the moral support!
-
To be fair, your programmer probably doesn't have a deep understanding of SEO and has probably just misunderstood something he read somewhere.
Yes, h1 tags can be over-optimized and that can have a negative effect, but the simple answer is: don't over-optimize your h1 tags. Just because some people abuse them isn't a good reason to exclude them. In fact, I would actively encourage you not to exclude them, as they are an important part of your on-page SEO strategy.
-
I have very little to back him up.
I appreciate that h1 tags are over-optimized when they are poorly executed (e.g. keyword stuffing, or using more than one h1 per page); however, the same can be said for h2 tags.
If I were in your position, I would ask him to clarify exactly what he sees as an over-optimized h1 tag.
Related Questions
-
Google's Search Intent – Plural & Singular KWs
This is more of a ‘gripe’ than a question, but I would love to hear people’s views. Typically, when you search for a product using the singular and plural versions of a keyword, Google delivers different SERPs. As an example, ‘leather handbag’ and ‘leather handbags’ return different results, but surely the search intent is exactly the same? You’d have thought Google was now clever enough to work this out. We tend to optimise our webpages for both the plural and singular variations of the KWs, but see a mixed bag of results when analysing rankings. Is Google trying to force us to create one unique webpage for the singular version and another for the plural version? That would confuse the visitor and make no sense... the search intent is the same! How do you combat this problem? Many thanks in advance. Lee.
-
Is it bad from an SEO perspective that cached AMP pages are hosted on domains other than the original publisher's?
Hello Moz, I am thinking about starting to utilize AMP for some of my website. I've been researching this AMP situation for the better part of a year and I am still unclear on a few things. What I am primarily concerned with, in terms of AMP and SEO, is whether or not the original publisher gets credit for the traffic to a cached AMP page that is hosted elsewhere. I can see the possible issues with this from an SEO perspective, and I am pretty sure I have read about how SEOs are unhappy about this particular aspect of AMP in other places. On the AMP project FAQ page you can find this, but there is very little explanation: "Do publishers receive credit for the traffic from a measurement perspective? Yes, an AMP file is the same as the rest of your site – this space is the publisher's canvas."
So, let's say you have an AMP page on your website example.com:
example.com/amp_document.html
And a cached copy is served with a URL format similar to this:
https://google.com/amp/example.com/amp_document.html
Then how does the original publisher get credit for the traffic? Is it because there is a canonical tag from the AMP version to the original HTML version? Also, while I am at it, how does an AMP page actually get into Google's AMP Cache (or any other cache)? Does Google crawl the original HTML page, find the AMP version, and then just decide to cache it from there? Are there any other issues with this that I should be aware of? Thanks
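From what I've read so far, the attribution mechanism is a pair of link tags, something like this (URLs are placeholders based on my example above):

<!-- On the original HTML page (example.com/article.html): -->
<link rel="amphtml" href="https://example.com/amp_document.html" />

<!-- On the AMP page (example.com/amp_document.html): -->
<link rel="canonical" href="https://example.com/article.html" />

My understanding is that Google discovers the AMP version through the rel="amphtml" link when it crawls the original page, caches it from there, and uses the rel="canonical" link to attribute the page back to the publisher, but I'd love confirmation.
-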
Does using http://schema-creator.org boost rankings?
Hello all, if we use http://schema-creator.org for structured HTML, will it increase our rankings too? Does it have any benefit for SEO?
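From what I can tell, tools like that just generate standard schema.org microdata, something roughly like this (all names and values here are placeholders):

<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jane Doe</span>
  <span itemprop="jobTitle">SEO Consultant</span>
  <a itemprop="url" href="http://example.com">example.com</a>
</div>

My impression is that the markup itself isn't a direct ranking factor and mainly earns rich snippets (and therefore CTR), but I'd appreciate confirmation.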
-
Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that have been outdated for 10-15+ years. I decided not to put redirects on some of the irrelevant pages; people still hit the pages, but bounce... I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content. If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, and the content is still valid for people looking for it, but maybe these old pages wouldn't show up above my new content? Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated. Thx 😄
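If I did go the sitemap route, my understanding is a separate legacy sitemap would look something like this (the URL is a placeholder, and I gather Google treats the priority value as a hint at best):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/old-page-from-1998.html</loc>
    <priority>0.0</priority>
  </url>
</urlset>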
-
Any SEO thoughts about Google's new Data Highlighter for products?
After searching around on the web for a while, I couldn't find any case studies or interesting posts about Google's new feature for highlighting structured data. In Google Webmaster Tools you can now tag your products to be displayed as structured data in Google's search results. Two questions arose immediately: 1. What effect will Google's new Data Highlighter for products have on your SEO? Can we expect better CTRs for product-page results in Google? Better conversion rates, perhaps? Any case studies that show KPI improvements after using structured data for products? 2. I would love to see some examples in the search results of what product pages would look like after Data Highlighting. Your thoughts or input on this subject will be much appreciated.
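For context, my understanding is that the Data Highlighter doesn't change your pages at all; it just teaches Google where your product fields are, roughly the equivalent of marking pages up like this sketch (all values are placeholders):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
</div>

If anyone has before/after CTR numbers for either approach, I'd love to see them.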
-
Search bots that use referrers?
Can someone point me to a list or just tell me specific search bots that use referrers?
-
"Revisit-after" Metatag = Why use it?
Hi Mozfans, I've been thinking about the robots revisit metatag. All pages on my website (200+ pages) have the following tag on them:
<meta name="revisit-after" content="7 days" />
I'm wondering what the purpose of the tag is. Surely it's best to allow robots (such as Googlebot or Bingbot) to crawl your site as often as possible, so the index and rankings get updated as quickly as possible? Thanks in advance everyone! Ash
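From what I can find, none of the major engines document any support for revisit-after, so presumably it does nothing either way. The crawl-related directives they do document look like this (example values):

<!-- Ignored by Googlebot and Bingbot: -->
<meta name="revisit-after" content="7 days" />

<!-- Honored robots meta directives: -->
<meta name="robots" content="index, follow" />

# In robots.txt (Crawl-delay is honored by Bing but ignored by Google):
User-agent: bingbot
Crawl-delay: 5

Happy to be corrected if anyone knows an engine that actually reads revisit-after.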
-
Shouldn’t Google always rank a website for its own unique, exact 10+ word content, such as a whole sentence?
Hello fellow SEOs, I'm working with a new client who owns a property-related website in the UK.
Recently (May onwards) they have experienced significant drops in nearly all non-domain/brand-related rankings, from page 1 to page 5+ or worse. Please see the attached Webmaster Tools traffic graph; the 13th of June seemed to have the biggest drop (UK Panda update?). When we copy and paste individual 20+ word sentences from the top-level content, Google does bring up exact results (so the content is indexed), but the client's site nearly always appears at the bottom of the SERPs. Even very new or small 3-4 page domains that have clearly copied all of their content outrank the original content on the client's site. As I'm sure you know, this is very annoying for the client! And this happens even when Google's cache date (that appears next to the results) for the client's content is clearly older than the other results'! The only major activity was the client using Google Optimizer, which redirects traffic to various test pages. These tests finished in June.
Details about the client's website:
- The domain has been around for 4+ years
- The website doesn't have a huge amount of content, around 40 pages; I would consider 50% original, 20% thin and 30% duplicate (working on fixing this)
- There haven't been any significant sitewide or page changes
- Webmaster Tools shows nothing abnormal and no error messages (some duplicate meta/title tags are being fixed)
- All the pages of the site are indexed by Google
- Domain/page authority is above average for the niche (around 45 for the domain in OSE)
- There are no ads of any kind on the site
- There are no special scripts or anything fancy that could cause problems
I can't seem to figure it out. I know the site can be improved, but such a severe drop, where even very weak domains are outranking it, suggests a penalty of some sort? Can anyone help me out here?
[Attachment: hxuSn.jpg - Webmaster Tools traffic graph]
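One step I'm planning while we fix the duplicate 30% is to point each duplicate at its preferred version with rel=canonical, roughly like this (the URL is a placeholder, not from the client's site):

<!-- In the <head> of each duplicate or near-duplicate page: -->
<link rel="canonical" href="http://example.com/preferred-page.html" />

Would that be enough, or does a drop this severe point to something else?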