H2s are already ranking well. Should I rock the boat?
-
I recently began work for a company and discovered that they are not using H1s (only H2s), yet they rank in the top 5 for ~90% of their keywords.
The site is one of the original players in their industry, has massive amounts of domain authority and tens of thousands of linking root domains. However, they are currently being beaten on some of their top keywords by a few of their younger competitors.
Moving their current H2 text into H1 tags could be helpful, but to what extent? Since they already rank well for so many competitive keywords, is it worth rocking the boat by moving their H2 text into H1 tags and risking their current rankings?
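For context, the change I'm weighing is just a markup swap like this (a simplified illustration; the heading text is a placeholder):

<!-- Before: the page's main keyword heading is an H2 and there is no H1 -->
<h2>Main Keyword Phrase</h2>

<!-- After: the same text promoted to the page's single H1 -->
<h1>Main Keyword Phrase</h1>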
-
Thanks for taking the time to answer my question, Claudio.
-
Thanks for taking the time to detail your explanation, Nakul. Your method is a good one for testing. Cheers!
-
Thanks for your feedback and experience, Mark. I appreciate it.
-
I really appreciate your encouragement, Brad. Your experience provides me with some hope of further boosting organic rankings and besting the competition. Thanks for sharing.
-
I've been doing SEO for a decent amount of time. You would think I would remember "when in doubt, test it out."
It's funny how, when you're in the thick of things, the most obvious answers can elude you. This is why I love the Moz community. Thanks for the reminder, Ade!
-
I agree with all of the others, especially about testing on a few keywords that are not mission-critical.
In my experience, the sites I have optimized were not affected much by H2 tags. The factors that really made a difference were title tags, H1s, and page interlinking using anchor text (not over-optimized; vary it).
There are 200+ ranking factors (per the last figure I heard), and obviously having all of them work together in synergy is best.
The good news is that you already have domain authority, etc. I would think the change definitely won't hurt and may even help slightly.
-
Hi Collin,
I recently did this for a client site of mine (alas, it didn't have a huge amount of domain authority). The results were very pleasing: we noticed a jump in keyword rankings simply from switching from H2 to H1.
My advice: definitely go ahead with testing, as the others have suggested. You've got nothing to lose!
Thanks,
Brad
-
Dear Collin,
I agree 100% with my mates, but I want to add some of my own experience. For years I have done the following:
1. The exact content of your title tag (I assume you're using your most important keywords) should also appear in the H1 tag, and that H1 tag should be as close to the top of the page as possible (prominence). Use only one H1 tag per page.
2. The H2 tag is optional, and you can use it two or more times (no bother). I usually use two H2 tags with keywords related to my H1 (the main keyword).
In recent months I have felt that the H2 is not as important as it used to be, and Google now frowns on "over-optimization," meaning pages with perfect keyword distribution, prominence, and density.
So use your primary keyword in the title tag and the H1 tag, and optionally in H2 tags; the key is original and useful content for visitors.
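A minimal sketch of the layout I mean (the keyword text and page structure are placeholders, not a definitive template):

<head>
  <!-- Title tag and H1 carry the same primary keyword phrase -->
  <title>Primary Keyword Phrase</title>
</head>
<body>
  <!-- One H1 only, as close to the top of the body as the template allows -->
  <h1>Primary Keyword Phrase</h1>
  <!-- Optional H2s with keywords related to the main one -->
  <h2>Related Keyword One</h2>
  <p>Original, useful content for visitors...</p>
  <h2>Related Keyword Two</h2>
  <p>More original content...</p>
</body>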
Hope this helps!
Claudio
-
I agree with Ade. Just test it on a small scale and see what the results look like. I would suggest you try a couple of different options:
1. Keywords Currently in Positions 2-5
2. Keywords Currently in Positions 6-10
3. Keywords Currently on Page 2
Find five keywords in each category, make the changes, and watch to see where the impact is greatest. Make sure you are not directly influencing the test with any other changes (as much as you can control) so you get an accurate read on the test.
-
Hi Collin,
Why don't you take it slowly and carry out some testing?
I would choose a few pages that rank well for some less important keywords, switch those headings to H1s, leave it for a few weeks, and see how it goes.
-
Related Questions
-
Will a Robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi - I have a client that had thousands of dynamic PHP pages indexed by Google that shouldn't have been. He has since blocked these PHP pages via a robots.txt disallow. Unfortunately, many of those PHP pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the PHP disallow. If we create 301 redirects for some of these PHP URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the PHP pages? Thanks very much, V
Technical SEO | Voodak -
What to do about removing pages for the 'offseason' (i.e., the same URL will be brought back in 6-7 months)?
I manage a site for an event that runs annually, and now that the event has concluded we would like to remove some of the pages (schedule, event info, TV schedule, etc.) that won't be relevant again until next year's event. That said, if we simply remove those pages from the web, I'm afraid that we'll lose out on valuable backlinks that already exist, and when those pages return they will have the same URLs as before. Is there a best course of action here? Should I redirect the removed pages to the homepage for the time being using a 302? Is there any risk there if the 'temporary' period is ~7 months? Thanks in advance.
Technical SEO | KTY55 -
Can't manage to get our site back in the rankings
My URL is: http://tinyurl.com/nslu78 Hi, I really hope someone can help, because my site seems to have been penalized since last year. We are doctors, not SEO experts, and we wanted to do things in a white-hat way, so we handed our SEO strategy (on-site and off-site) to top US SEO agencies, and now we are penalized. We were ranking on the first page for 15 keywords, and now we are not even in the first 10 pages. I know that our sector looks suspicious, but we are a real laboratory and our site is 100% transparent. I understand that a lot of people may think we are all the same, but this is not true. We are not a virtual company that doesn't even show its name or address; we show our name, address, phone number, fax, email, chat service, VAT number, everything, so please help us. We have spent 3 months analysing every paragraph of the Google guidelines to see if we were violating some rule such as hidden text, link schemes, redirections, keyword stuffing, malware, duplicate content, etc., and found nothing except little things, but maybe we are not good enough to find the problem. In 3 months we have gone from 85 toxic links to 24, and from 750 suspicious links to 300. We have emailed and called all the webmasters of each site several times to try to remove as many bad links as possible. We have sent Google a big spreadsheet with all our results and attempts to remove those bad links. We have then sent a reconsideration request explaining all the things we have verified on-site and off-site, but it seems it didn't work, because we are still penalized. I really hope someone can see where the problem is. Thank you.
Technical SEO | andromedical -
'External nofollow' in a robots meta tag? (advertorial links)
I believe this has never worked? It'd be an easy way of preventing any penalties from Google's recent crackdown on paid links via advertorials. When it's not possible to nofollow each external link individually, what are people doing? Nofollowing and/or noindexing the whole page?
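To illustrate, these are the two mechanisms I'm comparing (markup only; as far as I know there is no 'external'-scoped value for the robots meta tag):

<!-- Page-level: asks engines not to follow ANY link on the page, internal links included -->
<meta name="robots" content="nofollow">

<!-- Link-level: nofollow applied to a single advertorial link, leaving the rest of the page alone -->
<a href="http://advertiser.example.com/" rel="nofollow">sponsored link</a>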
Technical SEO | Alex-Harford -
Why is the report telling me I have duplicate content for 'www' and no subdomain?
I am getting duplicate content warnings for most of my pages. When I look into your reports, the 'www' and 'no subdomain' versions are the culprit. How can I resolve this, as www.domain.com/page and domain.com/page are the same page?
Technical SEO | cpisano -
Best Practices for Adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools, which I am trying to avoid if possible. I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure of the best approach. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are two approaches I was considering: 1. Just include the dynamic product URLs' parent path (http://www.xyz.com/products/) within the same sitemap as the static URLs - this way spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR 2. Create a separate automated sitemap that updates whenever a product is updated, with the change frequency set to hourly - this way spiders always have as close to an up-to-date sitemap as possible when they crawl. I look forward to hearing your thoughts, opinions, suggestions, and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs -
SEOMoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages?
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings, SEOMoz indicates the number of pages with duplicate content, yet when I go to view them, the subsequent page says no pages were found... Any ideas are greatly welcome! Thanks, Marty K.
Technical SEO | MartinKlausmeier -
Switching Site to a Domain Name that's in Use
I'm comfortable with the steps of moving a site to a new domain name as recommended by Google. However, in this case, the domain name I'm asked to move to is not really "new" ... meaning it's currently hosting a website and has been for a long time. So my question is, do I do this in steps and take the old website down first in order to "free up" the domain name in the eyes of search engines and avoid large numbers of 404s, and then (in step 2) switch to the "new" domain in a few months? Thanks.
Technical SEO | R2iSEO