Not using H1s with keywords, to simulate natural, non-SEO'd content?
-
There has been a lot of talk lately about making a website seem like it is not SEO'd, to avoid over-optimization penalties from the recent Google algorithm updates.
Has anyone come across the practice of deliberately not using headings (H1s, H2s, etc.) properly, to make it look like the current webpage isn't over-optimized?
I've come across a site that used to use multiple keywords within their headings and now uses none. In fact, they are marking up their company name and logo as an H1, with non-keyworded H2s such as "Our Work" or "Contact".
Is anyone holding back on their old SEO tactics so as not to seem over-optimized to Google?
Thanks!
-
In that case, I've seen a few people try it with no notable difference. Pre-Penguin there were a few cases here where removing several instances of a keyword from the body seemed to dramatically improve rankings, but that's more removing keyword stuffing than optimising your page to appear unoptimised.
Right now, if your keyword can be there and it reads naturally, then I don't see much reason for it not to be there. In contrast, if your whole page is about blue widgets and the heading /doesn't/ include blue widgets, you'll be confusing people. People also link using the heading/title occasionally, so you could pick up a few genuinely natural links with that heading.
At least as far as Penguin goes, it seems much more link-anchor oriented right now.
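On the keyword-stuffing point, a minimal Python sketch of how one might measure keyword density on a block of page copy (the sample text and any threshold you'd apply against it are invented for illustration, not anything Google has published):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Stuffed-sounding copy: 4 of 15 words are the keyword (~26.7%)
copy = ("Blue widgets from our blue widgets store. "
        "Buy blue widgets today, because widgets are great.")
print(round(keyword_density(copy, "widgets"), 1))  # → 26.7
```

The point of the answer above is exactly that trimming an obviously inflated number like this back toward natural prose is stuffing removal, not "de-optimisation".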
-
Yes, thanks. My question wasn't about stopping using them at all. My question was whether people have stopped using keywords in them, to simulate a webpage not being overly optimized.
-
Considering that having an h1 as the page's main heading and using h2–h6 for subheadings is proper HTML (or multiple h1s with sections in HTML5), I'd never stop doing it in hopes of keeping an SEO advantage that may or may not be lost with algo updates.
Most sites at the very least have an h1 as their main heading; there's nothing over-optimised about that unless you then keyword-stuff it or something like that.
Basically, using an h1 for your main heading isn't an SEO tactic, it's what it's actually for.
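That "one main heading, then subheadings" structure is easy to audit programmatically. A minimal sketch using Python's built-in html.parser, with invented sample markup mirroring the site described in the question (logo/company name as the only H1, generic H2s):

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for h1-h6 tags in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._level = None   # level of the heading currently open, if any
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._level = int(tag[1])
            self._buf = []

    def handle_data(self, data):
        if self._level is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.headings.append((self._level, "".join(self._buf).strip()))
            self._level = None

page = """
<h1>Blue Widgets Co</h1>
<h2>Our Work</h2>
<h2>Contact</h2>
"""
outline = HeadingOutline()
outline.feed(page)
print(outline.headings)
# → [(1, 'Blue Widgets Co'), (2, 'Our Work'), (2, 'Contact')]
```

An outline like this makes the pattern from the question visible at a glance: a single brand-name H1 followed by non-keyworded H2s.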
Related Questions
-
Changes picked up in the SERPs: how long do I have to wait until I can rely on the (new) position?
I changed different things on a particular page (mainly reduced the exaggerated keyword density --> spammy). I had it recrawled by Google (Search Console). The new version has now been integrated in the SERPs. Question: are my latest changes (the crawled page in the SERPs is now 2 days old) already reflected in my actual position in the SERPs, or should I wait some time (how long?) to evaluate the effect of my changes? Can I rely on the current position or not?
On-Page Optimization | Cesare.Marchetti
-
Javascript(0) extension causing an excess of 404s
For some reason I am getting a duplicate version of my URLs with /javascript(0) at the end. These are creating an abundance of 404 errors. I know I am not supposed to block JS files, so what is the best way to block these? Ex: http://www.jasonfox.me/infographics/page/8/javascript(0) is a 404; http://www.jasonfox.me/infographics/page/8/ is not. Thank you.
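A hedged sketch of one way to handle duplicates like these: map each junk URL back to its canonical form, e.g. when building a redirect map. The regex assumes, from the example URLs above, that the junk is always a trailing /javascript(0) segment:

```python
import re

# Strip a trailing "/javascript(0)" segment (pattern assumed from the examples)
JUNK_SUFFIX = re.compile(r"/javascript\(0\)$")

def canonical(url: str) -> str:
    """Return the canonical URL for a /javascript(0)-suffixed duplicate."""
    return JUNK_SUFFIX.sub("/", url)

print(canonical("http://www.jasonfox.me/infographics/page/8/javascript(0)"))
# → http://www.jasonfox.me/infographics/page/8/
```

Already-clean URLs pass through unchanged, so the same function can be run over a full crawl export safely.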
On-Page Optimization | jasonfox.me
-
Properly changing title, URL and content for new keywords without harming other rankings
Hello - We are looking to bring up some keywords in the SERPs that we currently rank fairly low for. We sell Christening clothing for children, and people use both "Christening" and "Baptism" to search for the same thing. We currently rank very high for Christening (#1 on Google for certain combinations) but fairly low for Baptism. I am trying to figure out the best way to start getting Baptism up by changing some titles, URLs and page content to include more Baptism keywords. My concern is messing with what exists, because we rank so well for Christening. Since we are ecommerce we can vary this quite a bit across our products, but again I'm nervous of changing the wrong things, or too many products, and in the process of trying to raise one set of keywords (Baptism) harming the other set (Christening). Any advice would be appreciated!
On-Page Optimization | BabyBeauBelle
-
Can soft 404s hurt my rankings?
This post mainly pertains to soft 404s. I recently dropped a few ranks for my main keyword, which I had maintained at its better rank for well over 2 years. I participate in NO BLACKHAT and obtain links naturally. I want to describe a few issues that happened before the ranking drop and see what you guys think. About a week before my ranks dropped I started to receive DNS issues in GWT. It was weird, because when I used Google Fetch on those pages they would return just fine, so I was not sure what was happening there. I use Google's PageSpeed service, which did in fact decrease my load time (YEAH!!), so that was cool. About a week before the drop I also enabled GoDaddy's Website Accelerator, thinking that could help even more. Because of this I thought maybe it had something to do with the DNS issues, so I decided to turn off GoDaddy's accelerator and just leave the Google PageSpeed service on; I figure I don't need two of them anyway, IMO. At the same time I started to receive a ton (31,000+) of HTML errors for duplicate meta descriptions and titles. I discovered an error in my code which was outputting 2 different sets of descriptions and titles for each of these pages. I have since fixed the issue and am waiting for Google to re-index those pages. Here is where I think the drop may have come from: some months ago (maybe 2) I decided to redirect my 404s to my homepage. Yes, I know this is bad, and I have since created a proper 404 page which returns the 404 code. I recently started getting a ton of soft 404 errors in GWT, which is what brought my attention to this issue. My question is: could redirecting users to my homepage, which obviously returned a 200 on pages that did not exist, be the culprit in my ranks dropping?
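For context, a "soft 404" is a page that returns HTTP 200 but is effectively a not-found page, which is exactly what the old redirect-everything-to-homepage setup produced. A minimal sketch of the heuristic on made-up data (the phrase check and homepage comparison are illustrative, not how Google actually classifies them):

```python
def is_soft_404(status: int, body: str, homepage_body: str) -> bool:
    """Heuristic: a 200 response that is really a not-found page."""
    if status != 200:
        return False  # a real 404/410 is not a *soft* 404
    looks_missing = "page not found" in body.lower()
    is_homepage_copy = body.strip() == homepage_body.strip()
    return looks_missing or is_homepage_copy

home = "<h1>Welcome to our shop</h1>"
print(is_soft_404(200, home, home))              # old setup: 200 + homepage copy → True
print(is_soft_404(404, "Page not found", home))  # proper 404 → False
```

The fixed setup described above (a dedicated error page that actually returns a 404 status code) fails both branches of the heuristic, which is the desired outcome.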
On-Page Optimization | cbielich
-
Dates in URLs
I have an issue with duplicate content errors and duplicate page titles which is penalising my site. This has arisen because a number of URLs are suffixed by dates and have been spidered. In principle I do not want any URL with a suffixed date to be spidered. E.g.: www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13 http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/20_07_13/27_07_13 Only this URL should be spidered: http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm I have over 10,000 of these duplicates. Firstly I wish to remove them from Google in bulk (not one by one), and secondly I wish to amend my robots.txt file so these URLs are not spidered. I do not know the format for either. Can anyone help, please?
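A hedged sketch, since the date-suffixed duplicates follow a clear pattern: a robots.txt wildcard such as `Disallow: /*.htm/` (Google supports `*` and `$` in robots.txt, though not all crawlers do) would match any path that continues past `.htm`, while leaving the canonical `.htm` URLs crawlable. The same pattern can be sanity-checked in Python against the example URLs before committing it:

```python
import re

# Duplicates look like ...page.htm/DD_MM_YY/DD_MM_YY (pattern taken from the examples)
DATED = re.compile(r"\.htm/\d{2}_\d{2}_\d{2}/\d{2}_\d{2}_\d{2}$")

def should_block(url: str) -> bool:
    """True if the URL is a date-suffixed duplicate we'd disallow."""
    return bool(DATED.search(url))

print(should_block("http://www.carbisbayholidays.co.uk/carbis-bay/"
                   "houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13"))  # → True
print(should_block("http://www.carbisbayholidays.co.uk/carbis-bay/"
                   "houses-in-carbis-bay/seaspray.htm"))                    # → False
```

Running a crawl export through a check like this before editing robots.txt is a cheap way to confirm the pattern only catches the 10,000 duplicates and not the pages you want kept.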
On-Page Optimization | carbisbayhols
-
Different pages per OS vs. one page with dynamic content (user agent): what's the right approach?
We are creating a new homepage, and the product is at different stages of development for different OSes. The value prop/messaging/some target keywords will be different for the various OSes for that reason. The question is, for SEO reasons, is it better to separate them into different pages, or to use one page and swap different content in based on the user agent?
On-Page Optimization | JoeLin
-
Long or Short URLs: Who's Coming to Dinner?
This has been discussed on the forums in some regard. My situation: Example 1, long keyword URL: www.abctown.com/keyword-for-life-helping-keywords-everywhere-rank-better. Example 2, short keyword URL: www.abctown.com/keyword. In both examples I want to improve rankings for the "keyword" phrase. My current URL is example 1, and I've landed a page-one ranking in Google (7) with that URL. In an attempt to improve rankings further (top 5), I was toying with the idea of going simpler with all my URLs, in favour of the example 2 model. Might this method help or hurt my current rankings? From recent articles I've read, it seems the simpler, more human approach is preferred. Any thoughts would be appreciated. Cheers,
On-Page Optimization | creativedepartment
-
Value of PDFs in SEO
I have a client who has a lot of information in PDF form. They think they should move some of it over into HTML pages so it indexes better. Is there a benefit to converting these PDFs into HTML pages? It seems to me that HTML pages would be good, IF they are relevant pages that could be used online.
On-Page Optimization | lvstrickland