HTML5: What changes in tag optimization?
-
Can anyone shed some light on on-page optimization for HTML5? Is Google already taking the new section tag into consideration?
How about headings? I read somewhere that Google can now digest multiple H1 headings. Is that true, and is it recommended?
Thanks a lot
-
Hi Claudio
The idea is that if you have multiple sections, each covering a completely different topic, then you can use multiple H1s,
but if the topics fit under a single heading I would use an H1 once for the overriding heading, and then an H2 for each section.
Having completely different topics on the one page makes it hard to optimize and rank.
Also, Bing so far doesn't allow it: http://thatsit.com.au/seo/reports/violation/the-page-contains-multiple-h1-tags
What would be good is if search engines would see each section as a different page if it has its own H1, and maybe that's the future, but at the moment I would try to use only one.
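A minimal sketch of the single-H1 structure described above (page title and section names are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Overriding Page Topic</title>
</head>
<body>
  <!-- One H1 for the overriding topic of the whole page -->
  <h1>Overriding Page Topic</h1>

  <!-- Each HTML5 <section> gets its own H2, keeping a single H1 per page -->
  <section>
    <h2>First Subtopic</h2>
    <p>Content for the first subtopic.</p>
  </section>

  <section>
    <h2>Second Subtopic</h2>
    <p>Content for the second subtopic.</p>
  </section>
</body>
</html>
```

If the subtopics really were unrelated enough to each deserve an H1, that would be a sign they belong on separate pages anyway.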
Related Questions
-
Less relevant/not optimized competitor sites ranking higher in SERPs?
Has anyone else noticed their rank positions falling to competitor sites that aren't optimized and are less relevant? I've noticed that we've lost some rankings, or have dropped, over the past few weeks, and the competitor pages that have replaced us haven't been optimized, aren't as relevant, and it doesn't look like there have been any updates (looking through archived versions). For example, their main "shoes" gallery is ranking for more specific shoe types, like "sandals", and "sandals" isn't even mentioned in their metadata and they have no on-page copy. Their DA is slightly higher, but our sites have a denser link profile (although, yes, I do need to go through and see what kind of links, exactly, we've gained). Has anyone else seen this happen recently, or have any ideas of why, or of what we could do to get our rank positions back? My main initiatives have been to create and implement fresh on-page copy and metadata and to manage 404s/301 redirects, but I'm thinking this issue is beyond a quick copywriting tweak.
Algorithm Updates | WWWSEO
-
Help Me Change My Client's Mind
My client wants to build a second site to provide targeted links for SEO to his main site. He's interested in buying a TLD with some near-topic authority/links and then building the second site's authority up from there. He is clear that this could get him in trouble for a link scheme, but thinks it can all be hidden from Google. Off the top of my head I was able to recall a few of the pain-in-the-neck things you'd have to do to not get caught, but he seemed unconvinced. I recall you'd have to have:
- Different registrar
- Different contact/WhoIs
- Different site host
- Different G/A, GWT
- Logging into the second site's G/A, GWT with a different IP address not used for the main domain
With the exception of the last one, he didn't seem to think it would be too hard. Aren't there more difficult maneuvers required for hiding this from Google? I want to be able to point out to him how ridiculous this low-integrity effort will be, without losing the client. Thanks! Best... Darcy
Algorithm Updates | 94501
-
Googlebot soon to be executing javascript - Should I change my robots.txt?
This question came to mind as I was pursuing an unrelated issue and reviewing a site's robots.txt file. Currently this is a line item in the file: Disallow: https://* According to a recent post on the Google Webmaster Central Blog: [http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html](http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html "Understanding Web Pages Better") Googlebot is getting much closer to being able to properly render JavaScript. Pardon some ignorance on my part because I am not a developer, but wouldn't this require Googlebot to be able to execute JavaScript? If so, I am concerned that disallowing Googlebot from the https:// versions of our pages could interfere with crawling and indexation, because as soon as an end user clicks the "checkout" button on our view-cart page, everything on the site flips to https://. If this were disallowed, would Googlebot stop crawling at that point and simply leave because all pages were now https://? Or am I just waaayyyy overthinking it?... wouldn't be the first time! Thanks all!
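For context, robots.txt rules are matched against URL paths on the host and scheme the file was fetched from, not against full URLs, so a scheme-based rule like the one quoted above is non-standard and may simply be ignored. A hedged sketch of the usual approach (the paths here are placeholders, assuming a checkout flow under /checkout/ and /cart/):

```
# robots.txt served at https://www.example.com/robots.txt
# Disallow values are path prefixes on this origin; there is no
# standard way to block by scheme, so "Disallow: https://*" is not
# a reliable way to keep crawlers off HTTPS pages.
# To keep crawlers out of a checkout flow, disallow its paths instead:
User-agent: *
Disallow: /checkout/
Disallow: /cart/
```

Note that the HTTP and HTTPS versions of a site each serve their own robots.txt, so the HTTPS origin is governed by whatever file is returned at its own /robots.txt URL.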
Algorithm Updates | danatanseo
-
How to Change Geo Target Location of Country Specific Domain
Hi - I have a country-specific domain (www.updater.in) that I used for writing blog articles. Now when I go to site settings in Webmaster Tools, the geo target by default comes up as India, and there is no option to change the geographic target. Is there any way to let search engines know (despite the .in domain) that the site's geo location is not country-specific, but that it is meant for users from all across the world?
Algorithm Updates | Modi
-
Google decreased use of Meta Description Tag
Over the past month or so, I have noticed that Google is not using the meta description for my pages but is instead pulling text from the actual page to show on the SERP. Is Google placing less emphasis on meta descriptions?
Algorithm Updates | PerriCline
-
What is the most optimal URL structure
A colleague and I are discussing the most optimal URL structure for both search engines and users. Our first disagreement comes in terms of files. So for instance, if I have a small site, www.abc.com, with a services landing page and 3 specific services, which structure is preferred: www.abc.com/services/service1 or www.abc.com/service1? The second issue is in terms of breaking up words in the URL. Should you use hyphens or not? Using the first example, which is preferred: www.abc.com/services/home-remodeling or www.abc.com/services/homeremodeling? I'm also looking for articles/case studies that support either side. Thank you in advance for your help!
Algorithm Updates | TheOceanAgency
-
Recent changes to suggested search algorithm?
Our company recently had a "company name + scam" listing as #2 in suggested search, and yesterday it miraculously disappeared. Has anyone else noticed similar changes in suggested search results? I hope it sticks; I'm just trying to understand exactly what caused that one listing to vanish.
Algorithm Updates | CareerBliss
-
Changing Wordpress Permalink Structure, 301s, and Possibility of Rank Loss?
I have to change the permalink structure in WordPress, as using /%postname%/ in conjunction with a couple thousand pages triggers verbose rewrite rules, which in turn triggers about 5,000 requests per page load. The permalink structure must change, as WordPress development probably won't fix this in the near future. Now, changing the permalink structure worries me quite a bit, as about 25% of my traffic is attributed to my blog posts; the rest (75%) is covered through CMS-like use of pages.
- Blog posts will change permalink/URL structure; pages won't
- The website is very respected in my niche and has quite a few links going to most of my posts and pages, as well as the homepage
- I've noticed in the last year that anything I post starts ranking on page 1 of Google for very competitive keywords in 1-3 days, often with top 3 rankings
- PR4 / decent Alexa / Moz ranks not too shabby either / quality content / decent social media linking (mainly Facebook) / no penalties
I provided the factors not to gloat, but rather to get the best answer from those who have fairly established websites and perhaps had to change their URLs and noticed some or no changes to their rankings. How long of a hit am I going to take, and how much might my posts drop down in SERPs, if I change the permalink structure, properly 301 the old URLs, and implement all changes in one swoop? Info for WordPress users: benefits of changing the permalink structure to /%post_id%/%postname%/ include way faster load times, not having 5,000 requests per page load, avoiding the verbose rewrite rules trigger, and finally being able to modify the site without worrying about crashing it or resorting to a local server for changes across thousands of pages (the database backups, the ritual of changing the settings in the local database, changing the post/page, saving the local database, loading the locally saved db on the live server, and crossing fingers and praying it works -- it just takes so darn long). Ahh... yes, huge time saver.
** This issue occurs when using WP as a CMS with several hundred pages or more and using /%postname%/ or /%category%/%postname%/ or /somethingstatic/%postname%/ -- IF USING a date- or ID-based structure like /%year%/%postname%/ or /%post_id%/%postname%/ you should be fine.
Algorithm Updates | pepsimoz