HTML5 Nav Tag Issue - Be Aware
-
In checking my internal links with GWT, it is apparent that links within the HTML5 nav tag are not counted by Google as "internal links".
This could have major repercussions for how you design your internal link structure for SEO purposes.
I was surprised to see this result, as I have never seen it discussed.
Anyone else notice this, or have any alternative views?
-
Two weeks is a pretty short time for a new site to get accurate reports from GWT. The backlinks I found weren't valuable - none with a page authority over 1.
I would secure at least one high-quality link and wait a few more weeks.
-
Appears I broke the site... sorry
-
"Could we see the site? How long ago did you post the nav element?"
The nav bar at the top of the page has been there since the site went live about two weeks ago. GWT shows only 7 internal links to my home page, but there are 15 pages published.
-
Here is how one could test this to be sure:
- Create a site on a throwaway domain that includes:
- home page
- sub page (containing unique text in title and body)
- orphaned sub page
- Place the nav tag on all pages with links to only the first two pages.
- Add some dummy content but don't create any other links.
- Link to the orphaned page from a decently trusted and ranked page on another site.
- Wait 2-4 weeks.
- Search for the unique string and write a YouMOZ post about your findings.
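A minimal sketch of the shared markup for the test pages (the domain, file names, and labels are placeholders, not from the thread): the home page and sub page both carry the same nav block, while the orphaned page receives only the external link.

```html
<!-- Shared header for index.html and subpage.html on the throwaway domain.
     orphan.html is deliberately NOT linked anywhere on this site; it gets
     only the inbound link from the trusted page on the other domain. -->
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/subpage.html">Sub Page</a></li>
  </ul>
</nav>
```

If the unique string on the sub page stops appearing in search while the orphaned page (linked only externally) does appear, that would support the idea that nav-contained links are being discounted.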
-
While I have found that the nav link does count, you could always use a logo link to accomplish this.
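For illustration, a sketch of the logo-link approach (file names and labels are placeholders): the home-page link lives on the logo, outside any nav container, so it is an ordinary body link regardless of how the nav element is treated.

```html
<!-- Site-wide header: the logo itself links to the home page and sits
     OUTSIDE the <nav> container, so the home link is a plain anchor. -->
<header>
  <a href="/"><img src="/logo.png" alt="Example Site"></a>
  <nav>
    <ul>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>
  </nav>
</header>
```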
-
To be sure I understand: you have a site-wide header <nav> section, but you are not seeing the internal links from all the pages in the GWT internal links report?
(Incidentally, my experience has shown these links do count.)
Could we see the site?
How long ago did you post the nav element?
-
To be clear, I believe it is good SEO practice to ensure that every page of a website contains a link to the Home Page (and other key landing pages as befits the site).
Putting a link to the home page WITHIN a nav tag in HTML5 does not accomplish this goal.
-
"I presume your issue is you have external links inside a <nav> container?"
No - that is not my issue. I have 5 "landing pages" (the Home page and second-tier pages) included in the main nav bar below my site logo on every page.
I had assumed (incorrectly) that those pages would be internally linked from every page of the website - but they are NOT (at least as far as the internal links shown in GWT).
-
This seems reasonable and a good way to ensure the link is allocated correctly.
I presume your issue is you have external links inside a <nav> container?
Follow-up: it appears the specification does suggest the nav element is for internal links - the element is primarily intended for sections that consist of major navigation blocks. External links are generally not considered major navigation, no?
http://www.whatwg.org/specs/web-apps/current-work/multipage/sections.html#the-nav-element