Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Can external links in a menu attract a penalty?
-
We have some instances of external links (i.e. links pointing to another domain) in site menus.
Although there are legitimate reasons for this (e.g. linking to a news archive kept on a separate domain), I understand it can be considered bad from a usability perspective.
This raises the question: is it also bad for SEO? Since the recent Panda changes, we've seen certain issues that were previously "only" usability problems attract SEO penalties, but I can't find any references covering this particular case.
Does anyone have thoughts or experience?
-
Yes, this can cause a problem, as the links may be classed as unnatural. The best way to get around it is to make sure those links carry the rel="nofollow" attribute.
The penalty you should be worried about is Penguin, which hits pages with over-optimised anchor text links. A menu link to an external site could generate hundreds, if not thousands, of backlinks to that site, so it's the site you are linking to in the menu that could have problems.
Many cases have been posted here of people being hit by Penguin because they got featured sidebar links on other sites, resulting in thousands of backlinks appearing in Open Site Explorer, all with the same optimised anchor text. Site-wide menu links produce a very similar footprint.
But as I said, just ensure that the mass menu links to the other site are nofollowed and you should be OK.
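If you want to audit this before making changes, a quick script can flag every external menu link that is missing rel="nofollow". The sketch below uses only the Python standard library; the menu HTML and domain names are made-up examples, not taken from any real site.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkChecker(HTMLParser):
    """Collects external <a> links that lack rel="nofollow"."""

    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.flagged = []  # external, followed links found so far

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        host = urlparse(attrs.get("href", "")).netloc
        # Relative links have no host; same-domain links are internal.
        if host and host != self.site_domain:
            rel_values = (attrs.get("rel") or "").split()
            if "nofollow" not in rel_values:
                self.flagged.append(attrs["href"])

# Hypothetical menu markup for illustration:
menu_html = """
<ul class="menu">
  <li><a href="/about">About</a></li>
  <li><a href="https://news.example.org/archive" rel="nofollow">News archive</a></li>
  <li><a href="https://partner.example.net/">Partner site</a></li>
</ul>
"""

checker = ExternalLinkChecker("www.example.com")
checker.feed(menu_html)
print(checker.flagged)  # only the followed external link is reported
```

Running this against each page template would tell you which menu links still pass link equity off-site; only those need the nofollow added.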