Breadcrumbs and internal links
-
Hello,
I used to support my site structure with links in the content. Now that I have installed breadcrumbs, is it still useful to keep the in-content links, or is there no need to duplicate them? Are the breadcrumb links enough?
Thank you,
-
Thanks for your comment, Paul.
-
Glad to help
-
Thank you both for your answers. They are very helpful and everything is clear. I know now that it is best to have both.
-
I think Roman's response is thorough and well reasoned. I'm a content strategist (not a designer or developer), so I like the way his answer puts the user front and center. Bottom line: do in-text links and breadcrumb links both help users? Yes, depending on where you are on the page and how deep the page is. My instinct on breadcrumbs is that they're especially helpful once you get a couple of pages deep into a site, where a user might start to get a bit disoriented. My in-text links are more often driven by the content itself and by what will provide added value to the user (or potentially SEO value to another page on the site). Hope that's helpful.
-
I see you have a question about duplicated links, and the answer depends on your needs, so let me explain my point of view.
Why Redundant Links on the Same Page Are a Good Idea
There are many reasons why you might want to show duplicate links on the same page. Here are some common motivations:
- Provide safety nets: If people don’t notice the link the first time, maybe they will notice the second occurrence as they scroll the page. Redundancy can also accommodate individual differences: one person might notice the link at the top, while another might notice it at the bottom. Showing links in multiple places is thus hypothesized to capture a broader audience.
- Deal with long pages: Having to scroll all the way up to the top of an overly long page is time-consuming. Offering users alternative ways to access links will help alleviate the pain.
- Create visual balance: Empty space is common on top-level (wayfinding) pages, where content might be sparse or nonexistent. Filling awkward white space with extra copies of links can make the page look more balanced.
- Follow the evidence: Analytics show that traffic to desired destination pages increases when links to them are duplicated.
Why Redundant Links Are a Bad Idea (Most of the Time)
Redundancy can be good or bad depending on how it’s applied. Each of the explanations above may sound reasonable. However, relying on redundancy too frequently or without careful consideration can turn your site into a navigation quagmire. What’s the big deal about having a few duplicate links on the page?
- Each additional link increases the interaction cost of processing the page because it raises the number of choices people must evaluate. The fewer the choices, the faster the processing time.
- Each additional link depletes users’ attention because it competes with all the others. Users only have so much attention to give and often don’t see things that are right on the screen. So when you grab more attention for one link, you lose it for the others: there’s a substantial opportunity cost to extra linking.
- Each additional link places an extra load on users’ working memory because people have to remember whether they have already seen the link or whether it is a new one. Are the two links the same or different? Users often wonder whether there is a difference they missed. In usability studies, we often observe participants pause and ponder which one they should click. The more courageous users click on both links, only to be disappointed when they discover that the links lead to the same page. Repetitive links often set users up to fail.
- Extra links waste users’ time whenever users don’t realize that two links lead to the same place: if they click both links, the second click is wasteful at best. At worst, users also don’t recognize that they’ve already visited the destination page, causing them to waste even more time on a second visit to that page. (Remember that to you, the distinctions between the different pages on your site are obvious. Not so for users: we often see people visit the same page a second time without realizing that they’ve already been there.) To check a page for this problem, see the sketch after this list.
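One practical way to act on all of this is to audit a page for redundant links before deciding which ones to keep. Below is a minimal sketch of such an audit (my own illustration, not something from this thread), assuming Python with the third-party requests and beautifulsoup4 packages; the URL is a placeholder. It counts how many times each destination URL appears as a link on the page:

```python
from collections import Counter
from urllib.parse import urljoin

import requests                # third-party: pip install requests
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4


def duplicate_links(page_url: str) -> dict:
    """Return destination URLs that appear as a link more than once on the page."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Resolve relative hrefs so "/about" and "https://example.com/about"
    # are counted as the same target.
    targets = Counter(
        urljoin(page_url, a["href"]) for a in soup.find_all("a", href=True)
    )
    return {url: count for url, count in targets.items() if count > 1}


if __name__ == "__main__":
    # example.com is a placeholder; point this at a page you want to audit.
    for url, count in duplicate_links("https://example.com/").items():
        print(f"{count}x {url}")
```

Links that resolve to the same URL are grouped, so a breadcrumb link and an in-content link to the same page show up as a pair worth reviewing rather than removing blindly.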
**Conclusion**
Sometimes navigation is improved when you have more room to explain it. If this is the case, duplicating important navigational choices in the content area can give you more flexibility to supplement the links with more detailed descriptions to help users better understand the choices.
Providing redundancy on webpages can sometimes help people find their way. However, redundancy increases the interaction cost. Duplicating links is one of the four major dangerous navigation techniques that cause cognitive strain. Even if you increase traffic to a specific page by adding redundant links to it, you may lose return traffic from users who are confused and can’t find what they want.
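Coming back to the original question: if you do keep both breadcrumbs and in-content links, it helps to mark the breadcrumb trail up so search engines recognize it as navigation rather than as just more duplicate links. The snippet below is a minimal sketch in the same vein as the audit above, not a prescribed implementation: it builds the schema.org BreadcrumbList JSON-LD commonly embedded alongside the visible breadcrumb, and the trail names and URLs are invented for illustration.

```python
import json


def breadcrumb_jsonld(trail):
    """trail: ordered (name, url) pairs, top level first."""
    return json.dumps(
        {
            "@context": "https://schema.org",
            "@type": "BreadcrumbList",
            "itemListElement": [
                # schema.org positions are 1-based, from the top of the trail down.
                {"@type": "ListItem", "position": i, "name": name, "item": url}
                for i, (name, url) in enumerate(trail, start=1)
            ],
        },
        indent=2,
    )


# Invented example trail; embed the output in a <script type="application/ld+json"> tag.
print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Guides", "https://example.com/guides/"),
    ("Internal Linking", "https://example.com/guides/internal-linking/"),
]))
```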