Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Breadcrumbs and internal links
-
Hello,
I used to let users move up my site structure with links in the content. I have now installed breadcrumbs. Is it still useful to keep the links in the content, or is there no need to duplicate those links? And are the breadcrumb links enough?
Thank you,
-
Thanks for your comment Paul
-
Glad to help
-
Thank you both for your answers. They are very helpful and all is clear. I know now that it is best to have both.
-
I think Roman's response is thorough and well reasoned. I'm a content strategist (not a designer or developer), so I like the way his answer puts the user front and center. Bottom line: do in-text links and breadcrumb links both help users? Yes, depending on where you are on the page and how deep the page is. My instinct on breadcrumbs is that they're especially helpful once you get a couple of pages deep in a site and a user might start to get a bit disoriented. My in-text links are more often driven by the content itself and by what will provide added value to the user (or potentially SEO value to another page on the site). Hope that's helpful.
-
As I see it, you have a question about duplicated links, and the answer depends on your needs, so let me explain my point.
Why Redundant Links on the Same Page Are a Good Idea. There are many reasons why you might want to show duplicate links on the same page. Here are some common motivations:
- Provide safety nets: If people don’t notice the link the first time, maybe they will notice the second occurrence as they scroll the page. The redundancy may minimize individual differences: one person might notice the link at the top, while another person might notice it at the bottom. Showing links in multiple places is thus hypothesized to capture a broader audience.
- Deal with long pages: Having to scroll all the way up to the top of an overly long page is time-consuming. Offering users alternative ways to access links will help alleviate the pain.
- Create visual balance: Empty space is common on top-level (wayfinding) pages, where content might be sparse or nonexistent. Filling in awkward white space with extra copies of links will make the page look more balanced.
- Follow the evidence: Analytics show that traffic to desired destination pages increases when links to them are duplicated.
Why Redundant Links Are a Bad Idea (Most of the Time)
Redundancy can be good or bad depending on when it’s applied. Each of the explanations above may sound reasonable. However, relying on redundancy too frequently or without careful consideration can turn your site into a navigation quagmire. What’s the big deal about having a few duplicate links on the page?
- Each additional link increases the interaction cost required to process the page because it raises the number of choices people must process. The fewer the choices, the faster the processing time.
- Each additional link depletes users’ attention because it competes with all others. Users only have so much attention to give and often don’t see stuff that’s right on the screen. So when you grab more attention for one link, you lose it for the others: there’s substantial opportunity cost to extra linking.
- Each additional link places an extra load on users’ working memory because it forces people to remember whether they have seen the link before or whether it is a new one. Are the two links the same or different? Users often wonder if there is a difference that they missed. In usability studies, we often observe participants pause and ponder which one they should click. The more courageous users click on both links, only to be disappointed when they discover that the links lead to the same page. Repetitive links often set users up to fail.
- Extra links waste users’ time whenever users don’t realize that two links lead to the same place: if they click both links, then the second click is wasteful at best. At worst, users also don’t recognize that they’ve already visited the destination page, causing them to waste even more time on a second visit to that page. (Remember that to you, the distinctions between the different pages on your site are obvious. Not so for users: we often see people visit the same page a second time without realizing that they’ve already been there.)
**CONCLUSION**
Sometimes navigation is improved when you have more room to explain it. If this is the case, duplicating important navigational choices in the content area can give you more flexibility to supplement the links with more detailed descriptions to help users better understand the choices.
Providing redundancy on webpages can sometimes help people find their way. However, redundancy increases the interaction cost. Duplicating links is one of the four major dangerous navigation techniques that cause cognitive strain. Even if you increase traffic to a specific page by adding redundant links to it, you may lose return traffic to the site from users who are confused and can’t find what they want.
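As a practical aside: if you do keep the breadcrumb trail, search engines can recognize it as schema.org BreadcrumbList structured data in addition to the visible links. Below is a minimal sketch in Python that builds that JSON-LD for a page a few levels deep; the page names and example.com URLs are purely illustrative, not from the original question.

```python
import json

def breadcrumb_jsonld(trail):
    """Build a schema.org BreadcrumbList JSON-LD dict from (name, url) pairs,
    ordered from the homepage down to the current page."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,  # positions are 1-based per the schema.org spec
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

# Illustrative trail for a page three levels deep (placeholder domain).
trail = [
    ("Home", "https://example.com/"),
    ("Books", "https://example.com/books/"),
    ("Science Fiction", "https://example.com/books/science-fiction/"),
]

# The resulting JSON would go in a <script type="application/ld+json"> block.
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

This only marks up the breadcrumb for crawlers; the visible breadcrumb links and any in-content links are still rendered separately in the page's HTML.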