Is there an optimal ratio of external links to a page vs. internal links originating at that page?
-
I understand that multiple links from a site dilute link juice. I also understand that external links to a specific page with relevant anchor text help ranking. I wonder if there is an ideal ratio of these two items.
-
I understand that multiple links from a site dilute link juice.
Some people think that Google still operates this way. Nobody knows for sure.
However, multiple links to excellent external targets can deliver benefits that far outweigh any link juice loss.
On important pages I don't hesitate to place multiple links to authoritative, relevant and trusted targets.
I wonder if there is an ideal ratio of these two items.
No.
One should maximize both within reason and do what will impress the visitor.
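For illustration only, here is a minimal sketch of where the "dilution" idea comes from: the original PageRank paper splits the score a page can pass evenly across its outbound links. The damping factor and score value below are made-up textbook numbers, not a claim about how Google weights links today.

```python
# Toy model of the original PageRank split: each outbound link passes an equal
# share of (damping * page_score). Values are illustrative only.

def share_per_link(page_score: float, outbound_links: int, damping: float = 0.85) -> float:
    """Score passed by each individual link under the original PageRank model."""
    return damping * page_score / outbound_links

page_score = 4.0  # hypothetical score for the linking page

for n_links in (1, 5, 10, 50):
    print(f"{n_links:>3} outbound links -> {share_per_link(page_score, n_links):.3f} passed per link")
```

The per-link share shrinks as links are added, which is all "dilution" means in that model; it says nothing about the extra value a visitor, or an algorithm, may place on a page that links out to genuinely useful resources.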
Related Questions
-
Reasonable to Ask URL of Link from SEO Providing New Links before Link Activation?
My firm has hired an SEO to create links to our site. We asked the SEO to provide a list of domains that they are targeting for potential links. The SEO did not agree to this request on the grounds that the list is their unique intellectual property. Alternatively, I asked the SEO to provide the URL that will be linking to our site before the link is activated. The SEO did not agree to this. However, they did say we could provide comments afterwards so they could tweak their efforts when the next 4-5 links are obtained next month. The SEO is adamant that the links will not be spam. For whatever it is worth, the SEO was highly recommended. I am an end user; the owner and operator of a commercial real estate site, not an SEO or marketing professional. Is this protectiveness over process and data typical of link building providers? I want to be fair with the provider and hope I will be working with them a long time; however, I want to ensure I receive high-quality links. Should I be concerned? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
Optimize Pages for Keywords Prior to Building Links?
Greetings Moz Community: According to a site audit by a reputable SEO firm last November, my commercial real estate web site has a toxic link profile which is very weak (about 58% of links qualified as toxic). The SEO firm suggests that we immediately start pruning the link profile, requesting removal of the toxic links and eventually filing a link disavow file with Google for links that webmasters will not agree to remove. While removing toxic links, the SEO firm proposes to simultaneously solicit very high quality links, to try to obtain 7-12 high quality links per month. My question is the following: is it putting the cart before the horse to work on link building without optimizing pages (with Yoast) for specific keywords? I would think that Google considers how each page is optimized for specific terms; which terms are used within the link structure, as well as terms within the meta tags. My site is partially optimized, but optimization has never been done thoroughly. Should the pages of the site be optimized for the top 25-30 terms before link building begins? Or can that be done at a later stage? Note that my link profile is pretty atrocious. My site at the moment is receiving about 1,000 unique visitors a week from organic search. However, 70% of the traffic is from terms that are not relevant. The firm that did my audit claims that removal of the toxic links while building some new links is imperative and that optimization for keywords can wait somewhat. Any thoughts? Thanks for your assistance. Alan
Intermediate & Advanced SEO | Kingalan1
Transferring link juice from a canonical URL to an SEO landing page.
I have URLs that I use for SEM ads in Google. The content on those pages is duplicate (affiliate). Those pages also have dynamic parameters, which caused lots of duplicate content pages to be indexed. I have put a canonical tag on the parameter pages to consolidate everything to the canonical URL. Both the canonical URL and the parameter URLs have links pointing to them. So as it stands now, my canonical URL is still indexed, but the parameter URLs are not. The canonical page is still made up of affiliate (duplicate) content though. I want to create an equivalent SEO landing page with unique content. But I'd like to do two things: 1) remove the canonical URL from the index, due to duplicate affiliate content, and 2) transfer the link juice from the canonical URL over to the SEO URL. I'm thinking of adding a meta noindex, follow tag to the canonical URL and internally linking to the new SEO landing page. Does this strategy work? I don't want to lose the link juice on the canonical URL by adding a meta noindex tag to it. Thanks in advance for your advice. Rob
Intermediate & Advanced SEO | partnerf
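One way to sanity-check a setup like the one described above is to dump each URL's canonical and meta robots directives before and after any tag changes. This is only a rough sketch: it assumes the requests and beautifulsoup4 packages are installed, and the URL shown is a placeholder.

```python
# Rough sketch: report the rel=canonical target and meta robots directive a URL
# actually serves. Assumes requests and beautifulsoup4 are installed; the URL
# below is a placeholder, not a real site.
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> dict:
    """Fetch a page and return its canonical target and meta robots value."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})

    return {
        "url": url,
        "canonical": canonical.get("href") if canonical else None,
        "meta_robots": robots.get("content") if robots else None,
    }

if __name__ == "__main__":
    # e.g. a parameter URL that should declare the clean URL as canonical
    print(audit_page("https://www.example.com/landing?utm_source=sem"))
```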
On Page vs Off Page - Which Has a Greater Effect on Rankings?
Hi Mozzers, My site will be migrating to a new domain soon, and I am not sure how to spend my time. Should I be optimizing our content for keywords, improving internal linking, and writing new content, or should I be doing link building for our current domain (or the new one)? Is there a certain ratio that determines rankings, such as 70:30 in favor of link building, which could help me prioritize these to-dos? Thanks for any help you can offer!
Intermediate & Advanced SEO | Travis-W
Does having multiple links to the same page influence the link juice this page is able to pass?
Say you have a page and it has 4 outgoing links to the same internal page. In the original PageRank algorithm, if these links were links to a page outside your own domain, this would mean that the link juice this page is able to pass would be divided by 4. The thing is, I'm not sure if this is also the case when the outgoing link is linking to a page on your own domain. I would say that outgoing links (whatever the destination) will use some of your link juice, so it would be better to have 1 outgoing link instead of 4 to the same destination, so the destination will profit more from that link. What are your thoughts?
Intermediate & Advanced SEO | TjeerdvZ
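Purely as an illustration of the arithmetic in the question above, here is a toy calculation under the original PageRank model, where a page's passable score is divided evenly across all outgoing links and duplicate links to the same target simply accumulate their shares. Whether any modern search engine still behaves this way is unknown; the paths and the score of 1.0 are made up.

```python
# Toy calculation: split a page's passable score evenly per outgoing link and
# sum the shares per target. Paths and the score of 1.0 are illustrative only.
from collections import Counter

def juice_per_target(outgoing_links, passable_score=1.0):
    """Return the share of passable score each distinct target would receive."""
    per_link = passable_score / len(outgoing_links)
    totals = Counter()
    for target in outgoing_links:
        totals[target] += per_link
    return dict(totals)

# One link vs. four links to the same internal page, alongside three other links.
print(juice_per_target(["/internal-page", "/a", "/b", "/c"]))
print(juice_per_target(["/internal-page"] * 4 + ["/a", "/b", "/c"]))
```

In this naive model the duplicated target ends up with a larger slice, not a smaller one, because its four shares add up; the per-link amounts shrink only for the other targets on the page.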
How does one know where to insert the right strips of coding on the right pages for Canonical Links?
On my website, I am the only SEO optimizer wizard person. I have to teach myself everything and I get overwhelmed a lot. I recently started using SEOmoz, and my report stated we had duplicate page titles, that this was bad, and that it should be fixed quickly. So I did my research and found that I needed to use canonical links to reference the one page that should be indexed. However, my problem lies in exactly how to add this coding to my site. I greatly appreciate any help, or at least a look at this question.
Intermediate & Advanced SEO | FrontlineMobility
Increasing Internal Links But Avoiding a Link Farm
I'm looking to create a page about Widgets and all of the more specific names for Widgets we sell: ABC Brand Widgets, XYZ Brand Widgets, Big Widgets, Small Widgets, Green Widgets, Blue Widgets, etc. I'd like my Widget page to give a brief explanation about each kind of Widget with a link deeper into my site that gives more detail and allows you to purchase. The problem is I have a lot of Widgets and this could get messy: ABC Green Widgets, Small XYZ Widgets, and many other combinations. I can see my Widget page teetering on being a link farm if I start throwing in all of these combos. So where should I stop? How much should I do? I've read that more than 100 links on a page can be considered a link farm; is that a hard-line number or a general guideline?
Intermediate & Advanced SEO | rball1
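If it helps to put a number on the concern above, counting the anchors a rendered page actually exposes is easy to script. This is just a sketch: it assumes the requests and beautifulsoup4 packages are installed, the URL is a placeholder, and the 100-link figure is an old rule of thumb rather than a documented threshold.

```python
# Rough sketch: count the <a href> elements in a page's HTML as a proxy for
# "links on the page". Assumes requests and beautifulsoup4 are installed;
# the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def count_links(url: str) -> int:
    """Return the number of anchor tags with an href attribute on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("a", href=True))

if __name__ == "__main__":
    print(count_links("https://www.example.com/widgets"))
```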
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus