Internal Linking: Site-wide vs. Content Links
-
I just watched this video in which Matt Cutts talks about the ancient 100 links per page limit.
I often encounter websites that have massive navigation (elaborate main menu, sidebar, footer, superfooter, etc.) in addition to links placed in the content area.
My question is: do you think Google passes votes (PageRank and anchor text) differently for template links, such as navigation, than for links in the content area? If so, have you done any testing to confirm it?
-
He also said: "We invite and strongly encourage readers to test these themselves."
This is what I am after: personal opinions from people who have either tested this or experienced the effect first-hand.
-
There is a school of thought that it matters whether links appear at the beginning or the end of the body, and whether they sit inside specific tags. But how would you tell a crawler that a specific link comes from the navbar, and that it carries a bigger value than the other content links?
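As far as I know, you can't explicitly label a link for a crawler as "this is navigation, weight it differently", but a crawler can infer it from where the link sits in the markup (inside <nav>, <header> or <footer> versus the main content). Purely as an illustration of that idea, and not a claim about how Google actually does it, here is a minimal BeautifulSoup sketch on a made-up snippet of HTML:

```python
# Illustration only: separating "template" links (nav/header/footer) from
# in-content links by their position in the markup. Not Google's actual method.
from bs4 import BeautifulSoup

html = """
<html><body>
  <nav><a href="/about">About</a> <a href="/contact">Contact</a></nav>
  <main><p>Read our <a href="/guide">in-depth guide</a>.</p></main>
  <footer><a href="/privacy">Privacy</a></footer>
</body></html>
"""

soup = BeautifulSoup(html, "html.parser")
template_links, content_links = [], []

for a in soup.find_all("a", href=True):
    # If any ancestor is a nav/header/footer element, treat it as a template link.
    if a.find_parent(["nav", "header", "footer"]):
        template_links.append(a["href"])
    else:
        content_links.append(a["href"])

print("template:", template_links)  # ['/about', '/contact', '/privacy']
print("content:", content_links)    # ['/guide']
```

Whether Google then weights those two buckets differently is exactly the open question in this thread.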
-
Rand wrote a blog post about this a while ago, on how not all links on a webpage are created equal; you might find it interesting:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
-
Thanks for your input!
It seems like your vote goes towards all links being treated equally regardless of their location or function. Interesting... I have a suspicion that there is, or should be, a difference. Why?
Consider this: Google notices 150 sitewide links that appear on every page. Wouldn't it make sense for Google to treat page-specific links differently from sitewide ones, since that would actually improve its ranking system (e.g. 150 boilerplate links would no longer dilute the importance of a page-specific link given through the content)?
Thoughts?
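To put rough numbers on that dilution idea, here is a toy calculation under a deliberately simplified model: the page's passable value is split across its outgoing links, and the 10% discount for boilerplate links is a made-up figure, not Google's actual formula.

```python
# Toy model of the dilution argument above (NOT a real PageRank computation).
page_value = 1.0
sitewide_links = 150   # template links that appear on every page
content_links = 1      # the one editorially placed, page-specific link

# If every link counts equally, the content link gets a tiny slice:
equal_share = page_value / (sitewide_links + content_links)
print(f"equal treatment: {equal_share:.4f} per link")            # ~0.0066

# If boilerplate links only carried 10% of the weight of a content link:
discount = 0.1
weighted_total = sitewide_links * discount + content_links * 1.0
content_share = page_value / weighted_total
print(f"with a 10% boilerplate discount: {content_share:.4f}")   # 0.0625
```

Under the discounted model the content link is worth roughly ten times as much, which is the kind of difference the question is really asking about.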
-
Many of these massive navigation menus are built in Flash or JavaScript, so Google can't see them as links; it counts them as a single link, or just as a reference to a JavaScript file, and nothing else. That's how a site can carry massive navigation without Google seeing it. Alternatively, you can set nofollow on the links you don't want counted, and Google will analyze your page differently. As for the question itself: no, links are links everywhere; the only difference is the tag the link is wrapped in. You can test this with spider-view tools: try one on two pages and you'll see that there is no difference.
Best,
Ion
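For anyone who wants to run the kind of spider-view check Ion describes, here is a rough sketch. It assumes the requests and beautifulsoup4 packages and a URL of your own; it fetches the raw HTML and lists the anchors a plain crawler would see, so links injected by Flash or JavaScript after load won't appear.

```python
# Rough "spider view": list the links (and rel attributes) visible in the raw HTML.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # substitute a page of your own
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    rel = " ".join(a.get("rel", [])) or "-"   # shows "nofollow" where it is set
    print(f"{a['href']:<60} rel={rel}")
```

Running it on two pages of the same site, as Ion suggests, will at least show you which links a crawler can reach and which carry nofollow.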
Related Questions
-
Trying to find all internal links to a specific page (without index)
Hi guys -- Still waiting on Moz to index a page of mine. We launched a new site over two months ago. In the meantime, I really just need a list of internal links to a specific page because I want to change its URL. Does anybody know how to find that list (of internal links to 1 of my pages) without the Moz index? I appreciate the help!
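One way to get that list without any third-party index is to crawl the site yourself and record which pages link to the target URL. A bare-bones sketch with placeholder URLs (no politeness delay, robots.txt handling or content-type checks, so treat it as a starting point only):

```python
# Minimal same-site crawler: collect every page that links to TARGET.
from urllib.parse import urljoin, urlparse
from collections import deque
import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"            # your homepage
TARGET = "https://www.example.com/old-page/"  # the page whose URL you want to change

seen, queue, linking_pages = {START}, deque([START]), set()

while queue:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if link.rstrip("/") == TARGET.rstrip("/"):
            linking_pages.add(page)           # this page links to the target
        if urlparse(link).netloc == urlparse(START).netloc and link not in seen:
            seen.add(link)
            queue.append(link)

print("\n".join(sorted(linking_pages)))
```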
Technical SEO | marchexmarketingmcc1
-
Migrating to new subdomain with new site and new content.
Our marketing department has decided that a new site with new content is needed to launch new products and support our existing ones. We cannot use the same subdomain (www = old subdomain and ww1 = new subdomain), as there is a technical clash between the Windows server currently used and the LAMP stack required to run the new WordPress-based CMS and site. We also have an aging piece of SaaS software on the www subdomain, which makes moving it to its own subdomain far too risky.
301s have been floated as a way of managing the transition. I'm not too keen on that idea due to the double effect of a new subdomain and new content, and the SEO impact it might have. I've suggested uploading the new site to the new subdomain while leaving the old site in place, then gradually migrating sections over before turning parts of the old site off and using a 301 at that point to finalise the move. The old site would inform users that there is a new version and would convert them to the new site (along with a cookie to auto-redirect them in future), while still leaving the old content in place for existing search traffic, bookmarks and visitors via static URLs. Before turning off sections on the old site we would create rel canonicals pointing to the new pages based on a mapped set of URLs (this in itself concerns me, as the rel canonical would essentially be linking to different content).
I would be grateful for any advice: is this strategy flawed, or might another strategy be more suitable?
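For the "mapped set of URLs" part of that plan, the 301 stage can be driven by a simple old-path-to-new-URL map. A hypothetical sketch (Flask and the example.com/ww1 names are placeholders for illustration only; the same map could just as easily feed .htaccess or nginx rules):

```python
# Hypothetical: once a section of the old www site is switched off, serve 301s
# from the old URLs to their equivalents on the new ww1 subdomain.
from flask import Flask, abort, redirect

app = Flask(__name__)

URL_MAP = {  # placeholder mapping: old path -> new URL
    "/products/widget-a/": "https://ww1.example.com/widgets/widget-a/",
    "/support/faq/": "https://ww1.example.com/help/faq/",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    new_url = URL_MAP.get("/" + old_path.rstrip("/") + "/")
    if new_url:
        return redirect(new_url, code=301)  # permanent redirect for retired sections
    abort(404)

if __name__ == "__main__":
    app.run()
```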
Technical SEO | Rezza0
-
I have 2 e-commerce sites - can I cross-link?
Good morning everyone. I have 2 e-commerce websites that are similar and sell the same products. The content (text/descriptions/titles) is different, so the content is not duplicate. SITE A has a ton of blog posts with highly relevant information, and we frequently update the blog with posts about the types of products we carry and how they can help people in their daily lives. SITE B has no blog posts, but the content on the blog from SITE A is extremely relevant and helpful to anyone using SITE B. My question is: do you think it is frowned upon if I were to add links on SITE B that point to specific posts on SITE A? For example, if you are browsing a category page on SITE B, I was thinking of adding links at the bottom that would say "For More Information, Please Check Out These Posts on our Blog":
www.sitea.com/blog/relevantinfo1
www.sitea.com/blog/relevantinfo2
www.sitea.com/blog/relevantinfo3
I think this would seriously help our browsers and potential customers get all of the information they need, but what do you think Google would think about this cross-linking, and does it violate their guidelines? Thanks for any opinions and advice.
Technical SEO | Prime850
-
Duplicate Content - Mobile Site
We think that a mobile version of our site is causing a duplicate content issue; what's the best way to stop the mobile version from being indexed? Basically, the site forwards mobile users to "/mobile", which is just a mobile-optimised version of the original site. Is it best to block the /mobile folder from being crawled?
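If you do go the robots.txt route, you can sanity-check the Disallow rule with Python's built-in parser before relying on it. A quick sketch with a hypothetical domain (this only verifies how the rule behaves, not whether blocking crawling is the best fix for the duplicate content):

```python
# Verify that a robots.txt Disallow rule blocks /mobile but nothing else.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /mobile/",
])

print(rp.can_fetch("*", "https://www.example.com/mobile/page.html"))  # False - blocked
print(rp.can_fetch("*", "https://www.example.com/page.html"))         # True  - crawlable
```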
Technical SEO | nsmith7870
-
Does turning website content into PDFs for document sharing sites cause duplicate content?
Website content is 9 tutorials published to unique URLs, with a contents page linking to each lesson. If I make a PDF version for distribution on document-sharing websites, will it create a duplicate content issue? The objective is to get a half-decent link and traffic to supplementary opt-in downloads.
Technical SEO | designquotes0
-
Does Google pass the link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to No?
The page in question receives a lot of quality traffic but is only relevant to a small percent of my users. I want to keep the link juice received from this page but I do not want it to appear in the SERPs.
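The question doesn't settle on a mechanism, but one commonly discussed option for "keep passing link equity while staying out of the SERPs" is a meta robots "noindex, follow" tag on the page itself, rather than blocking the parameter from being crawled (a URL that Google never crawls can't pass anything through its links). A small sketch that checks a page for that tag, with a placeholder URL:

```python
# Check whether a page carries a meta robots "noindex, follow" directive.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/popular-but-niche-page/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
directives = (robots.get("content", "") if robots else "").lower()

print("kept out of the SERPs:", "noindex" in directives)
print("links still followable:", "nofollow" not in directives)
```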
Technical SEO | surveygizmo0
-
How much effect does the number of outbound links have on link juice?
I am interested in your thoughts on the effect of the number of outbound links (OBLs) on the link juice passed, i.e. if a page linking to you has a high number of OBLs, how do you compute their relative negative effect on the link juice? Suppose there are three sites on which you have been offered the opportunity of a link:
Site A: PA 30, DA 50, OBLs on page: 10
Site B: PA 40, DA 50, OBLs on page: 15
Site C: PA 50, DA 50, OBLs on page: 20
How would you appraise each of these prospective page links (ignoring anchor text, relevancy, etc., which will be constant)? Is there a rule of thumb for comparing the link juice passed from a site relative to its PA and the number of OBLs? Is it as simple as a page with 10 OBLs passing 10x the juice of a page with 100 OBLs?
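There is no published formula, but under the naive "authority divided by outbound links" model the last question hints at, you can at least rank the three prospects. A toy calculation: the PA/OBL numbers come from the question, while the division model (and ignoring damping, relevance and the fact that PA is not a linear scale) is a deliberate simplification.

```python
# Toy comparison: "juice per link" ~ page authority / number of outbound links.
prospects = {
    "Site A": {"pa": 30, "obls": 10},
    "Site B": {"pa": 40, "obls": 15},
    "Site C": {"pa": 50, "obls": 20},
}

for name, p in prospects.items():
    per_link = p["pa"] / p["obls"]   # crude heuristic, not a real PageRank value
    print(f"{name}: {per_link:.2f}")

# Site A: 3.00, Site B: 2.67, Site C: 2.50
```

Under this naive model a page with 10 OBLs would indeed pass ten times what the same page would with 100 OBLs, but real-world factors (damping, internal versus external links, PA's logarithmic-style scale) mean it is best treated as a rough tie-breaker rather than a rule.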
Technical SEO | seanmccauley0
-
Canonical Link for Duplicate Content
A client of ours uses some unique keyword tracking for their landing pages, where they append certain metrics in a query string and pull that information out dynamically to learn more about their traffic (kind of like Google's UTM tracking). Nonetheless, these query strings are now being indexed as separate pages in Google and Yahoo and are being flagged as duplicate content/title tags by the SEOmoz tools. For example:
Base Page: www.domain.com/page.html
Tracking: www.domain.com/page.html?keyword=keyword#source=source
Now both of these are being indexed even though it is only one page. So I suggested placing a canonical link tag in the header pointing back to the base page to start discrediting the tracking URLs. But this means that the base pages will be pointing to themselves as well; would that be an issue? Is there a better way to solve this issue without removing the query tracking altogether? Thanks - Kyle Chandler
Technical SEO | kchandler
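For what it's worth, the tag being discussed is a single <link> element in the <head>, and a self-referencing canonical on the base page is generally considered harmless. A small sketch that derives the canonical URL by stripping the tracking query string and fragment — the URLs are the placeholders from the question, and Python is used purely for illustration:

```python
# Derive the canonical URL for a tracked landing-page URL and emit the tag.
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url: str) -> str:
    parts = urlsplit(url)
    base = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))  # drop query + fragment
    return f'<link rel="canonical" href="{base}" />'

print(canonical_tag("http://www.domain.com/page.html?keyword=keyword#source=source"))
print(canonical_tag("http://www.domain.com/page.html"))  # self-referencing, still fine
# Both print: <link rel="canonical" href="http://www.domain.com/page.html" />
```

The same tag would go on the base page and on every tracked variant, which matches the setup the question describes.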