Can I use nofollow to limit the number of links on a page?
-
My website is an ecommerce site, and we have about 470 links on the homepage!
1. We have a top bar with My Account, login, FAQ, home, contact us, and a link to a content page.
2. Then we have the multistore selection.
3. Then we have the department menu, with several parent + child category links.
4. Then we have a banner.
5. Then we have a list of recently sold and new products.
6. Then we have an image grid with the most important CMS/content pages (like FAQ, about us, etc.).
7. Then we have the footer, with all the info pages: contact us, about us, my account, etc.
Some links are repeated 2 or 3 times. For a user it is easier to find the information, but I'm not sure how search bots (Google) deal with that.
So I was thinking about how I can get down to around 150 followed links. Removing the links from the page is not possible. What about adding nofollow to repeated links and to some child category links, since the spider will crawl the parent and can reach the children on the next page?
Is this a good strategy?
-
That's something you'd want to test with your users. Generally, though, the easier it is to get from Point A to Point B to sale, the better.
-
But which is better: user experience (offering the easiest way to find products in the menu) or fewer links (users will have to go through two page levels to find the products)?
Menu > category 2 > product
Menu > category 1 > category 2 > product
-
I think Wikipedia does alright using nofollow... ;^) Still, Paulo would probably be better served by limiting the number of links on a given page, preferably by design to promote certain aspects and conversion paths. All in all though, this is likely a pretty minimal expected improvement.
-
DO NOT use nofollow within a website. You will bleed PageRank.
If you have 5 links dividing 100% of the PageRank, each gets 20%. If you nofollow one of them, your 5 links pass on only 80% in total. This means you bleed 20% from that page alone.
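The arithmetic above can be sketched as a toy model. This is just an illustration of the classic nofollow-sculpting argument, not Google's actual algorithm:

```python
def equity_passed(total_links: int, nofollowed: int) -> float:
    """Toy model: each outbound link gets an equal share of the
    page's link equity, and a nofollowed link discards its share
    instead of redistributing it to the remaining links."""
    share = 1.0 / total_links
    return (total_links - nofollowed) * share

# 5 followed links: the full 100% flows onward
print(equity_passed(5, 0))  # 1.0
# nofollow one of the 5: only 80% flows onward; 20% evaporates
print(equity_passed(5, 1))  # 0.8
```

Under this model, nofollow never increases what the remaining links receive; it only shrinks the total passed on, which is the point being made here.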
If you use noindex, do keep the links followed. This way the link value will still be reused (in menu and footer pages).
-
Sure, you can nofollow duplicate links, but the benefits are fairly minimal. Gaining domain and page authority and making sure page load speeds are at their fastest possible levels will go a lot farther than nofollow, in terms of priorities. Cheers!
-
Thanks for the comments. I've checked the links you sent me. I have noindexed the pages you mentioned, like cart, checkout, and login.
My question is about having too many links on a page. Can I reduce the count by nofollowing links that appear more than once?
-
Hi Paulo. Typically, allowing links to be followed is fine, but you might consider using noindex instead for pages that would draw search equity away from your target landing pages. Good candidates for noindex would be: my account, login, duplicate or near-duplicate category listings, and so on. If you really want to limit your links, it's best to do so from a design and CRO standpoint, so that people are better able to use and convert within your site.
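For reference, a noindexed-but-still-followed page carries a robots meta tag like the one below. The helper function and path list here are purely illustrative (no specific platform's API is assumed):

```python
# Pages like /login or /cart get "noindex, follow" so they stay
# out of the index while still passing link equity onward.
NOINDEX_PATHS = {"/login", "/cart", "/checkout", "/my-account"}

def robots_meta(path: str) -> str:
    """Return the robots meta tag appropriate for a given URL path."""
    if path in NOINDEX_PATHS:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta("/login"))          # noindex, follow
print(robots_meta("/category/shoes")) # index, follow
```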
Another thing to consider is setting up canonical tags for your root product and category pages. Lindsay Wassell wrote a nice article discussing pros and cons of this technique in addition to others here: https://moz.com/blog/restricting-robot-access-for-improved-seo
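As a sketch of the canonical-tag idea: filtered or paginated variants of a category URL can all declare the clean root page as canonical. The domain and URL pattern below are placeholders, not your actual site structure:

```python
from urllib.parse import urlsplit

def canonical_link(url: str) -> str:
    """Strip query parameters (filters, pagination) so every variant
    of a category URL points at the clean root page as canonical."""
    parts = urlsplit(url)
    clean = f"{parts.scheme}://{parts.netloc}{parts.path}"
    return f'<link rel="canonical" href="{clean}">'

print(canonical_link("https://example.com/shoes?color=red&page=2"))
# <link rel="canonical" href="https://example.com/shoes">
```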
Still another resource, while specifically for Magento, is here: https://moz.com/ugc/setting-up-magento-for-the-search-engines You'll find a lot of parallels, though, between the settings discussed there and your own ecommerce implementation. Hopefully these help point you in the right direction. Best of luck!