Prevent link juice from flowing to low-value pages
-
Hello there!
Most websites link to low-value pages from their main navigation (header or footer), which makes those pages reachable from every other page.
I'm thinking in particular of "Conditions of Use" or "Privacy Notice" pages, which have no SEO value.
What I'd like is to prevent link juice from flowing to those pages while still keeping the links visible for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" rule for those pages in the "robots.txt" file?
- Use JavaScript links that crawlers won't be able to follow?
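For reference, here is roughly what each option looks like. This is only a sketch; the URLs (/privacy-notice, /conditions-of-use) are hypothetical placeholders:

```html
<!-- Option 1: nofollow hint on the link itself -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag, placed in the <head> of the low-value page -->
<meta name="robots" content="noindex,nofollow">

<!-- Option 4: a JavaScript "link" that most crawlers won't follow -->
<span style="cursor:pointer" onclick="window.location.href='/privacy-notice'">Privacy Notice</span>
```

```text
# Option 3: robots.txt at the site root
# (note: this blocks crawling, which is not quite the same as blocking indexing)
User-agent: *
Disallow: /privacy-notice
Disallow: /conditions-of-use
```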
-
Hmm, good point. I'd never heard that a privacy policy page could be a trust signal. Is there an article somewhere that discusses this?
Well, I only used those two pages as examples... my question was about preventing link juice from flowing to non-SEO pages in general.
Thanks a lot for your answers!
-
Exactly, and what I also try to explain to people is that a privacy-policy-type page is an additional signal for Google when it tries to understand what type of site you are and how trustworthy it is. Why in the world would you noindex something like that?
-
As I understand it, nofollow still dilutes your link juice even though it does not pass PageRank (theoretically).
Google made this announcement in 2009 to combat PageRank sculpting. Here is a post from Rand about it.
Unless something has changed that I am not aware of, you could place the link in an iframe; Google will not see it, nor will it dilute the PageRank you pass out.
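A minimal sketch of the iframe approach, assuming a hypothetical /footer-links.html helper page (which you would probably also want to keep out of the index):

```html
<!-- Parent page: the links live in a separate document,
     so they are not outgoing links on this page -->
<iframe src="/footer-links.html" title="Footer links" width="400" height="40"></iframe>
```

```html
<!-- /footer-links.html (hypothetical helper page) -->
<!DOCTYPE html>
<html>
  <head><meta name="robots" content="noindex"></head>
  <body>
    <a href="/conditions-of-use">Conditions of Use</a>
    <a href="/privacy-notice">Privacy Notice</a>
  </body>
</html>
```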
-
Great suggestions. I've recently combined some pages (login/register; about/contact/ToS/privacy; and a few others) and have been very happy with the results. That removed 8 links from every page.
I am also thinking about removing some more links from my product pages, to keep as much juice as possible on those pages. They don't need the same navigation as the homepage.
-
It depends on what your purpose is.
If you want them completely blocked from being indexed, then putting the pages in the robots.txt file or using a robots meta tag would work fine.
If you just want to de-emphasize the pages to the search engines, you can use nofollow attributes or JavaScript links on footer/header links.
One thing that we have done is combine some of these pages (terms and privacy) into one page to cut down on the total number of links on each page.
You could also choose not to include the privacy page link on every page (depending on your site), and instead link to it only from the pages that collect sensitive data, near the form (see the sketch below).
I hope this helps. The main thing to remember is that each site is different, so you will have to adjust your tactics depending on precisely what you are trying to accomplish.
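To illustrate that near-the-form idea, a minimal sketch assuming a hypothetical signup form and /privacy-notice URL:

```html
<!-- The privacy link is rendered only beside forms that collect
     personal data, not in the sitewide header/footer navigation -->
<form action="/signup" method="post">
  <label>Email <input type="email" name="email" required></label>
  <button type="submit">Sign up</button>
  <p><small>How we handle your data: <a href="/privacy-notice">Privacy Notice</a></small></p>
</form>
```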