Using rel="nofollow"
-
Hello,
Quick question really: as far as the SERPs are concerned, if I had a site with, say, 180 links on each page (80 above the suggested limit), would putting rel="nofollow" on 80 of those links be as good as only having 100 links per page?
Currently I have removed the links, but we really need them, as they point to networked sites that we own and are relevant...
But we don't want to look spammy...
An example of one of the sites without the links can be seen here, whereas a site with the links can be seen here.
You can see the links we are looking to keep (at the bottom) and why...
Thanks
-
Sorry, by "bigger problems" I just meant the potential link-farm.
The nofollow will remove the SEO risk - you'll still lose a little link-juice to those links, but you won't get penalized down the road for having them. Of course, you won't gain any SEO value from the cross-linking either. At this point, though, I think that's inevitable. The risk is greater than the reward from cross-linking this many domains.
Any other way of blocking the links is going to look more suspicious to Google than nofollow (including iframes). Any method I can think of would be best avoided in this scenario.
Any way you can contextually cross-link would create less SEO risk and potentially let you get some ranking value out of the connections. That's why I suggested links at the job listing level. I think that might benefit users a bit more, too. Even then, you don't want to go overboard.
-
Hi Dr Peter
Thanks for the detailed response. A few questions: you say I have bigger problems than the 100 links/page - is that just that the sites are at risk of looking like a link farm, or are there bigger problems still?
I hear you on the fact that links in the footer carry less weight, and on the consolidation point - it's something we are working on - but in the meantime I would really like to find a way, if possible, to keep some form of cross-connection between the sites. We do have the detail pages, which we don't need to be SEO primed; really we only need the -
SEO primed. You can see the different phrases (search patterns) that we are targeting; each site has hundreds of pages like this that we don't necessarily need primed, as they are only live for 28 days...
Is there an option to include these links either in an iframe in the footer area (for user reference only) or on the detail pages?
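To clarify what I mean by the iframe option: the footer links would live in a separate document that gets embedded on each page, roughly like the sketch below (the file name, title, and sizing are made up for illustration):

```html
<!-- Footer links kept in a separate document and embedded for reference only -->
<iframe src="/network-links.html" title="Our other job sites"
        width="100%" height="120" style="border:0;">
</iframe>
```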
Are there any other options that would work without putting the sites at risk of looking like a link farm?
I appreciate your insight.
Many thanks
-
You've got bigger problems here than 100 links/page (that's just a guideline) - you're cross-connecting enough sites that it looks like a link farm. Having them all in the footer only adds to the problem, and makes the tactic look lower-quality. I'd definitely no-follow these, as you could potentially be penalized.
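For what it's worth, nofollowing one of those footer cross-links is just a matter of adding the attribute to the anchor tag - a minimal sketch, using a made-up domain and anchor text rather than your actual sites:

```html
<!-- A normal, followed cross-link in the footer -->
<a href="http://www.example-jobs-site.co.uk/">Example Jobs Site</a>

<!-- The same link with rel="nofollow" added, so it passes no endorsement -->
<a href="http://www.example-jobs-site.co.uk/" rel="nofollow">Example Jobs Site</a>
```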
The nofollow won't really help with the 180 links - it'll still burn up link-juice. It'll just keep these links from getting you into trouble. Realistically, these links are probably already being devalued by the algorithm.
Practically speaking, being in the footer, these links may not have a ton of value for visitors (if you click-mapped the page, I'm betting the CTR is very low). I wonder if there's a way to integrate them contextually. For example, when someone clicks through to a job listing in Kent, you could add something like "See more jobs in Kent" on that page and link it to http://www.kentjobsonline.net/.
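In markup terms, that contextual link could be as simple as a short block on the listing page itself - a rough sketch, with the wording and class name purely illustrative:

```html
<!-- Contextual cross-link shown only on relevant job listing pages -->
<div class="related-locations">
  <p>Looking for more opportunities in this area?
     <a href="http://www.kentjobsonline.net/">See more jobs in Kent</a>
  </p>
</div>
```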
From an SEO standpoint, these geo-targeted microsites have lost a lot of value over the past couple of years. I've even seen the strategy run into Panda issues. You may want to re-evaluate down the road and consider consolidating.
-
They are all legitimate links; all 74 sites act as one larger site, or a network of sites. But the thing is, when I removed the links I moved up in the SERPs...
What would be the best way of showing the links to visitors but hiding them from the SERPs?
Thanks
-
Quality is MUCH better than quantity
-
You should build the site for functionality, and if there is a legitimate reason to have all those links, then feel free to put them in. I wouldn't overthink this.
-
I think rel="nofollow" has little meaning nowadays. With the changes at Google, link juice is no longer affected by it the way it used to be. You can check some info in this old post: http://www.seomoz.org/q/do-you-use-nofollow-and-rel-nofollow
But in SEO everything is grey rather than black and white. It's safer for you to stick to around 100 links per page.