Nofollow Outbound Links on Listings from Travel Sites?
-
We oversee a variety of regional, county, and town-level tourism websites, each with hundreds (or even thousands) of places/businesses represented with individual pages. Each page contains a link back to the place's main web presence if available. My fear is that a large portion of these linked-to sites are low quality, and may even be spammy. With our budgets, there is no way to sort through them and assign nofollows as needed. There are also a number of broken links that we try to stay on top of, but some slip through at times due to the sheer number of pages.
I am thinking about adding a nofollow to these outbound links across the board. This would not be all outbound links on the website, just the website links on the listing pages.
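For context, the change would happen at the template level rather than link by link. A minimal sketch of what I have in mind (assuming the listing HTML can be post-processed in Python with BeautifulSoup; the domain constant and function name are just illustrative):

```python
from bs4 import BeautifulSoup

OUR_DOMAIN = "www.example-tourism-site.com"  # hypothetical: our own site's domain

def nofollow_external_links(listing_html: str) -> str:
    """Add rel="nofollow" to every outbound link in a listing page body."""
    soup = BeautifulSoup(listing_html, "html.parser")
    for a in soup.find_all("a", href=True):
        # Only touch absolute links that point away from our own site
        if a["href"].startswith("http") and OUR_DOMAIN not in a["href"]:
            rel = set(a.get("rel", []))
            rel.add("nofollow")
            a["rel"] = sorted(rel)
    return str(soup)
```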
I would love to know people's thoughts on this.
-
Great question! We do often see a positive correlation between the number of followed outbound links and higher rankings (though I'm not sure we've scientifically measured this recently). Anecdotally, we hear this often as well. Most famously, the NYTimes made its external links "followed" and saw an increase in traffic/rankings afterwards.
-
Thanks Cyrus
If external links are a ranking signal, do you think there would be a difference in perceived value depending on whether those external links are nofollow or dofollow, or do we expect that to make little difference?
-
It's an interesting perspective. Looking at the pages and links, they all look trustworthy, and normally I wouldn't see a reason to nofollow them, especially since they are all editorially controlled by you and your team.
Link equity is a concern, but I honestly doubt you're saving anything by making them nofollow, especially since Google updated how they handle PageRank sculpting back in 2009.
Not that there aren't legitimate ways to preserve and flow link equity (such as including internal links within the main body of text instead of sidebar areas/navigation), but in this case I think leaving the links followed won't hurt at all.
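To see why nofollow stopped saving anything after that change, here's a rough sketch of the arithmetic (heavily simplified; real PageRank involves damping and iteration, so treat the numbers as illustrative only):

```python
# Simplified illustration of the post-2009 PageRank sculpting change.
# A page divides the equity it can pass across ALL of its links,
# then simply drops the share assigned to nofollowed links.

page_equity = 1.0
total_links = 10
nofollowed = 5

share_per_link = page_equity / total_links                        # 0.10 per link, followed or not
passed_to_followed = share_per_link * (total_links - nofollowed)  # 0.50 flows out
evaporated = share_per_link * nofollowed                          # 0.50 is lost, not redistributed

print(passed_to_followed, evaporated)
```

In other words, nofollowing half the links doesn't concentrate more equity in the remaining half; the "saved" portion just evaporates.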
-
Cyrus, I was actually hoping for an answer on the point you mentioned: "though I don't believe we've ever studied the difference between followed and nofollowed in this regard".
We've a really popular post on our site which lists hundreds of Twitter chat hours and links to the Twitter hashtag and the host in each case (http://tillison.co.uk/blog/complete-twitter-chat-hours-directory/).
Across the team, we're disagreeing about whether all external links in the post should be nofollow or whether they should remain untagged and therefore dofollow. On the one hand, it feels like we're leaking page equity through every link and we want to retain it, of course. On the other, nofollow kinda feels like we trust none of those links, and that the page may be less valuable to Googlebot.
I'm working through the links making them all nofollow, but would be really interested in your perspective on it.
-
Thanks, Cyrus. You have confirmed what my gut was telling me: that it likely wouldn't have much of an impact either way. The idea of testing this has been on my mind for about a year, but I couldn't get a strong feeling one way or the other. I imagine there are very few spammy sites that we are linking to, but I will try to dig through them as time allows. Your Spam Score tool should help. If needed, I will just nofollow the specific sites that I believe fall into this category.
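If it helps anyone following along, here's a rough sketch of how I'd triage that list (assuming I can export the outbound domains with a Spam Score column to CSV; the file name, column names, and threshold are placeholders):

```python
import csv

SPAM_SCORE_THRESHOLD = 30  # placeholder cut-off; adjust to taste

def domains_to_nofollow(csv_path: str) -> list[str]:
    """Return the outbound domains whose exported Spam Score exceeds the threshold."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["spam_score"]) > SPAM_SCORE_THRESHOLD:
                flagged.append(row["domain"])
    return flagged

print(domains_to_nofollow("outbound_domains.csv"))
```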
Appreciate your time!
-
Good question.
On one hand, I'm a fan of linking out with followed links that pass link equity. There's a good correlation between linking out and higher rankings (though I don't believe we've ever studied the difference between followed and nofollowed in this regard). I hate to see links "nofollowed" simply to protect against Google actions, but it is a reality of doing business.
To me, it comes down to how many of the sites are actual spam. "Low quality" is certainly different from spam. If it's a handful of sites out of thousands, I wouldn't worry about it too much. Generally, tourism websites are much more trustworthy than sites in the gambling/adult/pharmaceutical verticals.
Now, on the other hand, if you do choose to nofollow the links, you probably won't see too many negative consequences.
In the end, I think you have to gauge how bad the sites are that you're linking to, and make your judgment from there.
-
Partners, for the most part, do not pay to be listed. Those that do are in it for promotional benefits, such as being listed first, along with other advertising perks like email promotion. Links are never brought into the conversation.
Most of the listings we maintain are in databases, and we have an internal team of developers (who built the sites in question) and some back-end tools in our CMS that help us identify the 404s. The lag in updating them comes down to a combination of small digital marketing budgets, client staffing constraints, a lack of client assistance in identifying where the links should point, and political issues where some clients do not want us to manage their listings for various reasons. Essentially, our hands are tied when it comes to updating listings (though we know that this would have the largest benefit).
Overall, the issue is the number of lower-quality websites we need to link to in order to ensure that everyone in the region is represented equally. What it really comes down to is this: if I nofollow all of them, will it have a positive impact? Will it have no effect? Or will it be perceived as negative, since I am essentially nofollowing hundreds or thousands of links on each of these sites?
-
The question is: do your customers pay to be listed with you? If so, are they using you for the dofollow links? If that is the case, then you may lose some business by changing them to nofollow.
If they are paying, there is also a risk of a Google penalty for paid dofollow links.
If you are unable to maintain quality and there is no good reason to have a dofollow, then switch to nofollow.
Are the pages hardcoded, or is all the data in a database? If it is in a database, it would take no time at all to run each domain through a loop and check what response status code you get: http://en.wikipedia.org/wiki/List_of_HTTP_status_codes
This would be a very quick way to find broken links. You may even be able to purchase API access from something like Majestic or Moz and run the sites through that as well for a better indication of site quality. If a site has very low DA or Trust Flow, you could also make its link nofollow or remove it, etc. If it is all hardcoded, then that would be very hard work all around.
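A minimal version of that loop might look like this (a sketch, assuming the outbound URLs can be pulled from the database into a Python list; the requests library handles the status check):

```python
import requests

def check_outbound_links(urls):
    """Report the HTTP status code for each outbound URL so broken listings can be fixed."""
    results = {}
    for url in urls:
        try:
            # HEAD is cheap; fall back to GET if a server refuses it
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code == 405:
                resp = requests.get(url, allow_redirects=True, timeout=10)
            results[url] = resp.status_code
        except requests.RequestException:
            results[url] = None  # DNS failure, timeout, etc. -- treat as broken
    return results

# Anything that is None or in the 4xx/5xx range is a candidate to fix, nofollow, or remove.
broken = {url: status for url, status in check_outbound_links(["http://example.com"]).items()
          if status is None or status >= 400}
print(broken)
```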