Will adding thousands of outbound links to just a few websites impact rankings?
-
I manage a large website that hosts thousands of business listings covering seven counties. Currently, a category page (such as lodging) hosts a group of listings, each of which links to its own page, and from those pages we link directly to the business each one represents. The client is proposing that we change all listings to link to the representative county website and remove the individual pages. This would essentially create thousands of external links to seven different websites and remove thousands of pages from our site.
Does anyone have thoughts on how adding thousands of links (potentially upwards of 3,000) to only seven websites (which I would deem relevant links) would affect SEO? I know that if thousands of links are added pointing to thousands of websites, a site can be considered a link farm, but I can't find any info online about a case like this.
-
Do you have any evidence that linking out can improve domain authority? I don't think it can.
Matt Cutts once said that it can be beneficial to link out. Well, of course it can be beneficial, but can it make you rank higher?
The evidence shows it can make you rank lower, not higher.
-
Thanks for all the info. We have had a solid SEO strategy to date, and currently the site ranks very well for all of its identified keywords. There is a well-thought-out site architecture and internal linking strategy in place. I know that, generally, adding external links can improve authority over time if they point to relevant, authoritative sites and are used in moderation. To me, the biggest concern is that we are going from linking to the actual businesses from individual pages to having more of an overall listing page that links to seven other "directory" sites. Also, I don't know how Google will interpret a website that only links to seven other websites (I should mention that we are already linking to those seven, before this proposed change, in many places across the site). I have already told the client that if we move forward, we will be implementing nofollow on the links.
-
Yes, there is hard data: Google published and patented their PageRank algorithm.
http://en.wikipedia.org/wiki/PageRank
This page is a simple explanation:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
A nofollow will not save any PageRank; it will only stop it from reaching the linked-to page.
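The simplified model those links describe can be sketched as a toy computation. This is a hypothetical example using the classic damped PageRank formula on a made-up three-page link graph; real search engines are far more complex than this:

```python
# Toy PageRank: each page's rank is (1-d)/N plus d times the rank
# flowing in from pages linking to it, where each linking page splits
# its rank evenly across its outbound links. Hypothetical graph only;
# this is not Google's actual algorithm.
def pagerank(links, d=0.85, iters=100):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with equal rank
    for _ in range(iters):
        pr = {
            p: (1 - d) / n + d * sum(
                pr[q] / len(links[q]) for q in pages if p in links[q]
            )
            for p in pages
        }
    return pr

# Made-up graph: a home page, one listing page, and a county site.
graph = {
    "home": ["listing", "county"],
    "listing": ["home", "county"],
    "county": ["home"],
}
ranks = pagerank(graph)  # each outbound link splits the page's rank
```

In this toy graph the home page ends up with the most rank because both other pages link to it, which is why removing internal sub-pages can also lower the home page's PR: the internal links that fed rank back to it disappear.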
-
Adding more internal links just so the link juice isn't concentrated in one external link would be playing black-hat SEO... I'm sure it would be seen as spam. A nofollow is enough. Still, a directory of only seven sites without the inner pages is useless.
-
Is there any hard data to back that up? Just curious if there has been a study done over a ton of pages, links, etc.
-
Yes, it would. When you link out, you lose PageRank.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
To minimize the loss of PR, you can add more links to your own site on the same page.
If you have a page with 3 internal links and 1 external link, you are giving away 25% of your PR. But if you have 99 internal links and 1 external link, you are only giving away 1%.
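Those percentages follow from the even-split model, where a page's PageRank is divided equally among all its outbound links. A minimal sketch (the helper name is illustrative, and this ignores damping and other real-world factors):

```python
# Even-split model: the fraction of a page's PageRank handed to
# external sites is external / (internal + external). Illustrative
# helper only, not a real SEO metric.
def external_share(internal_links, external_links):
    total_links = internal_links + external_links
    return external_links / total_links

assert external_share(3, 1) == 0.25   # 3 internal + 1 external: 25% leaves
assert external_share(99, 1) == 0.01  # 99 internal + 1 external: 1% leaves
```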
You are also losing content, and depending on your internal linking structure, you are more than likely going to lower the PR of your home page by removing sub-pages. Again, I refer to http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Using nofollow will not help you; all links leak link juice.
There are ways of using JavaScript to do this, but one day that may come back to bite you.
-
A couple of questions:
The website is a directory, and yet it points to only seven outbound websites?
What about using nofollow for all those links?
On the content side, you are about to lose much of the site's content, so you should expect a massive traffic drop. What's the point of a directory if it only links to seven websites without offering any extra valuable content?