Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Internal links and URL shorteners
-
Hi guys, what are your thoughts on using bit.ly links as internal links in blog posts on a website? Some posts have 4-5 bit.ly links going to other pages of our website (noindexed pages).
I have nofollowed them so no SEO value is lost, and since the links go to noindexed pages there is no need to pass SEO value directly. However, what are your thoughts on how Google will see internal links that have essentially become redirect links? They are bit.ly links going to result pages, basically.
Am I also right to assume that tracking for internal links would be better done with Google Analytics functionality? Is bit.ly accurate for tracking clicks?
Any advice is much appreciated; I just wanted to double-check this.
-
I recommend reading this article about del.icio.us getting blacklisted by Google. It is very recent, so consider this news when using bit.ly internal links in blog posts. Even though the links go to result pages, this could be problematic for an as-yet-undetermined period of time (basically, bit.ly has two issues to deal with: migration away from their .us domain, plus vulnerabilities that make it possible for trojans and scripting elements to be used). Of course, if you're not receiving alerts, then the issue is reframed.
http://www.wordfence.com/blog/2014/10/a-malicious-del-icio-us/
As stated, "Delicious has changed hands several times over the years and recently was re-sold earlier this year to Science Inc. They also rebranded several years ago to delicious.com which is not blacklisted, but there are likely a large number of legacy .us links out there. [Edit: Thanks Kelson]"
-
Hi Paul,
Justin gave you some great pointers below. In terms of it being unnatural, I would completely agree, as you are essentially leaving your site and coming straight back. Keep your internal links internal. Both methods Justin mentions would work well; I too would prefer the enhanced link attribution method.
-
Thanks, Andy!
-
I would second Justin's recommendation of the attribution builder rather than using URL shorteners.
-Andy
-
I would not use bit.ly or any other shortener for internal links.
If you want to track internal links, you should use Google's enhanced link attribution or the URL builder. I prefer the link attribution tool over the URL builder, though. This approach is more natural for visitors to your site, and it allows you to keep the SEO benefits of good internal site linking.
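As a rough illustration of what the URL-builder approach produces, tagging a link just means appending UTM query parameters to the destination URL; a minimal Python sketch (the parameter values here are made up for illustration):

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append Google Analytics UTM parameters, keeping any existing query string."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query += [("utm_source", source), ("utm_medium", medium), ("utm_campaign", campaign)]
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/blog/post", "blog", "internal", "spring-sale"))
# https://example.com/blog/post?utm_source=blog&utm_medium=internal&utm_campaign=spring-sale
```

The tagged URL stays on your own domain, which is the key difference from routing through a shortener.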
If you want to track external links, I'd recommend Google's outbound link tracking, a URL shortener, or both. For WordPress sites, Joost de Valk has an old (but good) post on one way to do this with a plugin, and the idea can be replicated on other sites. Basically, you link internally to a directory that's blocked in robots.txt, then 301 those links to your affiliate or shortened URL.
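The robots.txt-blocked redirect directory can be sketched as a simple lookup that answers with a 301; the /out/ path, slugs, and target URLs below are hypothetical, and on an actual WordPress site a plugin would handle this:

```python
# Hypothetical sketch: outbound links on the site point at /out/<slug> on
# our own domain, /out/ is disallowed in robots.txt, and the handler
# answers with a 301 to the real destination.
REDIRECTS = {
    "newsletter-tool": "https://affiliate.example.net/signup?ref=123",
    "hosting-deal": "https://partner.example.org/deal",
}

def resolve(path):
    """Map a /out/<slug> path to (status, location); unknown slugs get a 404."""
    if not path.startswith("/out/"):
        return 404, None
    target = REDIRECTS.get(path[len("/out/"):])
    if target is None:
        return 404, None
    return 301, target

print(resolve("/out/hosting-deal"))  # (301, 'https://partner.example.org/deal')
```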
Hope that helps!
-
Hi Matt, thanks so much for the reply. It's not me personally doing it; I am looking into the SEO side of this and the possible implications. I understand it is being used to track clicks. While I agree that Google Analytics might be better for this, would the bit.ly links do any harm to the website SEO-wise? It seems fairly unnatural to have an internal link going to bit.ly only to come straight back to our site; how would Google see this en masse?
I much appreciate your advice and opinion.
-
I personally don't see why you would want to do this, other than to track clicks, in which case you would be far better off using Google Analytics, as you mentioned. All internal links on a website should be on that site's domain, not behind a redirected link. Is there any other reason for using bit.ly? If so, maybe I or someone in the community could suggest a better alternative.
Related Questions
-
How Many Links to Disavow at Once When Link Profile is Very Spammy?
We are using Link Detox (Link Research Tools) to evaluate our domain for bad links. We ran a domain-wide Link Detox Risk report. The report showed a "High Domain DETOX RISK" with the following results:
- 42% (292) of backlinks with a high or above average detox risk
- 8% (52) of backlinks with an average or below average detox risk
- 12% (81) of backlinks with a low or very low detox risk
- 38% (264) of backlinks reported as disavowed
This looks like a pretty bad link profile. Additionally, more than 500 of the 689 backlinks are "404 Not Found", "403 Forbidden", "410 Gone", or "503 Service Unavailable". Is it safe to disavow these? Could Google be penalizing us for them? I would like to disavow the bad links; however, my concern is that there are so few good links that removing the bad ones will kill link juice and really damage our ranking and traffic. The site still ranks for terms that are not very competitive. We receive about 230 organic visits a week. Assuming we need to disavow about 292 links, would it be safer to disavow 25 per month while we are building new links, so we do not radically shift the link profile all at once? Also, many of the bad links are 404 or page-not-found errors. Would it be OK to disavow all of these at once? Any risk in that? Would we be better off just building links and leaving the bad links up? Alternatively, would disavowing the bad links potentially help our traffic? It just seems risky because the overwhelming majority of links are bad.
Intermediate & Advanced SEO | Kingalan1
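For reference, the disavow file Google Search Console accepts is a plain-text list, one entry per line, with `#` comments, `domain:` prefixes for whole domains, and full URLs for single pages (the entries below are placeholders):

```text
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single URL:
http://link-farm.example/page-linking-to-us.html
```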
Breadcrumbs and internal links
Hello, I used to move up my site structure with links in the content. I have now installed breadcrumbs. Is it still useful to keep the links in the content, or is there no need to duplicate those links; are the breadcrumb links enough? Thank you,
Intermediate & Advanced SEO | seoanalytics
URLs in Russian
Hi everyone, I am doing an audit of a site that currently has a lot of 500 errors due to the Russian language. Basically, all the URLs look this way for every page in Russian:
http://www.exemple.com/ru-kg/pешения-для/food-packaging-machines/
http://www.exemple.com/ru-kg/pешения-для/wood-flour-solutions/
http://www.exemple.com/ru-kg/pешения-для/cellulose-solutions/
I am wondering if this error is really caused by the server or if Google has difficulty reading the Russian language in URLs. Is it better to have the URLs only in English?
Intermediate & Advanced SEO | alexrbrg
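One thing worth checking in an audit like this is encoding: URLs can only carry ASCII on the wire, so Cyrillic path segments must be percent-encoded as UTF-8, and a server or CMS layer that mishandles that step can produce 500s. A quick illustration with Python's standard library (the sample segment is illustrative):

```python
from urllib.parse import quote, unquote

segment = "решения-для"  # a Cyrillic path segment like the ones above
encoded = quote(segment, safe="-")
print(encoded)  # each Cyrillic letter becomes two %XX bytes, e.g. "р" -> %D1%80

# The encoding is lossless: decoding restores the original text.
assert unquote(encoded) == segment
```

Google handles properly percent-encoded non-ASCII URLs fine, so a correctly configured server should not need English-only URLs for this reason alone.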
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
Intermediate & Advanced SEO | andyheath
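As a side note, a Disallow rule only controls crawling (it does not by itself remove already-indexed URLs; a noindex directive or a removal request handles that), and you can sanity-check the rule with Python's standard-library robots.txt parser; the example.com URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /catalog/product/gallery/",
])

# The gallery pages are blocked from crawling...
print(rp.can_fetch("*", "https://example.com/catalog/product/gallery/img1.html"))  # False
# ...but other pages are still crawlable.
print(rp.can_fetch("*", "https://example.com/catalog/product/widget.html"))        # True
```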
What is best practice for "sorting" URLs to prevent indexing and for best link juice?
We are now introducing 5 links on all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000. Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level the sorting is implemented in a way that reloads the whole page, for which there may be better options as well.
Intermediate & Advanced SEO | lcourse
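One common complement to parameter blocking is a rel=canonical tag on every sort variant pointing at the unsorted URL, which just means stripping the sort parameter when generating the canonical. A minimal sketch, assuming a hypothetical query parameter named `sort`:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def canonical_url(url, drop=("sort",)):
    """Strip sorting parameters so every sort variant canonicalizes to one URL."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunparse(parts._replace(query=urlencode(query)))

print(canonical_url("https://example.com/category/shoes?page=2&sort=price_asc"))
# https://example.com/category/shoes?page=2
```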
Linking to URLs With Hash (#) in Them
How does link juice flow when linking to URLs with a hash in them? If I link to this page, which generates a pop-over on my homepage with info about my special offer, where will the link juice go? homepage.com/#specialoffer Will the link juice go to the homepage? Will it go nowhere? Will it go to the hash URL above? I'd like to publish an annual, evergreen sort of offer that will generate lots of links, and instead of driving those links to homepage.com/offer, I was hoping to get that link juice to flow to the homepage, or maybe even a product page, and just update the pop-over information each year as the offer changes. I've seen competitors do it this way but wanted to see what the community here thinks about linking to URLs with a hash in them. Could this also be a use case for using hashes in URLs for tracking purposes?
Intermediate & Advanced SEO | MiguelSalcido
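For what it's worth, the fragment never reaches the server: browsers strip everything after the # before making the request, so a crawler fetching homepage.com/#specialoffer is simply fetching the homepage. Python's standard library shows the split:

```python
from urllib.parse import urldefrag

url, fragment = urldefrag("https://homepage.com/#specialoffer")
print(url)       # https://homepage.com/  (what is actually requested)
print(fragment)  # specialoffer           (kept client-side by the browser)
```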
Do internal links from non-indexed pages matter?
Hi everybody! Here's my question. After a site migration, a client has seen a big drop in rankings, and we're trying to narrow down the issue. It seems that they have lost around 15,000 links following the switch, but these came from pages that were blocked in the robots.txt file. I was wondering if any research has been done on the impact of internal links from non-indexed pages. Would be great to hear your thoughts! Sam
Intermediate & Advanced SEO | Blink-SEO
Are URL shorteners building domain authority every time someone uses a link from their service?
My understanding of domain authority is that the more links pointing to any page or resource on a domain, the greater the overall domain authority (and the weight passed from outbound links on the domain). Because URL shorteners create links on their own domain that redirect to an off-domain page but link "to" an on-domain URL, are they gaining domain authority each time someone publishes a shortened link from their service? Or does Google penalize these sites specifically, or links that redirect in general? Or am I missing something else?
Intermediate & Advanced SEO | Jay.Neely