Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Will I lose Link Juice when implementing a Reverse Proxy?
-
My company is looking at consolidating 5 websites that it runs on Magento, WordPress, Drupal, and a few other platforms onto the same domain. Currently they're all on subdomains, but we'd like to consolidate the subdomains into folders for the UX and SEO potential.
Currently they look like this:
After the reverse proxy they'll look like this:
I'm curious to know how much link juice will be lost in this switch. I've read a lot about site migration (especially the Moz example). A lot of these guides/case studies just mention using a bunch of 301s, but it seems they'd probably be using reverse proxies as well.
My questions are:
- Is a reverse proxy equivalent to a 301, or better or worse than one?
- Should I combine the reverse proxy with a 301 or a rel=canonical tag?
- When implementing a reverse proxy, will I lose link juice (and therefore ranking)?
Thanks so much!
Jacob
-
Two servers? The existing one to process the redirects and a new one to handle the reverse proxy, or vice versa. So the DNS for the old domain would point to a server that does the redirects, while the server that hosts the site would be set up as the reverse proxy.
Another way of looking at it: take down and redirect the old site, then start a new site, with the exact same files/database, that serves the content from the subdirectory/folder.
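To make the two-server idea above concrete, here's a minimal nginx sketch. This is an assumption-laden illustration, not Everett's actual setup: the hostnames (`blog.example.com`, `www.example.com`) and the backend address are hypothetical placeholders, and a load balancer in front would change the details.

```nginx
# Server A: answers for the old subdomain and 301s every request
# to the matching subdirectory on the main domain.
server {
    listen 80;
    server_name blog.example.com;
    return 301 https://www.example.com/blog$request_uri;
}

# Server B: hosts www.example.com and reverse-proxies /blog/
# to the backend that actually runs the blog platform.
server {
    listen 80;
    server_name www.example.com;

    location /blog/ {
        proxy_pass http://10.0.0.5:8080/;  # hypothetical backend address
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

With this arrangement the redirect and the proxy never conflict: the old subdomain only ever answers with 301s, and the main domain serves the content.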
Ask what they think of that idea. I certainly don't have all the answers to every situation, but I'll do my best to help you find a workable solution.
-
Hey Everett,
My dev team says it's extremely difficult to do a 301 alongside the reverse proxy, because the reverse proxy needs the domain in order to work. If we place a 301 redirect on it, the proxy won't be able to access the domain and will break.
We're unable to do a server-to-server process because we're using load-balanced applications. Do you have any recommendations for this situation?
Thanks,
Jacob
-
That's good to know. I'll follow that.
Some of the articles I read made it sound like a reverse proxy was another form of 301, which didn't make sense. Now I know why.
Cheers,
Jacob
-
You need to do both.
The reverse proxy isn't about SEO so much as about the ability to use subdirectories instead of subdomains even when those subdirectories are hosted on different servers. It's a technical requirement: if your sites live on different servers but you want to combine them under one domain as folders rather than subdomains, you need a reverse proxy.
The 301 redirects ensure that users (and search engines) who visit an old URL on the subdomain get forwarded to the new URL in the subdirectory. This is what preserves PageRank and provides a good user experience. Do not keep the content available at the old URLs, and do not let those URLs return a 404 status: 301 redirect them.
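As a sketch of the redirect mapping described above, here is a small Python helper that rewrites an old subdomain URL to its new subdirectory equivalent, i.e. the target each 301 should point at. The hostnames and the subdomain-to-folder table are hypothetical placeholders, not the asker's actual sites:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical mapping of old subdomains to new folders on the main domain.
SUBDOMAIN_TO_FOLDER = {
    "blog.example.com": "/blog",
    "shop.example.com": "/shop",
    "docs.example.com": "/docs",
}

MAIN_HOST = "www.example.com"

def redirect_target(old_url: str) -> str:
    """Return the URL an old subdomain URL should 301 to."""
    parts = urlsplit(old_url)
    folder = SUBDOMAIN_TO_FOLDER.get(parts.netloc)
    if folder is None:
        raise ValueError(f"no mapping for host {parts.netloc!r}")
    # Prefix the folder, keep the rest of the URL (path, query, fragment) intact.
    return urlunsplit(("https", MAIN_HOST, folder + parts.path,
                       parts.query, parts.fragment))
```

Keeping path and query string intact, as above, matters: a 301 that drops them sends every old deep link to the folder root and wastes the equity those URLs earned.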
I hope that clarifies your situation.
-
My dev team would prefer a reverse proxy. Is there a reason you'd rather do a 301 than a reverse proxy?
Reading this article https://moz.com/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo made it seem that a reverse proxy would be preferable to just a bunch of 301s. Is that not the case?
-
This is getting overcomplicated. The first principle is: do no harm. I would not recommend a reverse proxy. 301-ing each page carries the juice over; a 301 does everything you need.
However, prior to any 301s I would audit each subdomain for a penalty, since you could otherwise be pushing a penalty onto the main site. So I would suggest a very thorough audit. If in doubt, rel=canonical that page.
Hope that assists.
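For the rel=canonical fallback mentioned above, the tag goes in the head of the page you'd rather keep out of contention, pointing at the preferred URL. The URL shown is a hypothetical example:

```html
<!-- In the <head> of the subdomain page: tell search engines the
     preferred version lives at the subdirectory URL on the main domain. -->
<link rel="canonical" href="https://www.example.com/blog/post-1/" />
```

Unlike a 301, a canonical tag is a hint rather than a directive, which is why it's the "if in doubt" option here rather than the primary one.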