Will I lose Link Juice when implementing a Reverse Proxy?
-
My company is looking at consolidating five websites, which currently run on Magento, WordPress, Drupal, and a few other platforms, onto the same domain. Right now they're all on subdomains, but we'd like to move them into folders for the UX and SEO potential.
Currently they look like this:
After the reverse proxy they'll look like this:
I'm curious how much link juice will be lost in this switch. I've read a lot about site migration (especially the Moz example). Most of these guides/case studies only mention using a bunch of 301s, but it seems they'd probably be using reverse proxies as well.
My questions are:
- Is a reverse proxy equivalent to a 301, or is it better/worse?
- Should I combine the reverse proxy with a 301 or a rel=canonical tag?
- When implementing a reverse proxy, will I lose link juice (and therefore ranking)?
Thanks so much!
Jacob
-
What about two servers? The existing one to process the redirects and a new one to handle the reverse proxy, or vice versa. The DNS for the old subdomains would point to a server that does nothing but the 301 redirects, while the server that hosts the main site would be configured as the reverse proxy.
Another way of looking at this would be to take down and redirect the old site, and to stand up a new copy, with the exact same files/database, that serves the content from the subdirectory/folder.
Ask your dev team what they think of that idea. I certainly don't have the answer to every situation, but I'll do my best to help you find a workable solution.
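To make that concrete, here is a minimal sketch of what the redirect-only server would do, using hypothetical hostnames (blog.example.com and shop.example.com moving into folders on www.example.com). In practice this would normally be handled in the web server or load-balancer configuration rather than in application code:

```python
# Minimal sketch of a redirect-only server, assuming hypothetical hostnames.
# Production setups would usually do this in web server config instead.
from wsgiref.simple_server import make_server

# Hypothetical mapping of old subdomains to new folders on the main domain.
SUBDOMAIN_TO_FOLDER = {
    "blog.example.com": "/blog",
    "shop.example.com": "/shop",
}

NEW_HOST = "https://www.example.com"

def redirect_app(environ, start_response):
    old_host = environ.get("HTTP_HOST", "").split(":")[0]
    folder = SUBDOMAIN_TO_FOLDER.get(old_host, "")
    # Preserve the original path and query string in the new URL.
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    target = NEW_HOST + folder + path + (("?" + query) if query else "")
    # 301 tells browsers and search engines the move is permanent.
    start_response("301 Moved Permanently", [("Location", target)])
    return [b""]

if __name__ == "__main__":
    make_server("", 8080, redirect_app).serve_forever()
```

The key point is that every old subdomain URL maps to its exact counterpart in the new folder, with path and query string preserved, so none of the inbound links hit a dead end.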
-
Hey Everett,
My dev team says it's extremely difficult to do a 301 alongside the reverse proxy, because the reverse proxy needs the subdomain in order to reach the content. If we place a 301 redirect on it, the proxy won't be able to access the domain and will break.
We're unable to do a server-to-server setup because we're using load balancers. Do you have any recommendations for this situation?
Thanks,
Jacob
-
That's good to know. I'll follow that.
Some of the articles I read made it sound like a reverse proxy was just another form of 301, which didn't make sense to me. Now I know why.
Cheers,
Jacob
-
You need to do both.
The reverse proxy isn't about SEO so much as the ability to use subdirectories instead of subdomains even though the content lives on different servers. If you want to combine sites hosted on separate servers into folders on the same domain, a reverse proxy is the technical requirement that makes that possible.
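For illustration, here is a minimal sketch of what that reverse proxy is doing, with a hypothetical internal backend for the old blog. In a real deployment this would live in Nginx/Apache or the load balancer rather than in application code:

```python
# Minimal sketch of a reverse proxy mounting an old site under a folder.
# Hypothetical backend host; real setups would use the web server for this.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

BACKEND = "http://blog-backend.internal:8081"  # hypothetical upstream server
PREFIX = "/blog"

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):  # sketch handles GET only
        if not self.path.startswith(PREFIX):
            self.send_error(404)
            return
        # Strip the folder prefix so the backend sees its original paths.
        upstream_url = BACKEND + (self.path[len(PREFIX):] or "/")
        try:
            with urllib.request.urlopen(upstream_url) as resp:
                body = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Type",
                                 resp.headers.get("Content-Type", "text/html"))
                self.end_headers()
                self.wfile.write(body)
        except Exception:
            self.send_error(502)

if __name__ == "__main__":
    HTTPServer(("", 8080), ProxyHandler).serve_forever()
```

The main domain answers the request, but the content is fetched from the server that actually hosts the old site.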
The 301 redirects ensure that users (and search engines) who visit the old URL on the subdomain get forwarded to the new URL in the subdirectory. This is what preserves PageRank and provides a good user experience. Do not keep the content available at the old URLs, and do not let those URLs return a 404 status: 301 redirect them.
I hope that clarifies your situation.
-
My dev team would prefer a reverse proxy. Is there a reason you'd rather do a 301 than a reverse proxy?
Reading this article, https://moz.com/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo, it seemed that a reverse proxy would be preferable to just a bunch of 301s. Is that not the case?
-
This is getting overcomplicated. The first principle is to do no harm. I would not recommend a reverse proxy; 301ing each page carries the juice over, and a 301 does everything you need.
However, before setting up any 301s I would audit each subdomain for a penalty; otherwise you could be pushing a penalty onto the main site. So I would suggest a very thorough audit, and if in doubt, use rel=canonical on that page.
Hope that assists.