Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Will I lose Link Juice when implementing a Reverse Proxy?
-
My company is looking at consolidating five websites that it runs on Magento, WordPress, Drupal, and a few other platforms onto the same domain. Currently they're all on subdomains, but we'd like to consolidate the subdomains into folders for the UX and SEO potential.
Currently they look like this:
After the reverse proxy they'll look like this:
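The original URL examples did not survive here, but the pattern being described can be sketched with hypothetical names (example.com and the subdomains below are placeholders, not the actual sites):

```python
from urllib.parse import urlparse

# Hypothetical before/after mapping: each platform's subdomain
# becomes a folder on the main domain. All names are placeholders.
SUBDOMAIN_TO_FOLDER = {
    "shop.example.com": "/shop",   # e.g. the Magento store
    "blog.example.com": "/blog",   # e.g. the WordPress blog
    "docs.example.com": "/docs",   # e.g. the Drupal site
}

def new_url(old_url: str) -> str:
    """Translate an old subdomain URL to its new folder-based URL."""
    parts = urlparse(old_url)
    folder = SUBDOMAIN_TO_FOLDER.get(parts.netloc)
    if folder is None:
        return old_url  # not one of the consolidated subdomains
    return f"{parts.scheme}://www.example.com{folder}{parts.path}"

print(new_url("https://blog.example.com/some-post"))
```

So, under these placeholder names, https://blog.example.com/some-post would become https://www.example.com/blog/some-post after the move.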
I'm curious to know how much link juice will be lost in this switch. I've read a lot about site migration (especially the Moz example). A lot of these guides/case studies just mention using a bunch of 301s, but it seems they'd probably be using reverse proxies as well.
My questions are:
- Is a reverse proxy equal to, better than, or worse than a 301?
- Should I combine reverse proxy with a 301 or rel canonical tag?
- When implementing a reverse proxy, will I lose link juice (and therefore rankings)?
Thanks so much!
Jacob
-
Two servers? The existing one to process the redirects and a new one to handle the reverse proxy, or vice versa. The DNS for the old domain would point to a server that does the redirects, while the server that hosts the site would be configured as the reverse proxy.
Another way of looking at this concept would be to take down and redirect the old site, and to start a new site, with the exact same files/database, that will be used to serve the content in the subdirectory/folder.
Ask what they think of that idea. I certainly don't have all the answers to every situation, but I'll do my best to help you find a workable solution.
-
Hey Everett,
My dev team says it's extremely difficult to do a 301 with the reverse proxy because the reverse proxy needs the domain in order to work. If we place a 301 redirect on it, the proxy won't be able to access the domain and will break.
We're unable to do a server-to-server setup because we're using load-balanced applications. Do you have any recommendations for this situation?
Thanks,
Jacob
-
That's good to know. I'll follow that.
Some of the articles I read made it sound like a reverse proxy was another form of 301, which didn't make sense. Now I know why.
Cheers,
Jacob
-
You need to do both.
The reverse proxy isn't about SEO so much as the ability to use subdirectories instead of subdomains when the content is hosted on different servers. It's a technological requirement: if you want sites hosted on separate servers to appear as folders on a single domain, a reverse proxy is what makes that possible.
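A minimal sketch of the routing decision a reverse proxy makes, assuming hypothetical internal backend host names (in practice this lives in nginx/Apache/Varnish configuration, not application code):

```python
# Hypothetical upstream servers; the visitor only ever sees the
# main domain, never these internal hosts.
BACKENDS = {
    "/shop/": "http://magento.internal",
    "/blog/": "http://wordpress.internal",
    "/docs/": "http://drupal.internal",
}

def upstream_request(path: str):
    """Pick the backend a request path is forwarded to.
    Returns None for paths served by the main site itself."""
    for prefix, host in BACKENDS.items():
        if path.startswith(prefix):
            # Strip the folder prefix before forwarding upstream,
            # since each backend still thinks it is the site root.
            return host + "/" + path[len(prefix):]
    return None
```

Whether the folder prefix gets stripped depends on how each backend is configured (WordPress and Magento both have base-URL settings), so treat that detail as an assumption.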
The 301 redirects ensure that users (and search engines) who visit the old subdomain URLs are forwarded to the new subdirectory URLs. This is what preserves PageRank and provides a good user experience. Do not keep the content available at the old URLs, and do not let those URLs return a 404 status: 301 redirect them.
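The redirect side of this can be sketched as a tiny WSGI app (host names are hypothetical placeholders); every request to an old subdomain gets answered with a 301 pointing at its new folder URL:

```python
def redirect_app(environ, start_response):
    """Answer every request to an old subdomain with a 301 to the
    matching folder on the main domain. Host names are placeholders."""
    host_to_folder = {
        "blog.example.com": "/blog",
        "shop.example.com": "/shop",
    }
    host = environ.get("HTTP_HOST", "").split(":")[0]
    path = environ.get("PATH_INFO", "/")
    folder = host_to_folder.get(host)
    if folder is not None:
        target = "https://www.example.com" + folder + path
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    # Anything else: refuse rather than serve duplicate content.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]
```

In a real deployment the same per-path 301 rules would more likely be a few lines of web-server config (e.g. nginx or .htaccess rewrite rules) than application code.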
I hope that clarifies your situation.
-
My dev team would prefer a reverse proxy. Is there a reason you'd rather do a 301 than a reverse proxy?
When reading this article https://moz.com/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo it seemed that doing a reverse proxy would be preferable to just a bunch of 301's. Is that not the case?
-
It is getting overcomplicated. The first principle is to do no harm. I would not recommend a reverse proxy. 301'ing each page carries the juice over; a 301 does everything you need.
However, prior to any 301s I would audit each subdomain for a penalty, i.e. you could be pushing a penalty to the main site. So I would suggest a very thorough audit. If in doubt, rel canonical that page.
Hope that assists.
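One way to make that post-migration audit concrete is to fetch each old URL with redirects disabled (e.g. `curl -I`) and classify the response. A sketch, assuming you feed it the status code and Location header yourself:

```python
def classify_old_url(status: int, location: "str | None", expected: str) -> str:
    """Classify how an old subdomain URL responds after the move,
    given the status code and Location header from a request made
    with redirects disabled. `expected` is the new folder-based URL."""
    if status == 301 and location == expected:
        return "ok: permanent redirect in place"
    if status in (302, 303, 307):
        return "warn: temporary redirect; change it to a 301"
    if status == 200:
        return "error: old URL still serves content (duplicate)"
    if status == 404:
        return "error: old URL is dead; inbound links are wasted"
    return "check manually"
```

Running this over every indexed URL on the old subdomains catches exactly the two failure modes called out in the thread: content left live at the old address, and old addresses dropping to 404.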