Using a Colo Load Balancer to serve content
-
So this is a little complicated (at least for me...)
We have a client who is having us rebuild and optimize about 350 pages of their website in our CMS, while the rest of the site stays off our CMS. We wanted to build these pages on a subdomain pointed at our IPs so they could remain on our CMS, which the client wants. However, they want the content in a subdirectory, and they will not point the main domain to us; per their dev team, that is impossible for whatever reason.
They have proposed using a colo load balancer to deliver the content from our system (which will be on the subdomain) to their subdirectory.
This seems very sketchy to me. Possible duplicate content? Would this be a sort of URL masking? How would Google see this? Has anyone ever even heard of doing anything like this?
-
Hello Billy,
As you're probably aware, load balancing services are for distributing traffic to more than one server in order to maintain high performance even when traffic levels spike. There is nothing wrong with this from an SEO perspective, as it all happens server-side before the user agent (e.g. Google) ever receives anything. It is a common practice amongst enterprise-level websites.
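To picture what ordinary load balancing looks like (the legitimate, SEO-neutral use described above), here is a minimal sketch in nginx. All hostnames and IPs are hypothetical:

```nginx
# A minimal load-balancing sketch (hypothetical names throughout):
# one public hostname fanned out across several identical origin servers.
upstream cms_pool {
    server 10.0.0.11:8080;   # origin server A
    server 10.0.0.12:8080;   # origin server B -- requests are distributed between them
}

server {
    listen 443 ssl;
    server_name www.example.com;

    location / {
        proxy_pass http://cms_pool;       # hand the request to whichever origin is chosen
        proxy_set_header Host $host;      # preserve the original hostname
    }
}
```

The key point is that the user agent only ever sees one hostname and one URL; which physical server answered is invisible, which is why this is a non-issue for SEO.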
However, you are right to be concerned about this implementation, as it is definitely not the intended use of the technology, and sounds like a workaround instead of an actual fix. It may be an acceptable workaround if you only allow one version of the content to be indexed, and ensure proper use of cross-domain rel="canonical" tags. Or you could go further and block everyone, including Google, from accessing the non-canonical version (on your subdomain, I take it) by returning a 401 (Unauthorized) or a 403 (Forbidden) status code.
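To make the cross-domain canonical concrete: assuming (hypothetically) the CMS copy lives at `pages.example.com` and the client serves the same content at `www.example.com/resources/`, each page on the subdomain would declare the subdirectory URL as canonical:

```html
<!-- On every page served from the non-canonical subdomain copy
     (hypothetical URLs), point search engines at the subdirectory version: -->
<link rel="canonical" href="https://www.example.com/resources/work-permits/" />
```

If you instead choose the blocking route, the subdomain could return a 403 to any request that doesn't come from the client's load balancer, so only the subdirectory version is ever crawlable.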
-
They're right in that you do NOT want the content to be on a different subdomain--in most cases, Google doesn't share domain authority across subdomains.
You can do a reverse proxy to handle this--see Jeremy's writeup here.
Load balancing is a fairly generic term. I'm really only familiar with F5 BIG-IP hardware load balancing and Microsoft's software-based load balancing, but it's possible that some load-balancing solutions can handle this the same way a reverse proxy would.
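To show what the reverse-proxy approach actually looks like, here is a minimal nginx sketch. The hostnames and the `/resources/` path are hypothetical stand-ins; the real mapping would depend on the client's setup:

```nginx
# Hypothetical sketch: the client's edge server maps a subdirectory on
# their main domain onto the CMS subdomain, so visitors (and Google)
# only ever see www.example.com/resources/... URLs.
server {
    listen 443 ssl;
    server_name www.example.com;

    location /resources/ {
        proxy_pass https://pages.example.com/;        # the CMS subdomain serving the 350 pages
        proxy_set_header Host pages.example.com;      # hostname the CMS expects
        proxy_set_header X-Forwarded-For $remote_addr; # preserve the visitor's IP for logs
    }
}
```

Done this way, the subdomain never needs to be publicly indexable at all: the main domain fetches the content server-side, and there is only one public URL per page, which sidesteps the duplicate-content question entirely.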