Using a Colo Load Balancer to serve content
-
So this is a little complicated (at least for me...)
We have a client who is having us rebuild and optimize about 350 pages of their website in our CMS. However, the rest of the website will not be on our CMS. We wanted to build these pages on a subdomain pointed at our IPs so the content could remain on our CMS--which the client wants. However, they want the content in a subdirectory. That would be fine, but they will not point the main domain to us, and per their dev team this is impossible for whatever reason.
They have proposed using a colo load balancer to deliver the content from our system (which will be on the subdomain) to their subdirectory.
This seems very sketchy to me. Possible duplicate content? Would this be a sort of URL masking? How would Google see this? Has anyone ever even heard of doing anything like this?
-
Hello Billy,
As you're probably aware, load-balancing services distribute traffic across more than one server in order to maintain performance even when traffic spikes. There is nothing wrong with this from an SEO perspective, as it all happens server-side before the user agent (e.g., Googlebot) ever receives anything. It's a common practice among enterprise-level websites.
However, you are right to be concerned about this implementation: it is definitely not the intended use of the technology, and it sounds like a workaround rather than an actual fix. It may be an acceptable workaround if you allow only one version of the content to be indexed and ensure proper use of cross-domain rel=canonical tags. Alternatively, you could simply block everyone, including Google, from accessing the non-canonical version (on your subdomain, I take it) by returning a 401 (Unauthorized) or 403 (Forbidden) status code.
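To make the two options above concrete, here is a minimal sketch of how the subdomain's server might declare the subdirectory version canonical, or block access outright. This assumes nginx and uses placeholder hostnames (`cms.client-site.example`, `www.client-site.example`) and a placeholder path--none of these come from the original question.

```nginx
# Hypothetical sketch: on the CMS subdomain, either point search engines
# at the subdirectory version via a cross-domain canonical HTTP header,
# or shut everyone out with a 403. All names here are placeholders.
server {
    listen 80;
    server_name cms.client-site.example;

    # Option A: send a Link header declaring the subdirectory URL on the
    # main domain as the canonical version of every page served here.
    add_header Link '<https://www.client-site.example/optimized-pages$request_uri>; rel="canonical"';

    # Option B: block the non-canonical subdomain entirely instead.
    # location / { return 403; }
}
```

Option A lets the subdomain stay reachable while consolidating indexing signals; Option B is the blunter choice if nothing should ever be crawled there.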
-
They're right that you do NOT want the content to be on a different subdomain--in most cases, Google doesn't share domain authority across subdomains.
You can do a reverse proxy to handle this--see Jeremy's writeup here.
Load balancing is a fairly generic term. I'm really only familiar with F5 BIG-IP hardware load balancing and Microsoft's software-based load balancing, but it's possible that some load-balancing solutions can handle things the way a reverse proxy would.
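A reverse-proxy setup like the one suggested above could look something like the following. This is a hypothetical nginx sketch with placeholder hostnames and paths, not the client's actual configuration: the client's edge server fetches pages from the CMS subdomain and serves them under a subdirectory of the main domain, so visitors and Google only ever see the subdirectory URLs.

```nginx
# Hypothetical sketch: main-domain server proxying a subdirectory to
# the CMS subdomain. Hostnames and paths are placeholders.
server {
    listen 80;
    server_name www.client-site.example;

    # Requests under /optimized-pages/ are fetched from the CMS.
    # The trailing slash on proxy_pass strips the /optimized-pages/
    # prefix, so /optimized-pages/foo maps to /foo on the CMS.
    location /optimized-pages/ {
        proxy_pass https://cms.client-site.example/;
        proxy_set_header Host cms.client-site.example;
        # Preserve the visitor's IP for the CMS's logs
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Whether this is done in nginx, in a BIG-IP iRule, or elsewhere, the key point is the same: the proxying happens server-side, so from Google's perspective there is only one set of URLs.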