Using a Colo Load Balancer to serve content
-
So this is a little complicated (at least for me...)
We have a client who is having us rebuild and optimize about 350 pages of their website in our CMS; the rest of the site, however, will not be on our CMS. We wanted to build these pages on a subdomain pointed at our IPs so the content could remain on our CMS, which the client wants. However, they want the content in a subdirectory. That would be fine, but they will not point the main domain to us, and for whatever reason their dev team says this is impossible.
They have proposed using a colo load balancer to deliver the content from our system (which will be on the subdomain) to their subdirectory.
This seems very sketchy to me. Possible duplicate content? Would this be a sort of URL masking? How would Google see this? Has anyone ever even heard of doing anything like this?
-
Hello Billy,
As you're probably aware, load balancing services are for distributing traffic to more than one server in order to maintain high performance even when traffic levels spike. There is nothing wrong with this from an SEO perspective, as it all happens server-side before the user agent (e.g. Google) ever receives anything. It is a common practice amongst enterprise-level websites.
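To make "it all happens server-side" concrete, here's a toy round-robin load balancer sketched in Go, purely for illustration (the backend addresses are made up): the balancer picks one of several identical backends per request, and the user agent only ever sees a single host and URL.

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

// mustParse keeps the example terse; it panics on a bad URL.
func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		panic(err)
	}
	return u
}

func main() {
	// Two identical backends (hypothetical internal addresses).
	backends := []*url.URL{
		mustParse("http://10.0.0.1:8080"),
		mustParse("http://10.0.0.2:8080"),
	}
	var counter uint64

	// Round-robin: each incoming request is forwarded to the next backend.
	proxy := &httputil.ReverseProxy{
		Director: func(req *http.Request) {
			b := backends[atomic.AddUint64(&counter, 1)%uint64(len(backends))]
			req.URL.Scheme = b.Scheme
			req.URL.Host = b.Host
		},
	}

	// The user agent (Googlebot included) only ever sees this one host.
	http.ListenAndServe(":80", proxy)
}
```

Which backend answered is invisible to the crawler, which is why ordinary load balancing has no SEO footprint at all.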
However, you are right to be concerned about this implementation, as it is definitely not the intended use of the technology and sounds like a workaround rather than an actual fix. It may be a good workaround if you only allow one version of the content to be indexed and ensure proper use of cross-domain rel=canonical tags. Or you could simply block everyone, including Google, from accessing the non-canonical version (on your subdomain, I take it) by returning a 401 (Unauthorized) or 403 (Forbidden) status code.
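If it helps to picture those two options, here is a minimal sketch in Go (the hostnames are hypothetical placeholders, not anyone's real domains): the non-canonical subdomain either answers with a 403, or serves the page while declaring the main-domain URL canonical.

```go
package main

import (
	"fmt"
	"net/http"
)

const (
	nonCanonicalHost  = "pages.cms-vendor.example" // your subdomain (hypothetical)
	canonicalHost     = "www.client.example"       // client's main domain (hypothetical)
	blockNonCanonical = false                      // true = return 403; false = serve with canonical
)

func handler(w http.ResponseWriter, r *http.Request) {
	if r.Host == nonCanonicalHost {
		if blockNonCanonical {
			// Option 2: keep everyone, Google included, out of this version.
			http.Error(w, "Forbidden", http.StatusForbidden)
			return
		}
		// Option 1: serve the page but declare the main-domain URL canonical
		// via an HTTP Link header (equivalent to <link rel="canonical"> in
		// the page <head>).
		w.Header().Set("Link",
			fmt.Sprintf(`<https://%s%s>; rel="canonical"`, canonicalHost, r.URL.Path))
	}
	fmt.Fprintf(w, "page content for %s\n", r.URL.Path)
}

func main() {
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}
```

Either way, the key is that only one version of each page is ever eligible for the index.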
-
They're right in that you do NOT want the content to be on a different subdomain; in most cases, Google doesn't share domain authority across subdomains.
You can use a reverse proxy to handle this; see Jeremy's writeup here.
Load balancing is a fairly generic term. I'm really familiar only with BigIP F5 hardware load balancing and Microsoft's software-based load balancing, but it's possible that some load-balancing solutions can handle things the way a reverse proxy would.
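For what it's worth, here's a bare-bones sketch of the reverse-proxy idea using Go's standard library (the hostnames and the /guides/ path are hypothetical): requests for the subdirectory on the main domain are fetched from the CMS origin behind the scenes, so the browser and Googlebot only ever see the main-domain URL.

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// The CMS origin hosting the rebuilt pages (hypothetical subdomain).
	origin, err := url.Parse("https://pages.cms-vendor.example")
	if err != nil {
		panic(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(origin)

	// Origins usually expect their own name in the Host header.
	director := proxy.Director
	proxy.Director = func(req *http.Request) {
		director(req)
		req.Host = origin.Host
	}

	// www.client.example/guides/... is fetched from the CMS origin while
	// the URL in the address bar stays on the main domain. (This assumes
	// the CMS serves the same /guides/... paths.)
	http.Handle("/guides/", proxy)

	http.ListenAndServe(":8080", nil)
}
```

In practice this logic would live on the client's edge (their load balancer or web server) rather than in a standalone binary, but the mechanics are the same.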
Related Questions
-
Duplicate Content Issues - Where to start???
Dear All, I have recently joined a new company, Just Go Holidays (www.justgoholidays.com). I used the SEOmoz tools yesterday to review the site and see that I have lots of duplicate content/pages and also lots of duplicate titles, all of which I am looking to deal with. Many of the duplicate pages appear to stem from additional parameters that are used on our site to refine and/or track various marketing campaigns. I have therefore been into Google Webmaster Tools and defined each of these parameters, and I have also built and submitted a new XML sitemap.

It also looks as if we have two versions of the site, one at www.justgoholidays.com and the other without the www, and there appear to be no redirects from the latter to the former. Do I need to use 301s here, or is it OK to use canonicalisation instead?

Any thoughts on an action plan to address these issues in the right order and the right way would be very gratefully received, as I am feeling a little overwhelmed at the moment. (We also use a CMS that is not particularly friendly, and I think I will have to go directly to the developers to make many of the required changes, which is sure to cost; I therefore really don't want to get this wrong.)

All the best, Matt
Technical SEO | MattByrne0
-
Content too buried in source code?
Our team is working on a refresh/redesign, and I am wondering whether there's a quantifiable way of determining how high our metadata, H1, and paragraph copy should appear in the source code. Or even whether I should be concerned with that at all. Our navigation will likely have dozens of links (we're going to keep it to under 100), and this doesn't even factor in the design elements. I am concerned about the content being buried. Are these the kinds of concerns I should be having? Is there a measurable way to avoid the problem?
Technical SEO | SSFCU0
-
Duplicate Content?
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-feb2000, archive3-mar2000, etc.) But I see they are being marked as "similar content," even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it? Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
Technical SEO | sakeith0
-
How to get rid of duplicate content
I have duplicate content that looks like http://deceptionbytes.com/component/mailto/?tmpl=component&link=932fea0640143bf08fe157d3570792a56dcc1284; however, I have 50 of these, all with different numbers on the end. Does this affect search engine optimization, and how can I disallow these in my robots.txt file?
Technical SEO | Mishelm1
-
Duplicate Content
Hello all, my first web crawl has come back with a duplicate content warning for www.simodal.com and www.simodal.com/index.htm. Slightly mystified! Thanks, Paul
Technical SEO | simodal0
-
Using robots.txt to deal with duplicate content
I have two sites with duplicate content issues. One is a WordPress blog; the other is a store (Pinnacle Cart). I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?
Technical SEO | bhsiao0
-
Different TLDs, same content - duplicate content? And a problem in foreign Googles?
Hi, Operating from the Netherlands with customers throughout Europe, we serve the same content for several countries. Dutch is spoken in the Netherlands and Belgium, and German is spoken in Germany and Switzerland, so the same content is provided for those countries. Does Google see this as duplicate content? Could a German customer get the Swiss website as a search result when searching on the German Google? Thank you for your assistance! Kind regards, Dennis Overbeek Dennis@acsi.eu
Technical SEO | SEO_ACSI0
-
Up to my you-know-what in duplicate content
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3 and 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google while the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google. The duplicate content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum: get rid of the subdomain version (and occasionally link between the two obviously related sites), or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword domain on the front of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages? Thanks.
Technical SEO | Hondaspeder0