How to best keep client hosting separate but manageable?
-
For those of you with a number of client accounts for which you do hosting, how do you keep them manageable but separate?
Let's assume you have both public and private clients and don't want someone to be able to do a reverse IP/server lookup and identify everyone you work with. Additionally, clients may be based in the US/UK/EU and want localised hosting.
I'm looking for a large shared hosting provider (with some potentially dedicated options) who will let me manage accounts on multiple physical servers in a variety of geolocations from a single billing account and preferably a single admin panel as well.
Once client contracts end I also need the ability to let them take over the hosting in a break-away account and to be able to add their own billing details.
I'm looking for a solution a bit more upmarket than something like SEO Hosting from HostGator (which doesn't let me specify geolocation territories anyway), potentially with an account manager to help me sort out the individual requirements.
Does anybody have any ideas of providers or what I should be searching for to get what I want?
-
Why not just use a large VPS account with WHM/cPanel and give each client their own account? As for the different servers, just use different IP addresses and no one will ever know. I would look into a setup from either WebhostingBuzz (they have a couple of data centers in the US and Europe) or OVH, if you know how to manage a server.
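If you do go the WHM/cPanel route, putting each client's site on its own dedicated IP is scriptable rather than manual. A minimal sketch, assuming WHM's JSON API (API 1's `setsiteip` function) on the standard port 2087; the hostname, domain, and IP below are placeholders, and you should verify the endpoint and authentication scheme against your provider's WHM documentation before relying on this:

```python
# Sketch only: the endpoint path and parameters follow WHM API 1 conventions
# ("setsiteip" assigns a site to a specific IP), but confirm them against
# your server's own API documentation. All values below are placeholders.

def build_setsiteip_call(whm_host: str, domain: str, new_ip: str) -> dict:
    """Compose a WHM API 1 'setsiteip' request that moves one cPanel
    account's site onto its own dedicated IP address."""
    return {
        "url": f"https://{whm_host}:2087/json-api/setsiteip",
        "params": {"api.version": 1, "domain": domain, "ip": new_ip},
    }

call = build_setsiteip_call("whm.example.net", "client-one.example", "203.0.113.10")
print(call["url"])  # -> https://whm.example.net:2087/json-api/setsiteip
```

Looping a call like this over your client list keeps every site on a distinct IP without per-account clicking in the WHM interface.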
Related Questions
-
Some bots excluded from crawling client's domain
Hi all! My client is in healthcare in the US and, for HIPAA reasons, blocks traffic from most international sources. a) I don't think this is good for SEO. b) The site won't allow the Moz bot or the Screaming Frog bot to crawl it. It's so frustrating. We can't figure out what mechanism they are using to enforce this. Any help as we start down the rabbit hole to remedy it is much appreciated. Thank you!
Technical SEO | SimpleSearch
-
Dynamic Url best approach
Hi, we are currently making changes to our travel site whereby, if someone does a search, that search can be stored, and the user can also paste the URL into their browser to find the search again. The URL will be dynamic for every search, so in order to avoid duplicate content I wanted to ask what the best approach would be for creating the URLs. An example of the URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if people have previous experience with something like this and what the best option for SEO would be. Will we need to create the URL with a # (as I've read this stops Google crawling after the #)? Should we block the folder in robots.txt? Are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
Technical SEO | Direct_Ram
-
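On the robots.txt point raised in the question above: disallowing the filtered-search folder is one common way to keep the parameterised URLs out of crawling, though a rel="canonical" tag pointing at the base search page is generally preferred, since a blocked URL can still be indexed if other pages link to it. A sketch of the robots.txt rule, assuming the folder shown in the example URL:

```
User-agent: *
Disallow: /package-search/holidays/hotelFilters/
```

Any URL under that folder, regardless of its query-string parameters, would then be excluded from crawling by compliant bots.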
Company blog. What are the best solutions?
Hello Moz Community! Our company has its own blog (www.awarablogs.com). The blog was created some time ago with a simple blog engine. Now we see that the structure of the blog is bad for SEO (it has long URLs, many useless folders, subdomains, and so on), so we'd like to simplify it. But the engine doesn't allow us to change its structure in the way we'd like. Our webmaster suggested that we use an "alias". Will this method really help us make our blog SEO-friendly? Or is it better to choose other blog software like WordPress? Thank you very much!
Technical SEO | Awaraman
-
Changing Web Hosting
We're about to change web hosting providers and I'm wondering whether there's an "optimal" way to do this without losing any SEO value. Is it as simple as changing hosts without SEO in mind and pointing the domain to the new host?
Technical SEO | b4004040
-
Know of a decent hosting service in France?
I'm looking for a decent hosting service in France. Any recommendations? Thanks
Technical SEO | Martin_S
-
Prospective new client hit by webspam, looking for new resource
Background:
Prospective client recently hit by the webspam update. (I have verified hundreds of low-quality links, porn links, backlink exchanges, etc.) They want us to step in, remove the bad links, and start over.
Question:
What is the best way to examine all the links to determine which need to be removed? We can create the report from Open Site Explorer, but how can we identify the bad links? Here are the site metrics: 5,000+ linking domains, so in this example we need to research the 5,000 domains, and possibly notify thousands of webmasters to remove the links. Open Site Explorer states about 25,000 total links, but the root-domain figures are shown below. Yikes.
Domain Authority 75
External Followed Links 112,000
Total External Links 115,000
Total Links 150,000
Followed Linking Root Domains 3,900
Total Linking Root Domains 5,300
Linking C-Blocks 2,700
Technical SEO | tcmktg
-
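On examining 5,000+ linking domains: grouping the exported links by root domain first reduces the job to one decision per domain rather than one per link. A rough triage sketch in Python; the column name `source_url` and the spam-keyword list are invented for illustration, so adapt them to the actual backlink export before use:

```python
# Rough triage sketch, not a definitive audit: groups backlink URLs by root
# domain and flags domains whose URLs contain obvious spam keywords. The
# "source_url" field and SPAM_HINTS list are illustrative assumptions.
from collections import defaultdict
from urllib.parse import urlparse

SPAM_HINTS = ("porn", "casino", "viagra", "link-exchange")

def triage_links(rows):
    """Group URLs by root domain and flag obviously spammy domains,
    so each domain is reviewed once instead of link by link."""
    by_domain = defaultdict(list)
    for row in rows:
        url = row["source_url"].lower()
        by_domain[urlparse(url).netloc].append(url)
    flagged = {
        d: urls for d, urls in by_domain.items()
        if any(hint in u for u in urls for hint in SPAM_HINTS)
    }
    return by_domain, flagged

rows = [
    {"source_url": "http://free-casino-links.example/page1"},
    {"source_url": "http://respectable-blog.example/review"},
]
by_domain, flagged = triage_links(rows)
print(sorted(flagged))  # -> ['free-casino-links.example']
```

Domains that survive a manual check of the flagged list can then go to webmaster outreach; the clearly bad remainder is a candidate list for Google's disavow tool, where available.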
What is the best way to fix legacy overly-nested URLs?
Hi everyone, Due to some really poor decisions I made back when I started my site several years ago, I'm lumbered with several hundred pages that have overly-nested URLs. For example: /theme-parks/uk-theme-parks/alton-towers/attractions/enterprise I'd prefer these to feature at most three layers of nesting, for example: /reviews/alton-towers/enterprise Is there a good approach for achieving this, or is it best just to accept the legacy URLs as an unfixable problem and make sure that future content follows the new structure? I can easily knock together a script to update the aliases for the existing content, but I'm concerned about having hundreds of 301 redirects (could this be achieved with a single regular expression in .htaccess, for example?). Any guidance appreciated. Thanks, Nick
Technical SEO | ThemeParkTourist
-
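On the single-regex question above: Apache can collapse consistently patterned nested paths into the new structure with one rule, rather than hundreds of individual redirects. A sketch assuming the example path given in the question (the pattern would need adjusting to the site's real URL scheme):

```apache
# /theme-parks/uk-theme-parks/alton-towers/attractions/enterprise
#   -> /reviews/alton-towers/enterprise
RedirectMatch 301 "^/theme-parks/[^/]+/([^/]+)/attractions/([^/]+)/?$" "/reviews/$1/$2"
```

Because the rule is a single pattern rather than hundreds of literal entries, it adds essentially no per-request overhead in .htaccess.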
Best way to setup large site for multi language
Hello, I am setting up a new site which is going to be very large: over 250,000 products. Most of our customers are in the UK (45%); the rest are from various European countries and the USA. Unfortunately we only have a team of two people writing content for these pages, in English. I would value some input on the best way to set up my website structure for ranking. Obviously the best option would be individual country-oriented domains, i.e. domain.fr, domain.de, domain.co.uk. However, we wouldn't have the time to create content for every page, and most pages would contain the same content as the English domain. Would I get a penalty from Google for this? The second choice is to follow the example of overstock.com and pull in information relating to each country, i.e. currency and delivery time. This would be a lot easier, but I am concerned that the lack of geo focus would affect my rankings. Does anyone have any ideas?
Technical SEO | DavidLenehan
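Where the same English content must serve several countries, the usual way to signal regional targeting without duplicating content is hreflang annotations, which tell search engines which URL targets which locale. A sketch assuming a single domain with country subfolders; the domain and paths are placeholders:

```html
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/product-123/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/product-123/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/product-123/" />
```

Each regional page carries the full set of alternates (including a self-reference), so the annotations must be reciprocal across all versions to be honoured.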