On a dedicated server with multiple IP addresses, how can one group of IPs be slow or time out while all the other IP addresses are OK?
-
We use a dedicated server to host roughly 60 sites. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was loading very slowly (about 8 seconds without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the same host) at a fee increase of roughly $10 per month.
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
-
Agreed, and thanks. Unfortunately, the hosting provider is anything but a one-man op. It's huge: a major company with in-house phone, email, and chat support, etc. We're moving to a tier-four farm in Nov/Dec.
As to Sha... I don't care if her answer came from Martians; it was one of the best I have seen. (Note to Moz staff... hint, hint.)
-
Nah... the cool stuff is courtesy of my Boss, whose brain can be kinda scary at times - I'm just soaking up the awesomeness he spreads around.
We have this little reciprocal thing that is improving us both (although I don't think he's ever going to hunger for SEO the way I do! But then, that would make him kinda nuts! hehe).
(Since you said "non-server-side guy," I'm thinking I probably should have mentioned that you can basically think of each IP as being tied to a card, similar to the network card in your computer.)
That whole owning-versus-renting story is pretty common in that world, but it's only a problem if you don't strike someone who knows what they're talking about.
We run our own client servers and I have to admit that I shudder when a client comes to us with an existing account from a couple of specific companies. 8(
No probs, always welcome.
-
@Sha, wow! What an exceptionally thorough and all-around awesome reply!
@Robert, you may have come to this conclusion on your own, but perhaps it's time to consider a new host. You mentioned "they do not have the servers, they just sell the service". I would definitely recommend purchasing service directly from a host and not from a middleman. A true host will often have their own data center and 100+ employees, while a middleman can sometimes be a one-man or otherwise small shop, and their knowledge and support can be quite sketchy.
-
OK, now I am annoyed... journalist, web dev, writer, good grammar and spelling, and now this... server-side pro... You are good.
This really does make sense to a non-server-side type of guy. I will follow up before we change to another farm. I just found out recently that they do not have the servers; they just sell the service. Thanks again, Sha.
-
Hi Robert,
I think I've picked up on all of the questions here (there's a lot going on!) and have borrowed some awesomeness from my Tech Wizard (Boss) to fill in the exciting bits, so here goes:
I'll start with the easy one first... well, actually, none of them are that hard.
As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was loading very slowly (about 8 seconds without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the same host) at a fee increase of roughly $10 per month.
OK, basically the answer to this one is that your client's site was being throttled back by the host because it was using more bandwidth than was allowed under their existing plan. By moving them to the next plan up (the extra $10 per month), the throttling is lifted and the site speed returns to normal. Throttling it back is what gets the client to call... 8(
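If you want to spot this kind of slowdown yourself, here is a minimal sketch (not the host's tooling; the URL and request count are hypothetical) that times a few uncached page loads. Consistently slow fetch times across repeated requests, while other sites on the same connection load fast, point at the origin rather than your network:

```python
import time
import urllib.request

URL = "http://www.example.com/"  # hypothetical client site

def timed_fetch(url):
    # Ask intermediaries for a fresh copy; origin throttling will still
    # show up as a slow total fetch time either way.
    req = urllib.request.Request(url, headers={"Cache-Control": "no-cache"})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=30) as resp:
        resp.read()  # read the full body so we time the whole transfer
    return time.monotonic() - start

for i in range(5):
    print(f"fetch {i + 1}: {timed_fetch(URL):.2f}s")
    time.sleep(2)  # space requests out to measure steady-state speed
```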
OK, 1 down and 2 to go...
About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
and also
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it.
OK, you already know that there can be up to 8 IPs on a box, and at times something in the network will go bad. There are some variables here as to what is wrong. If you are on a Class C network and one IP goes down, it means the switch or router has gone bad (whether it is a switch or a router is determined by how the host has their hardware set up). If you are on a Class D network and one IP goes down, then the problem is one of three things related to that IP: the card, the port, or the cable connecting the two.
The trick is that the person on the phone needs to realise what they are dealing with and escalate it to get the hardware issue resolved. (A recent interaction with that particular host for one of our clients suggested to me that the realisation part might be a little hit-and-miss, so it's good to have an understanding of what might be happening if it happens again.)
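If it does happen again, a quick way to confirm that one IP group is down while the rest of the box is fine is to ping each of the server's IPs. Here is a minimal sketch, assuming you keep a list of your server's IPs (the addresses below are made up):

```python
import platform
import subprocess

SERVER_IPS = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]  # hypothetical

def is_up(ip):
    # Windows ping uses -n for the packet count; most other systems use -c.
    count_flag = "-n" if platform.system() == "Windows" else "-c"
    result = subprocess.run(
        ["ping", count_flag, "2", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

for ip in SERVER_IPS:
    print(f"{ip}: {'up' if is_up(ip) else 'DOWN'}")
```

If every IP in one group times out while the others answer, you have something concrete to hand to support when you ask them to escalate.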
Phew! Nearly there, last of all...
On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
OK, this one is all about DNS caching. That particular host (the one that likes lady racing drivers) has a failover system in place. This means that if an IP goes down, the domains on that IP will automatically fail over to another box.
So, if you have looked at those domains on your machine, the DNS lookup will be cached. When you go back to check the site, you are still looking at the cached (now stale) location. The other people in the building are coming to the domain fresh and through a different ISP, so they see those domains because the sites are back up on the new box.
When the host reps were telling you that it was your ISP, what they really meant was that the domains had failed over to a new box and you were still seeing the cached DNS location.
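One way to confirm stale DNS is to compare what your machine resolves (which may come from a local cache) with a fresh answer from a public resolver. A minimal sketch, with example.com standing in for the affected domain:

```python
import socket
import subprocess

DOMAIN = "example.com"  # hypothetical domain on the failed-over IP

# What your machine resolves to right now (may be served from a local cache).
local_ips = sorted({info[4][0] for info in socket.getaddrinfo(DOMAIN, 80)})
print(f"local/cached answer: {local_ips}")

# A fresh answer straight from a public resolver, bypassing the local cache.
fresh = subprocess.run(
    ["nslookup", DOMAIN, "8.8.8.8"], capture_output=True, text=True
)
print("fresh answer from 8.8.8.8:")
print(fresh.stdout)
```

If the two answers disagree, flushing your local DNS cache (or just waiting out the record's TTL) should bring the site back for you, the way it already works for everyone else in the building.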
OK, I think I covered it all, so... that's all, folks!
Have a great holiday weekend!
Sha