On a dedicated server with multiple IP addresses, how can one group of addresses be slow or time out while all the other IP addresses are fine?
-
We use a dedicated server to host roughly 60 sites. The server is with a company that uses a lady who drives race cars.... About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we first contacted support we were stonewalled, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without caching). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the same host) at a fee increase of roughly $10 per month.
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and saw roughly 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded fine on their machines. Does anyone have any idea what the issue is?
-
Agreed, and thanks. Unfortunately, the host provider is anything but a one-man op. It's huge. We're moving to a Tier 4 farm in Nov/Dec. Major company: in-house phone, email, and chat support, etc.
As to Sha... I don't care if her answer came from Martians, it was one of the best I have seen. (Note to Moz staff... hint, hint.)
-
Nah... the cool stuff is courtesy of my Boss, whose brain can be kinda scary at times - I'm just soaking up the awesomeness he spreads around.
We have this little reciprocal thing that is improving us both (although I don't think he's ever going to hunger for SEO the way I do! But then, that would make him kinda nuts! hehe)
(Since you said "non-server-side guy", I probably should have mentioned that you can basically think of each IP as being tied to a card, similar to the network card in your computer.)
That whole owning-versus-renting story is pretty common in that world, but it's only a problem if you don't strike someone who knows what they are talking about.
We run our own client servers, and I have to admit that I shudder when a client comes to us with an existing account from a couple of specific companies. 8(
No probs, always welcome.
-
@Sha, wow! What an exceptionally thorough and all-around awesome reply!
@Robert, you may have come to this conclusion on your own, but perhaps it's time to consider a new host. You mentioned "they do not have the servers, they just sell the service". I would definitely recommend purchasing service directly from a host rather than from a middleman. A true host will often have its own data center and 100+ employees, while a middleman can be a one-man or otherwise small shop, and their knowledge and support can be quite sketchy.
-
OK, now I am annoyed...
Journalist, web dev, writer, good grammar and spelling, and now this... server-side pro... You are good.
This really does make sense to a non-server-side type of guy. I will follow up before we change to another farm. I just found out recently that they do not have the servers; they just sell the service. Thanks again, Sha.
-
Hi Robert,
I think I've picked up on all of the questions here (there's a lot going on!) and have borrowed some awesomeness from my Tech Wizard (Boss) to fill in the exciting bits, so here goes:
I'll start with the easy one first... well, actually, none of them are that hard.
**As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without caching). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the same host) at a fee increase of roughly $10 per month.**
OK, basically the answer to this one is that your client's site was being throttled back by the host because it was using more bandwidth than its existing plan allows. Moving them to the next plan up (the extra $10 per month) resolves the problem and the site speed returns to normal. Throttling it back is what gets the client to call... 8(
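If you want to confirm that kind of slowdown yourself before the client picks up the phone, repeated timed requests will show whether the site is consistently slow or only slow in bursts. Here is a minimal Python sketch along those lines - the URL, sample count, and delay are placeholders rather than anything specific to your client:

```python
import time
import urllib.request

# Hypothetical site - swap in the domain you are checking.
URL = "http://www.example-client-site.com/"
SAMPLES = 5

def time_request(url):
    """Return the seconds taken to fetch the full page once."""
    # Ask any intermediate caches to revalidate rather than serve a stored copy.
    req = urllib.request.Request(url, headers={"Cache-Control": "no-cache"})
    start = time.time()
    with urllib.request.urlopen(req, timeout=30) as resp:
        resp.read()  # pull the whole body so we measure the complete transfer
    return time.time() - start

if __name__ == "__main__":
    for i in range(SAMPLES):
        try:
            elapsed = time_request(URL)
            print(f"Request {i + 1}: {elapsed:.2f}s")
        except Exception as exc:  # timeouts, connection resets, etc.
            print(f"Request {i + 1}: failed ({exc})")
        time.sleep(2)  # small gap between samples
```

If every sample sits around the same slow figure, throttling (or an undersized plan) is a reasonable suspect; if only the odd sample spikes, it looks more like congestion somewhere along the way.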
OK, 1 down and 2 to go...
**About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we first contacted support we were stonewalled, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.**
and also
**Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it.**
OK, you already know that there can be up to 8 IPs on a box, and at times something in the network will go bad. There are some variables here as to what is wrong. If you are on a Class C network and one IP goes down, it means the switch or router has gone bad (whether it is a switch or a router depends on how the host has its hardware set up). If you are on a Class D network and one IP goes down, then the problem is one of three things related to that IP: the card, the port, or the cable connecting the two.
The trick is that the person on the phone needs to realise what they are dealing with and escalate it so the hardware issue gets resolved. (A recent interaction with that particular host on behalf of one of our clients suggested to me that the realisation part can be a little hit and miss, so it's good to have an understanding of what might be happening if it happens again.)
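To be ready with specifics the next time a group goes quiet, it helps to check one site from each IP on the box so you can tell support exactly which IP is affected instead of debating your ISP. A rough Python sketch, with made-up IPs and domains standing in for your real groups:

```python
import socket
import urllib.request

# Hypothetical mapping: one representative site for each IP on the box.
SITES_BY_IP = {
    "203.0.113.10": "http://site-on-first-ip.example.com/",
    "203.0.113.11": "http://site-on-second-ip.example.com/",
}

def ip_and_site_up(ip, url, timeout=10):
    """Try a raw TCP connection to the IP, then a normal HTTP request to the site."""
    try:
        with socket.create_connection((ip, 80), timeout=timeout):
            pass  # the box answered on port 80
        urllib.request.urlopen(url, timeout=timeout).close()
        return True
    except OSError:  # covers timeouts, refused connections, and URL errors
        return False

if __name__ == "__main__":
    for ip, url in SITES_BY_IP.items():
        status = "OK" if ip_and_site_up(ip, url) else "DOWN or unreachable"
        print(f"{ip}: {status}")
```

If every IP in one group fails while the others answer, you have something concrete to escalate with.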
Phew! Nearly there, last of all...
**On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and saw roughly 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.**
**Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded fine on their machines. Does anyone have any idea what the issue is?**
OK, this one is all about DNS caching. That particular host (the one that likes lady racing drivers) has a fail-over system in place. This means that if an IP goes down, the domains on that IP will automatically fail over to another box.
So, if you have already looked at those domains on your machine, the DNS result will be cached, and when you go back to check the site you are still looking at the cached location. The other people in the building are coming to the domain fresh and through a different ISP, so they see those domains because the sites are back up on the new box.
When the host reps were telling you it was your ISP, what they really meant was that the sites had failed over to a new box and you were still seeing the cached DNS location.
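If you want to catch that in the act next time, compare what your own machine resolves with what a public resolver returns for the same domain. Here is a small sketch - it assumes the third-party dnspython package for the direct query, and the domain is just a placeholder:

```python
import socket

import dns.resolver  # third-party: pip install dnspython

DOMAIN = "www.example-client-site.com"  # hypothetical domain on the affected IP

def local_resolution(domain):
    """What your own machine / ISP resolver currently says (possibly cached)."""
    infos = socket.getaddrinfo(domain, 80, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

def public_resolution(domain, nameserver="8.8.8.8"):
    """Ask a public resolver directly, bypassing your local cache."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    answer = resolver.resolve(domain, "A")
    return sorted(rr.address for rr in answer)

if __name__ == "__main__":
    local = local_resolution(DOMAIN)
    fresh = public_resolution(DOMAIN)
    print(f"Local view : {local}")
    print(f"Public view: {fresh}")
    if local != fresh:
        print("Mismatch: you may be looking at a stale cached record (e.g. after a fail-over).")
    else:
        print("Both views agree, so a stale DNS cache probably isn't the issue.")
```

A mismatch can have innocent explanations too (round-robin DNS, a record mid-propagation), but it is a quick way to see whether your machine is hanging onto an old address.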
OK, I think that covers it all, so... that's all, folks!
Have a great holiday weekend!
Sha