Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
On a dedicated server with multiple IP addresses, how can one group of addresses be slow or time out while all the other IP addresses are OK?
-
We use a dedicated server to host roughly 60 sites. The server is with a company whose ads feature a lady who drives race cars.... About 4 months ago, thanks to monitoring alerts, we realized we had a group of sites down and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
As a part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month.
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, ±2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
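In case it helps anyone reproduce the pattern we saw, here's roughly the idea behind our monitoring: group each site's up/down status by the IP it resolves to, so a per-IP outage jumps out immediately. This is a simplified stdlib-only Python sketch with placeholder domains, not our actual monitoring setup:

```python
import socket
import urllib.request
from collections import defaultdict

def group_status_by_ip(sites, resolve, probe):
    """Group each site's up/down status by the IP it resolves to.

    `resolve` maps hostname -> IP (e.g. socket.gethostbyname);
    `probe` returns True if the site answered an HTTP request.
    If every failure falls under a single IP, suspect that IP's
    hardware rather than the individual sites.
    """
    by_ip = defaultdict(list)
    for site in sites:
        try:
            ip = resolve(site)
        except OSError:
            by_ip["unresolved"].append((site, False))
            continue
        by_ip[ip].append((site, probe(site)))
    return dict(by_ip)

def http_probe(site, timeout=10):
    """One possible probe: does the site answer a plain HTTP GET in time?"""
    try:
        urllib.request.urlopen("http://%s/" % site, timeout=timeout)
        return True
    except OSError:
        return False
```

In real use you would pass `socket.gethostbyname` and `http_probe`; the callables are injected here mainly so the grouping logic can be tested without a network.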
-
Agreed, and thanks. Unfortunately, the hosting provider is anything but a one-man op; it's huge. We're moving to a tier-four farm in Nov/Dec. Major company: in-house phone, email, chat support, etc.
As to Sha... I don't care if her answer came from Martians, it was one of the best I have seen. (Note to Moz staff... Hint, hint.)
-
Nah... the cool stuff is courtesy of my Boss, whose brain can be kinda scary at times - I'm just soaking up the awesomeness he spreads around.
We have this little reciprocal thing that is improving us both (although I don't think he's ever going to hunger for SEO the way I do! But then, that would make him kinda nuts! hehe)
(Since you said "non-server-side guy", I probably should have mentioned that you can basically think of each IP as being tied to a card, similar to the network card in your computer.)
That whole owning-versus-renting story is pretty common in that world, but it's only a problem if you don't happen to get someone on the line who knows what they're talking about.
We run our own client servers and I have to admit that I shudder when a client comes to us with an existing account from a couple of specific companies. 8(
No probs, always welcome.
-
@Sha, wow! What an exceptionally thorough and all-around awesome reply!
@Robert, you may have come to this conclusion on your own, but perhaps it's time to consider a new host. You mentioned "they do not have the servers they just sell the service". I would definitely recommend purchasing service directly from a host and not from a middleman. A true host will often have their own data center and 100+ employees, while a middleman can sometimes be a one-man or otherwise small shop whose knowledge and support can be quite sketchy.
-
OK, now I am annoyed..... Journalist, web dev, writer, good grammar and spelling, and now this... server-side pro... You are good.
This really does make sense to a non-server-side type of guy. I will follow up before we change to another farm. We just found out recently that they do not own the servers; they just sell the service. Thanks again, Sha.
-
Hi Robert,
I think I've picked up on all of the questions here (there's a lot going on!) and have borrowed some awesomeness from my Tech Wizard (Boss) to fill in the exciting bits, so here goes:
I'll start with the easy one first... well actually, none of them are that hard
As a part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month.
OK, basically the answer to this one is that your client's site was being throttled back by the host because it was using more bandwidth than their existing plan allowed. By moving them to the next plan (the extra $10 per month), the problem is resolved and the site speed returns to normal. Throttling the site back is also what gets the client to call... 8(
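If you want to catch throttling before the client calls, you can sample load times yourself and watch for a jump. This is just a minimal Python sketch of the idea (the `fetch` callable is injected so the timing logic is easy to test; `http_fetch` is one possible real implementation, not anyone's actual tooling):

```python
import statistics
import time
import urllib.request

def median_load_time(url, fetch, samples=5):
    """Fetch `url` several times and return the median wall-clock time.

    A site that jumps from ~1s to ~8s while other sites on the same
    box stay fast is a hint the account is being throttled, not that
    the server itself is down.
    """
    times = []
    for _ in range(samples):
        start = time.monotonic()
        fetch(url)
        times.append(time.monotonic() - start)
    return statistics.median(times)

def http_fetch(url, timeout=30):
    """One possible fetch: download the full page body over HTTP."""
    urllib.request.urlopen(url, timeout=timeout).read()
```

The median (rather than the mean) keeps one slow outlier request from skewing the picture.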
OK, 1 down and 2 to go...
About 4 months ago, thanks to monitoring alerts, we realized we had a group of sites down and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
and also
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it.
OK, you already know that there can be up to 8 IPs on a box, and at times something in the network will go bad. There are some variables here as to what is wrong. If you are on a Class C network and one IP goes down, it means the switch or router has gone bad (whether it is a switch or a router depends on how the host has set up their hardware). If you are on a Class D network and one IP goes down, then the problem is one of three things related to that IP: the card, the port, or the cable connecting the two.
The trick is that the person on the phone needs to realize what they are dealing with and escalate it to get the hardware issue resolved. (A recent interaction with that particular host on behalf of one of our clients suggested the realization part might be a little hit-and-miss, so it's good to understand what might be happening if it happens again.)
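If you want evidence in hand before you call support, a simple TCP connect test against each IP on the box will tell you whether one IP or the whole group is unreachable. A rough Python sketch (it assumes the box answers on port 80; adjust for your setup):

```python
import socket

def probe_ips(ips, port=80, timeout=5):
    """Try a TCP connect to each IP on the box.

    If every IP fails, suspect the switch/router in front of the box;
    if only one fails, suspect the card, port, or cable for that IP
    (the same breakdown described above).
    """
    results = {}
    for ip in ips:
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                results[ip] = True
        except OSError:
            results[ip] = False
    return results
```

Run it from two different networks (e.g. the office and a phone hotspot) and you can also rule out your own ISP in one step.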
Phew! Nearly there, last of all...
**On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, ±2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.**
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
OK, this one is all about DNS caching. That particular host (the one that likes lady racing drivers) has a failover system in place. This means that if an IP goes down, the domains on that IP will automatically fail over to another box.
So, if you have looked at those domains on your machine, the DNS result will be cached. When you go back to check the site, you are still looking at the cached location. The other people in the building are coming to the domain fresh, through a different ISP with a different DNS cache, so they see those domains because the sites are back up on the new box.
When the host reps were telling you it was your ISP, what they really meant was that the sites had failed over to a new box and you were still seeing the cached DNS location.
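You can verify the stale-cache theory yourself by asking an outside DNS server directly and comparing its answer with what your own machine's resolver returns. This is a rough stdlib-only Python sketch, not production DNS tooling: the parsing deliberately assumes the first answer record is an A record reached via a standard 2-byte compression pointer (a CNAME chain would need fuller parsing), and 8.8.8.8 is just one example public resolver:

```python
import random
import socket
import struct

def build_a_query(domain, txid=0x1234):
    """Build a minimal DNS query packet asking for a domain's A record."""
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)  # RD flag set
    qname = b"".join(bytes([len(p)]) + p.encode() for p in domain.split("."))
    return header + qname + b"\x00" + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN

def resolve_via(domain, server="8.8.8.8", timeout=5):
    """Ask a specific DNS server directly, bypassing the local cache.

    Compare this answer with socket.gethostbyname(domain): if they
    differ, your machine is likely still holding the stale
    pre-failover IP.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_a_query(domain, random.randint(0, 0xFFFF)), (server, 53))
        data, _ = sock.recvfrom(512)
    finally:
        sock.close()
    idx = 12                      # skip the 12-byte header
    while data[idx] != 0:         # skip the echoed QNAME labels
        idx += data[idx] + 1
    idx += 5                      # null byte + QTYPE + QCLASS
    idx += 10                     # name pointer + type + class + TTL
    rdlength = struct.unpack(">H", data[idx:idx + 2])[0]
    idx += 2
    return ".".join(str(b) for b in data[idx:idx + rdlength])
```

In practice, `dig @8.8.8.8 yourdomain.com` does the same comparison from the command line; the sketch just shows what's happening underneath.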
OK, think I covered it all so....that's all Folks!
Have a great holiday weekend!
Sha