Server is taking too long to respond - What does this mean?
-
A client has 3 sites that he would like for me to look at. Whenever I attempt to view them on my home internet connection, I get this message:
The connection has timed out
The server is taking too long to respond.
When I take my iPhone off Wi-Fi and use AT&T, the sites come up fine. What is going on here?
-
More than likely it was one of three things: a DNS issue, a peering issue, or a temporary ban.
If you were FTPing into the site and had too many threads open (usually above 4 or 5, but it all depends on the server settings), the server can issue a temporary ban on your IP address. Depending on how the server is set up, you either get an explicit message, which is bad, or you just get a generic error like yours, which is good: it means the server is shedding load.
A DNS issue could mean a name server is down somewhere or having other problems. You generally cannot do anything about this, and such problems are usually fixed quickly, because the sites and information hosted on those servers are vital.
A peering problem, like a DNS issue, is usually spotty, and more than likely that is what was happening here. A peering issue means you cannot reach the "chunk" of the internet that a particular peer routes traffic through. So you can still access, say, 99.9% of everything you want, because that traffic does not go through the peer with the issues.
The best tools you can use to diagnose these problems: Tor, a SOCKS proxy that reroutes your traffic, so you are essentially accessing the site through another ISP, one that may not be having peering or DNS issues with the hosting ISP. Also, http://www.whatsmydns.net/ will show you what different DNS servers around the world are returning for your domain, which will let you know if a major DNS server is having an issue. For general checking you can use http://www.downforeveryoneorjustme.com/ as well.
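If you want to narrow it down yourself, a quick script can also separate a DNS failure from a connection failure. Here is a minimal sketch in Python (my own illustration, not one of the tools above; example.com stands in for the client's actual domain):

```python
# Minimal diagnostic sketch: distinguishes "name won't resolve" (DNS)
# from "name resolves but connection fails" (peering issue or IP ban).
# The hostname is a placeholder for the client's actual domain.
import socket

def diagnose(host, port=80, timeout=10):
    # Step 1: DNS resolution. A failure here points at a DNS problem.
    try:
        ip = socket.gethostbyname(host)
        print(f"DNS OK: {host} -> {ip}")
    except socket.gaierror as err:
        print(f"DNS failure for {host}: {err}")
        return

    # Step 2: TCP connection. A timeout here, with DNS working, suggests
    # a routing/peering problem or a temporary IP ban.
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            print(f"TCP connect to {ip}:{port} OK")
    except socket.timeout:
        print(f"Connection to {ip}:{port} timed out")
    except OSError as err:
        print(f"Connection to {ip}:{port} failed: {err}")

diagnose("example.com")
```

If DNS resolves but the connection times out from your home ISP while it succeeds from AT&T's network, that points at a peering issue or an IP-level block rather than DNS.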
-
Check with your client's IT folks or hosting service. I think this is an outside chance, but if you have been running spiders from your home computer to check the site, you may have been hitting it too hard, slowing the site down, and the server may be blocking your IP because you are seen as a spammer. That would explain why you are golden when you change ISPs: you are seen as a different "user".
I took down one of our sites once with a spidering tool. They were pushing new code right when I hit the site, and the number of requests per second that I thought was OK turned out to land during peak traffic time. (DOH!)
I adjusted my crawl rate down and everything was OK. Again, this is just a guess, but it is worth checking given your symptoms.
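For what it's worth, the fix was nothing fancier than putting a delay between requests. A minimal sketch of that kind of throttle (Python standard library; the URLs and the two-second delay are placeholders, not from my actual setup):

```python
# Throttled fetch loop: one request per delay interval, so the crawl
# never looks like a flood to the server. URLs and delay are hypothetical.
import time
import urllib.request

urls = ["http://example.com/", "http://example.com/about"]
delay_seconds = 2  # raise this during the site's peak traffic hours

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, resp.status)
    except Exception as err:
        print(url, "failed:", err)
    time.sleep(delay_seconds)  # the actual throttle
```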
Good luck!
-
Yeah they all work for me too.
So this remains one of the weirder topics on here, but for different reasons than I first suspected... I'm really not sure what to tell you. Sorry.
-
They all work for me; the topsmagic site takes a while to load, though.
-
That's weird. What are the domains? Let's see if I can access them.
-
Wait, are you saying this is just for your client's sites? You can access other sites just fine? Is that how you posted this question?
Sorry, I'm confused.
-
My internet is working fine. I'm on moz.org right now using my internet connection. It's only when I attempt to visit those 3 websites that I get the error.
-
Your internet and/or router is down? Yeah, I'd power-cycle the router and modem and try again, or contact your cable company.
No offense, but this is one of the weirdest Q&A posts I've seen here. I'm having a weird morning, though, so it totally fits.
Related Questions
-
An immediate and long-term plan for expired Events?
Hello all, I've spent the past day scouring guides, walkthroughs, advice, and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowdsource some advice in case I might be way off base. I'll start by saying that technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):

PROBLEM: I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a little bit of index bloat.

THINGS TO CONSIDER: A spot check revealed that traffic for each Event occurs for about a two-to-four-week period, then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page, so the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.

QUESTIONS I'M ASKING: How do we address all these old events that provide no real value to the user? What should a future process look like to prevent this from happening?

MY SOLUTION:
Step 1: Add a noindex to each of the currently expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea, since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired (a sketch of what this could look like is below).
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.

Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go. Thanks. Eager to hear all your thoughts.
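For what it's worth, the "rule" in Step 3 could be as simple as a date check in the template layer. A hedged sketch in Python (the helper and field names are hypothetical; the client's developers would adapt it to their actual stack):

```python
# Hypothetical helper for Step 3: emit noindex once an Event has expired.
# "noindex, follow" drops the page from the index on recrawl while still
# letting any link equity flow through the page's outbound links.
from datetime import date

def robots_meta_for_event(event_end_date: date) -> str:
    if event_end_date < date.today():
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

# Example: an event that ended in the past now emits noindex.
print(robots_meta_for_event(date(2019, 1, 1)))
```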
Technical SEO | Alces
-
Do long UTM codes hurt SEO?
Since most UTM codes/URLs are longer than 70ish characters, is this hurting my SEO? If it is, how can I solve the problem while still using a UTM code? Thanks!
Technical SEO | Cassie_Ransom
-
"5XX (Server Error)" - How can I fix this?
Hey Mozers! Moz Crawl tells me I am having an issue with my WordPress category page - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?

Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues: 1 x 5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
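One way to see what the server is actually returning for that URL is a quick status check; run it a few times, since errors like 503 are often intermittent. A minimal sketch using Python's standard library:

```python
# Status check for the URL flagged in the crawl report.
import urllib.request
import urllib.error

url = "http://www.refusedcarfinance.com/news/category/news"
try:
    with urllib.request.urlopen(url, timeout=15) as resp:
        print(resp.status, resp.reason)
except urllib.error.HTTPError as err:
    # A 5XX lands here: the request was valid but the server failed it.
    print(err.code, err.reason)
except urllib.error.URLError as err:
    print("Could not reach server:", err.reason)
```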
Technical SEO | RocketStats
-
How Long To Recover Rankings After Multi-Day Site Outage?
Hi, A site we look after for a client was down for almost 3 days at the start of this month (11th - 14th of May, to be exact). This was caused by my client's failure to verify their domain name in accordance with the new ICANN procedures. The details are unimportant, but it took a long while for them to get their domain name registration contact details validated, hence the outage.

Very soon after this downtime we noticed that the site had slipped back in the Google rankings for most of the target keywords, sometimes quite considerably. I guess this is Google penalizing this client for their failure to keep their site live. (And they really can't have too many complaints about this, in my opinion.) The good news is that the rankings show signs of improving again slightly. However, they have not recovered all the way to where they were before the outage, two weeks ago.

My question is this: do you expect that the site will naturally regain its previous excellent rankings without us doing anything? If so, how long do you estimate this could take? On the other hand, if Google typically penalizes this kind of error 'permanently', is there anything we can do to signal to Google that the site deserves to get back up to where it used to be? I am keen to get your thoughts, and especially to hear from anyone who has faced a similar problem in the past. Thanks
Technical SEO | smaavie
-
Rel="Follow"? What the &#@? does that mean?
I've written a guest blog post for a site. In the link back to my site they've put a rel="follow" attribute. Is that valid HTML? I've Googled it but the answers are inconclusive, to say the least.
Technical SEO | Jeepster
-
Hosting sitemap on another server
I was looking into XML sitemap generators, and one that seems to be recommended quite a bit on the forums is xml-sitemaps.com. They have a few versions, though. I'll need more than 500 pages indexed, so it is just a case of whether I go for their paid version and install it on our server, or go for their pro-sitemaps.com offering. For pro-sitemaps.com they say: "We host your sitemap files on our server and ping search engines automatically." My question is: will this be less effective from an SEO perspective than installing it on our own server, because the sitemap is no longer on our root domain?
Technical SEO | design_man
-
How long should I keep 301 redirects?
I have modified the URL structure of a whole section of a website and used mod_rewrite 301 redirects to map the old structure to the new one. That was around 3 months ago, and I was wondering: how long should I keep these redirects? As it is a new website, I am quite sure there are no links around with the old URL structure, but I can still see Googlebot trying from time to time to access the old URLs. Shouldn't Googlebot learn from the 301 redirects and stop requesting the old URLs?
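Before retiring the rules, it is worth confirming the old URLs still return a 301 with the right Location header. A hedged sketch (Python standard library, which does not follow redirects at this level; the paths are placeholders for your rewritten section):

```python
# Check that an old-structure URL still 301s to the new structure.
# http.client does not follow redirects, so we can inspect the response.
import http.client
from urllib.parse import urlparse

old_url = "http://example.com/old-structure/page"  # hypothetical old URL
parts = urlparse(old_url)
conn = http.client.HTTPConnection(parts.netloc, timeout=10)
conn.request("GET", parts.path)
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # expect 301 + the new URL
conn.close()
```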
Technical SEO | socialtowards
-
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
We utilize a dedicated server to host roughly 60 sites. The server is with a company that utilizes a lady who drives race cars....

About 4 months ago, monitoring alerts told us we had a group of sites down, and we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support, at first we were stonewalled, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems.

As a part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month.

Yesterday, we noticed one group of sites on our server was down, and again it was one IP address, with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed-tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address, and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.

Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?
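One cheap way to turn the hunch into data: spot-check one site on each IP block, from a couple of different connections, whenever monitoring fires. A hedged sketch (Python; the addresses are placeholders for the server's actual IP groups):

```python
# Reachability spot check across the server's IP groups. If one group
# consistently fails while the others connect, the problem is bound to
# that IP block, not to your ISP. Addresses are hypothetical.
import socket

ip_groups = ["192.0.2.10", "192.0.2.11", "203.0.113.5"]

for ip in ip_groups:
    try:
        with socket.create_connection((ip, 80), timeout=10):
            print(ip, "reachable")
    except OSError as err:
        print(ip, "FAILED:", err)
```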
Technical SEO | RobertFisher