Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Server is taking too long to respond - What does this mean?
-
A client has 3 sites that he would like for me to look at. Whenever I attempt to view them on my home internet I get this message:
The connection has timed out
The server is taking too long to respond.
When I take my iPhone off wifi and use AT&T, the sites come up fine. What is going on here?
-
More than likely it was one of three things: a DNS issue, a peering issue, or a temporary ban.
If you were FTPing into the site with too many threads open - usually more than 4 or 5, though it all depends on the server settings - the server can issue a temporary ban on your IP address. Depending on how the server is set up, you either get an explicit message, which is bad, or you just get a generic error like yours, which is good: it means the server is shedding load.
A DNS issue could be that a name server is down somewhere or having other problems. You generally cannot do anything about this, and such problems are usually fixed quickly because the sites and information hosted on those servers are vital.
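As a quick first check (an illustration, not something from the thread), you can ask the local resolver whether a hostname resolves at all using nothing but Python's standard library. If this fails for the client's domains while other domains resolve fine, DNS is the likely culprit:

```python
import socket

def resolves(hostname: str) -> bool:
    """True if the local resolver returns at least one address for hostname."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        # Resolution failed: NXDOMAIN, resolver unreachable, etc.
        return False
```

Running this from home and from another network (e.g. the AT&T connection) shows whether only one resolver is failing.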
A peering problem, like a DNS issue, is usually spotty, and more than likely that is what was happening here. A peering issue means you cannot reach the "chunk" of the internet that a particular peer directs traffic through - so you can still access, say, 99.9% of everything you want, because that traffic does not go through the peer with the issues.
The best tools for diagnosing these problems: Tor is a SOCKS proxy that reroutes your traffic, so you will essentially be accessing the site through another ISP - one that may not be having peering or DNS issues with the hosting ISP. You can also use http://www.whatsmydns.net/, which shows what different DNS servers around the world are returning and will tell you if a major DNS server is having an issue. For a general check there is also http://www.downforeveryoneorjustme.com/
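A down-for-everyone-style check can also be scripted locally. This sketch (function name and status strings are mine, not from the thread) distinguishes an HTTP-level error from a connection-level failure such as a timeout or DNS problem:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def check_site(url: str, timeout: float = 10.0) -> str:
    """Return a short status: 'up (NNN)', 'http error NNN', or 'unreachable: reason'."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return f"up ({resp.status})"
    except HTTPError as err:
        # Server answered, but with an error status (4xx/5xx)
        return f"http error {err.code}"
    except URLError as err:
        # Never reached the server: DNS failure, refused connection, timeout...
        return f"unreachable: {err.reason}"
```

An "unreachable" result from one network and an "up" result from another points at a DNS or peering problem on the failing path, matching the symptoms described above.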
-
Check with your client's IT folks or hosting service. I think this is an outside chance, but if you have been running spiders from your home computer to check the site, you may have been hitting it too hard, slowing the site down, and the server may be blocking your IP because you look like a spammer. That is why when you change ISPs you are golden: you are seen as a different "user".
I took down one of our sites once with a spidering tool. They were pushing new code right when I hit the site, and the number of requests per second that I thought was OK turned out to be hitting during peak traffic time. (DOH!)
I adjusted my crawl rate down and everything was OK. Again, this is just a guess, but it's worth checking given your symptoms.
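For reference, "adjusting the crawl rate down" amounts to enforcing a minimum delay between successive requests. A minimal sketch of such a limiter (illustrative, not taken from any particular spidering tool):

```python
import time

class RateLimiter:
    """Enforce a minimum interval (in seconds) between successive requests."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self) -> None:
        """Sleep just long enough that calls are at least min_interval apart."""
        now = time.monotonic()
        sleep_for = self.min_interval - (now - self._last)
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()
```

Calling `limiter.wait()` before each fetch caps the crawler at roughly 1/min_interval requests per second, which keeps it from overwhelming a server during peak traffic.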
Good luck!
-
Yeah they all work for me too.
So this remains one of the weirder topics on here, but for different reasons than I first suspected... I'm really not sure what to tell you. Sorry.
-
They all work for me.
The topsmagic site takes a while to load, though.
-
That's weird. What are the domains? Let's see if I can access them.
-
Wait, are you saying this is just for your client's sites? You can access other sites just fine? That's how you posted this question?
Sorry, I'm confused.
-
My internet is working fine. I'm on moz.org right now using my internet. It's only when I attempt to visit those 3 websites.
-
Is your internet and/or router acting up? I'd power-cycle the router and modem and try again, or contact your cable company.
No offense, but this is one of the weirdest Q&A posts I've seen here. I'm having a weird morning, though, so it totally fits.
Related Questions
-
An immediate and long-term plan for expired Events?
Hello all, I've spent the past day scouring guides and walkthroughs and advice and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowdsource some advice in case I might be way off base. I'll start by saying that Technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):
PROBLEM
I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a little bit of index bloat.
THINGS TO CONSIDER
A spot check revealed that traffic for each Event occurs for about a two-to-four week period, then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page, so the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.
QUESTIONS I'M ASKING
How do we address all these old events that provide no real value to the user? What should a future process look like to prevent this from happening?
MY SOLUTION
Step 1: Add a noindex to each of the currently-expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired.
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.
Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go. Thanks. Eager to hear all your thoughts.
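The automatic rule in Step 3 is straightforward wherever the templates can see the event's end date. A hedged sketch (the function name and the assumption that you control the page `<head>` are mine, not from the question):

```python
from datetime import date
from typing import Optional

def robots_meta(event_end: date, today: Optional[date] = None) -> str:
    """Emit a robots meta tag for an event page: noindex once the event has expired."""
    today = today or date.today()
    if event_end < today:
        # Expired: drop from the index but keep following links (preserves equity flow)
        return '<meta name="robots" content="noindex,follow">'
    return '<meta name="robots" content="index,follow">'
```

Rendering this tag in every Event template means no manual cleanup is needed as future events expire, which is exactly the automation the plan aims for.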
Technical SEO | Alces
-
Do long UTM codes hurt SEO?
Since most UTM codes/URLs are longer than 70ish characters, is this hurting my SEO? If it is, how can I solve the problem while still using a UTM code? Thanks!
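Long UTM URLs are usually handled by pointing a rel=canonical at the parameter-free URL rather than by shortening the UTM codes themselves. A sketch of deriving that canonical URL with Python's standard library (names are mine, illustrative only):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url: str) -> str:
    """Return the URL with utm_* tracking parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

Emitting `<link rel="canonical" href="...">` with this cleaned URL tells search engines which version to index, regardless of how long the tagged URL gets.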
Technical SEO | Cassie_Ransom
-
How long does Google take to re-index title tags?
Hi, We have carried out changes in our website title tags. However, when I search for these pages on Google, I still see the old title tags in the search results. Is there any way to speed this process up? Thanks
Technical SEO | Kilgray
-
Server Connection Error when using Google Speed Test Insight and GTMetrix
Hi Guys, I recently ran into an issue when testing the load speed of my website (https://solvid.co.uk). Occasionally, Google PageSpeed Insights gives me a server connection error which states: "PageSpeed was unable to connect to the server. Ensure that you are using the correct protocol (http vs https), the page loads in a browser, and is accessible on the public internet." GTmetrix gives me an error as well: "An error occurred fetching the page: HTTPS error: SSl connect attempt failed". All of my redirects seem to be set up correctly, as well as the SSL certificate. I've contacted my hosting provider (GoDaddy); they are saying that everything is fine with the server and the installation. I've also tried different browsers in incognito mode and still get the same error. Until yesterday I hadn't had this problem. I've attached the error screenshots. I would really appreciate your help! Dmytro
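One thing worth checking locally (an illustration, not advice from the thread) is the certificate itself: pull the peer certificate with Python's ssl module and see when it expires, since an expired or misconfigured cert produces exactly this kind of "SSL connect attempt failed" error. The network call requires connectivity; the date parsing can be exercised on its own:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Parse an OpenSSL-style notAfter string, e.g. 'Jun  1 12:00:00 2031 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

def cert_not_after(host: str, port: int = 443) -> str:
    """Fetch the peer certificate's notAfter field (requires network access)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notAfter"]
```

If the handshake in `cert_not_after` itself fails from your machine, the problem is in the TLS setup (protocol versions, chain, SNI) rather than in the page content, which narrows down what to push back to the host about.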
Technical SEO | solvid
-
Redirecting HTTP to HTTPS - How long does it take Google to re-index the site?
hello Moz, We know that this year Moz changed its domain to moz.com from www.seomoz.org; however, when you type "site:seomoz.org" you can still find old URLs indexed on Google (on page 7 and above). We also changed our site from http://www.example.com to https://www.example.com, and Google is indexing both sites even though we did a proper 301 redirection via htaccess. How long would it take Google to refresh the index? Should we just not worry about it? Say we redirected our entire site - what is going to happen to those websites that copied and pasted our content? We have already DMCAed their webpages, but would making our site https mean that their website now looks more original than our site, so that Google assumes we have copied their site? (Google is very slow in responding to our DMCA complaint.) Thank you in advance for your reply.
Technical SEO | joony
-
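For reference, the kind of proper 301 redirection the question describes - forcing every plain-HTTP request to HTTPS in .htaccess - typically looks like this (assuming Apache with mod_rewrite enabled; adapt the host handling to your setup):

```apache
RewriteEngine On
# Send any plain-HTTP request to the HTTPS equivalent with a permanent redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Google treats a sitewide 301 as a signal to transfer indexing to the new URLs, but it re-crawls pages on its own schedule, which is why both versions can appear in the index for a while.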
Do links from the same server have value or not?
Hi Guys, Some time ago one of the SEO experts said to me that if I get links from the same IP address, Google doesn't count them as having much value. For example, I am a web developer and I host all my clients' websites on one server and link them back to me. I'm wondering whether those links have any value when it comes to SEO, or should I consider getting different hosting providers? Regards, Uds
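Whether two sites actually share an IP address is easy to verify before worrying about it. A minimal sketch using Python's standard library (the function name is mine; real client hostnames would go in place of the placeholders):

```python
import socket

def shares_ip(host_a: str, host_b: str) -> bool:
    """True if the two hostnames resolve to at least one common IP address."""
    ips_a = {info[4][0] for info in socket.getaddrinfo(host_a, None)}
    ips_b = {info[4][0] for info in socket.getaddrinfo(host_b, None)}
    return bool(ips_a & ips_b)
```

Note that on shared hosting many unrelated sites sit on one IP anyway, so a shared address by itself proves little about the relationship between the sites.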
Technical SEO | Uds
-
500 Server Error on RSS Feed
Hi there, I am getting multiple 500 errors on my RSS feed. Here is the error:
Title: 500 : Error
Meta Description: Traceback (most recent call last): File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error failure.raiseException() File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException raise self.type, self.value, self.tb Error: 500 Internal Server Error
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
Any ideas as to why this is happening? They are valid feeds.
Technical SEO | mistat2000
-
/index.php in sitemap? take it out?
Hi Everyone, The following was automatically generated at xml-sitemaps.com:
<url><loc>http://www.mydomain.ca/</loc></url>
<url><loc>http://www.mydomain.ca/index.php</loc></url>
Should I get rid of the index.php URL from my sitemap? If so, how do I go about redirecting it in my htaccess? Thank you in advance, Martin
Technical SEO | RogersSEO
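The usual .htaccess approach to the redirect half of this question (a sketch assuming Apache with mod_rewrite; not tested against this particular site) is to 301 direct requests for /index.php back to the root:

```apache
RewriteEngine On
# Only match external requests for /index.php, not internal rewrites,
# so the front controller keeps working
RewriteCond %{THE_REQUEST} \s/index\.php[?\s]
RewriteRule ^index\.php$ / [R=301,L]
```

With the redirect in place, the index.php entry should also be dropped from the sitemap so only the canonical root URL is submitted.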