Is it Okay to have "No Response" pages?
-
Hi all,
I can see some "No Response" pages which give an error message ("Site cannot be reached") or keep loading without ever finishing. I got this list from the Screaming Frog SEO Spider tool. Do we need to fix these, or can we ignore them?
Thanks
-
Yes, definitely fix this issue; sooner or later Google will notice that your website is returning errors. I would check with your website host about speed, optimization, and reliability.
-
Hi,
Definitely fix! If this is true, then some of your actual visitors are also encountering it, which means pretty poor page analytics, and that can be very harmful for SEO.
Thanks!
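Before chasing the host, it can be worth re-checking the flagged URLs yourself, since a crawl can hit transient timeouts. A minimal sketch in Python (stdlib only; the `probe`/`classify` helper names are hypothetical, not part of Screaming Frog or any other tool):

```python
import urllib.request
import urllib.error

def classify(status):
    """Map an HTTP status (or None for no response at all) to a bucket."""
    if status is None:
        return "no-response"          # DNS failure, connection refused, timeout
    return "ok" if 200 <= status < 400 else "http-error"

def probe(url, timeout=10):
    """Re-fetch one URL from the crawl export and return (url, bucket)."""
    try:
        status = urllib.request.urlopen(url, timeout=timeout).status
    except urllib.error.HTTPError as e:
        status = e.code               # the server answered, but with an error code
    except (urllib.error.URLError, TimeoutError):
        status = None                 # no HTTP response was received at all
    return url, classify(status)
```

URLs that still come back "no-response" on a retry are the ones worth raising with the host.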
Related Questions
-
Google not showing the recent cache info: How to know the last cached version of a page?
Hi, We haven't been able to see the last Google-cached version of our homepage since March 29th. Just wondering why this is happening; it affects other websites too. When we make changes to the website, we wait for it to be indexed and cached, so the changes can have a ranking impact. Now we can't check whether the website got indexed with the changes. Is there any other way to check the latest cached version, or the time of the last index? Thanks
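One way to check is to request the cache page directly and read its banner. A hedged sketch (the `webcache.googleusercontent.com` endpoint and the "as it appeared on … GMT" banner wording are assumptions about how Google has historically served cached copies, and may change or be unavailable):

```python
import re

# Assumed cache endpoint; Google may not serve a cached copy for every URL.
CACHE_ENDPOINT = "https://webcache.googleusercontent.com/search?q=cache:"

def cache_url(page_url):
    """Build the URL of the cached copy for a given page."""
    return CACHE_ENDPOINT + page_url

def snapshot_date(banner_html):
    """Pull the snapshot timestamp out of the cache banner text, assuming
    the banner reads '... as it appeared on 8 Apr 2018 10:32:24 GMT.'"""
    m = re.search(r"as it appeared on (.+? GMT)", banner_html)
    return m.group(1) if m else None
```

Fetching `cache_url(...)` and running `snapshot_date` on the response body would give the last-cached time, when a cached copy exists.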
Algorithm Updates | vtmoz
-
Any suggestions why I would rank 1 on google and be on 3rd page for bing/yahoo?
Currently the site I'm working on ranks very well on Google, but when we cross-reference Yahoo and Bing we are basically in the graveyard of keywords (bottom of the 3rd page). Why would that be? Any suggestions or things I can do to fix or troubleshoot this? Here are some things I can think of that might affect it, but I'm not sure:
1. Our sitemap hasn't been updated in months and URL changes have been made.
2. On-site factors for Yahoo and Bing differ from Google?
3. Bing is just terrible in general?
4. Inbound links? This one doesn't make sense, though, unless the search engines weigh links in different ways.
All jokes aside, I would really appreciate any help: the few top-ranked keywords we have currently drive about 30% of our organic traffic, and it would have a huge effect on the company if we could rank as we should across all platforms. Thanks!
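Point 1 (the stale sitemap) is the cheapest thing to rule out: resubmit the updated sitemap to Bing. A minimal sketch, assuming Bing's historical sitemap-ping endpoint is still accepted (submitting via Bing Webmaster Tools is the surer route):

```python
import urllib.parse
import urllib.request

PING = "https://www.bing.com/ping?sitemap="

def ping_url(sitemap_url):
    """Build the Bing sitemap-ping URL; the sitemap URL must be percent-encoded."""
    return PING + urllib.parse.quote(sitemap_url, safe="")

def ping_bing(sitemap_url, timeout=10):
    """Tell Bing the sitemap changed; returns True on HTTP 200."""
    with urllib.request.urlopen(ping_url(sitemap_url), timeout=timeout) as resp:
        return resp.status == 200
```

If old URLs changed, the sitemap should list only the new ones before pinging.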
Algorithm Updates | JemJemCertified
-
Does it matter? 404 vs. 302 > Page Not Found
Hey Mozers, What are your thoughts on this situation I'm stuck in? All inputs welcome 🙂 I am in the middle of a massive domain migration to a new server, and we are also moving to a very clean, SEO-friendly URL structure. While parsing and cleaning up some old URLs, I stumbled upon a strange situation on my website. I have a bunch of "dead pages" that are 302-redirected to a "page not found" page, probably an old mistake by one of the past developers. (To clarify, the HTTP status code is not 404.) Should I fight to get all these "dead pages" returning a 404 status code, or could I just leave the temporary 302 redirect to "page not found", even though I know for a fact these pages are not going to be turned on again?
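It's easy to verify what each dead URL actually returns by fetching it without following redirects, so the raw 3xx is visible. A minimal Python sketch (the `verdict` triage labels are hypothetical, not an official classification):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the raw 3xx status is surfaced."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # urllib then raises HTTPError carrying the 3xx code

def raw_status(url, timeout=10):
    """Return the first status code the server sends for this URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=timeout).status
    except urllib.error.HTTPError as e:
        return e.code

def verdict(status):
    """Hypothetical triage for a URL that is known to be permanently dead."""
    if status == 404:
        return "correct"     # a real 404: crawlers will eventually drop it
    if 300 <= status < 400:
        return "fix"         # soft 404: change the redirect to a real 404
    return "check"           # anything else needs a closer look
```

Running `verdict(raw_status(url))` over the list of dead pages would show exactly which ones still need the 302-to-404 fix.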
Algorithm Updates | rpaiva
-
How long does Google take to re-rank pages in results?
I mean that when Google "dances", pages in the results go up and down frequently, even minute to minute, but eventually a page settles at some position. How long does it take before a page gets its new, stable position in Google?
Algorithm Updates | engtamous
-
How could Google define "low quality experience merchants"?
Matt Cutts mentioned at SXSW that Google wants to take into consideration the quality of the experience ecommerce merchants provide and work this into how they rank in SERPs. Here's what he said if you missed it: "We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don't want low quality experience merchants to be ranking in the search results." My question: how exactly could Google decide whether a merchant provides a low- or high-quality experience? I would imagine it would be very easy for Google to decide this for merchants in their Trusted Store program. I wonder what other data sets Google could realistically rely upon to make such a judgment. Any ideas or thoughts are appreciated.
Algorithm Updates | BrianSaxon
-
Not necessary to have keywords in the page? Do you agree?
I am being told by my SEO consultants that: "According to the present Google algorithm, it is not necessary to have keywords in the page. What is more important is that the content is relevant to the page, and whether visitors will stay on that page or not. If visitors stay for a longer time on your site, it adds a bonus to the ranking of the site. So I think it is not necessary to add key phrases in the content." Do you agree?
Algorithm Updates | PegCorwin
-
Too Many On-Page Links
After running a site analysis on here, it came up saying that I have a lot of pages with too many on-page links, and that this might be why the site is being penalized. The thing is, I am not sure how to remedy this. One page that it says has 116 links is this one: http://www.whosjack.org/10-films-with-some-crazy-bitches/ although there is only one link in the body. Then again, our home page, http://www.whosjack.org, has 165, which it again says is too many. Surely it doesn't count links all over the page? Otherwise every news homepage would be penalized. For example, what would happen here on this home page: http://www.dazeddigital.com/ ? Can anyone help me see what I am missing? Are there possibly hidden links anywhere I should be looking for, etc.? Thanks
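Audit tools generally count every anchor in the rendered HTML, so navigation, sidebar, and footer links all count, not just links in the article body. A small stdlib sketch of that counting logic (the `LinkCounter`/`count_links` names are hypothetical, not how any particular tool is implemented):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect every <a> tag that has an href, anywhere in the page:
    templates, nav menus, footers and sidebars all count."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return len(parser.links)
```

Running this over the homepage source (or just viewing source and counting `<a href`) would show where the 165 comes from; "one link in the body" plus a large nav and footer easily adds up.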
Algorithm Updates | luwhosjack
-
Rel="alternate" hreflang="x" or Unique Content?
Hi All, I have 3 sites: brand.com, brand.co.uk and brand.ca. They all have the same content with very minor changes. What's best practice: to use rel="alternate" hreflang="x", or to have unique content written for all of them? Just wondering, after Panda, Penguin and the rest of the zoo, what the best way is to run multinational sites and achieve top positions for each of them in their individual countries. If you think it would be better to have unique content for each of them, please let us know your reasons. Thanks!
Algorithm Updates | Tug-Agency