Fetch as Googlebot "Unreachable Page"
-
Hi,
We are suddenly getting an "Unreachable Page" error whenever any page of our site is fetched as Googlebot from Webmaster Tools. No DNS errors are shown under "Crawl Errors".
We have two web servers, web1 and web2, behind a software load balancer (HAProxy). This network configuration has been running for over a year, and we never had any Googlebot errors before the 21st of this month.
To rule out a problem with the sitemap, .htaccess, or robots.txt, we took the load balancer out of the path by pointing DNS directly at web1 and web2; Googlebot was then able to fetch the pages with no errors. But as soon as we pointed DNS back at the load balancer, the "Unreachable Page" error returned.
The website is fully accessible from a browser, and "Crawl Errors" shows no DNS errors either. Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, and even removed the firewall, with no success. Is there any way to get more detail than just the "Unreachable Page" error?
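One way to narrow this down is to replay Googlebot's request yourself against the load balancer and against each backend, then compare the responses. A minimal sketch (the domain and IP addresses below are placeholders; substitute your own):

```shell
#!/bin/sh
# Googlebot's desktop user-agent string.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Placeholder addresses -- replace with your real load balancer / backend IPs.
LB_IP="203.0.113.10"     # HAProxy frontend
WEB1_IP="203.0.113.11"   # web1, direct

# Fetch the homepage as Googlebot via a specific IP and report
# the HTTP status code and total response time.
probe() {
  curl -s -o /dev/null -A "$UA" \
       -w "%{http_code} %{time_total}s\n" \
       --resolve "www.example.com:80:$1" \
       "http://www.example.com/"
}

# Uncomment to run against your own hosts:
# probe "$LB_IP"    # through HAProxy
# probe "$WEB1_IP"  # straight to web1, for comparison
```

If the direct probe succeeds but the one through HAProxy stalls or returns an error, HAProxy's logs and its frontend timeout/ACL settings are the next place to look; Googlebot gives up on slow or hung responses much sooner than a browser does.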
Regards,
shaz
-
It's a glitch. Google knows about it. It's been happening on all the sites we maintain for about 4 days now.
Heard this morning that they are aware of it and it will be fixed soon.
-
Hi Shaz,
I have also started noticing this happening on a couple of my accounts. I think it may have something to do with the recent Google update (however, I could be wrong).
Hopefully this is just a glitch from Google. If anyone does have any info, I'd really like to know as well!
Matt.
Related Questions
-
Why Would My Page Have a Higher PA and DA, Links & On-Page Grade & Still Not Rank?
The Search Term is "Alcohol Ink" and our client has a better page authority, domain authority, links to the page, and on-page grade than those in the SERP for spaces 5-10 and we're not even ranked in the top 51+ according to Moz's tracker. The only difference I can see is that our URL doesn't use the exact text like some of the 5-10 do. However, regardless of this, our on-page grade is significantly higher than the rest of them. The one thing I found was that there were two links to the page (that we never asked for) that had a spam score in the low 20's and another in the low 30's. Does anyone have any recommendations on how to maybe get around this? Certainly, a content campaign and linking campaign around this could also help but I'm kind of scratching my head. The client is reputable, with a solid domain age and well recognized in the space so it's not like it's a noob trying to get in out of nowhere.
Intermediate & Advanced SEO | Omnisye
-
How will canonicalizing an https page affect the SERP-ranked http version of that page?
Hey guys, Until recently, my site has been serving traffic over both http and https depending on the user request. Because I only want to serve traffic over https, I've begun redirecting http traffic to https. Reviewing my SEO performance in Moz, I see that for some search terms, an http page shows up on the SERP, and for other search terms, an https page shows. (There aren't really any duplicate pages, just the same pages being served on either http or https.) My question is about canonical tags in this context. Suppose I canonicalize the https version of a page which is already ranked on the SERP as http. Will the link juice from the SERP-ranked http version of that page immediately flow to the now-canonical https version? Will the https version of the page immediately replace the http version on the SERP, with the same ranking? Thank you for your time!
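For context, the http-to-https consolidation described above is usually a sitewide 301 plus an https canonical, pointing the same way. In Apache the redirect half is a short rewrite rule (a sketch, assuming mod_rewrite is enabled; adapt to your server):

```apache
# .htaccess sketch: 301-redirect every http request to its https twin,
# so rankings and link equity consolidate on the https URLs.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

With the redirect and the canonical tag agreeing, Google consolidates signals on the https URL; the SERP listing usually swaps over, though not instantaneously.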
Intermediate & Advanced SEO | JGRLLC
-
Search Console - Best practice to fetch pages when you update them?
Hi guys, If you make changes to a page, e.g. add more content, is it good practice to get Google to fetch that page again in Search Console? My assumption is that this way Google can review the updated page sooner, resulting in faster changes in the SERPs for that page. Thoughts? Cheers.
Intermediate & Advanced SEO | wozniak65
-
Indexed page counts differ when I perform a "site:" search in Google - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from the spiders through the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:" search for the website in Google, it says it is indexing 2.2 million pages on the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicates" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 of them? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
How much does "Sub-domain SEO optimisation" improve website ranking?
Let's say there is a website (domain) and a couple of sub-domains (around 6). Suppose we optimise all the sub-domains for the "keyword" we want our website to rank for, e.g. using the "keyword" across the sub-domains' page titles and in other places where it looks natural as a brand mention. Will this help the website rank better for that "keyword"? How much can these sub-domains really influence the main website's rankings? And if the sub-domains have broken links, will that hurt the website's SEO efforts?
Intermediate & Advanced SEO | vtmoz
-
72KB of CSS code directly in the page header (not in an external CSS file). Done for faster "above the fold" loading. Any problem with this?
To optimise for Google's PageSpeed, our developer has moved 72KB of CSS directly into the page header (not in an external CSS file). This reduced the above-the-fold loading time. But could this affect indexing of the page, or have any other negative side effects on rankings? I made a quick test and Google's cache seems to have our full pages cached, but might it somehow hurt our rankings, or cause Google to index fewer of our pages? (We already have a problem with Google ignoring about 30% of the pages in our sitemap.)
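A common middle ground is to inline only the critical above-the-fold rules and load the full stylesheet asynchronously, rather than inlining all 72KB into every response. One widely used pattern (the stylesheet path is a placeholder):

```html
<head>
  <!-- Inline only the critical above-the-fold rules; keep this block small -->
  <style>
    /* critical CSS here */
  </style>
  <!-- Load the full stylesheet without blocking first render -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

Inlined CSS is not an indexing problem in itself, but it adds weight to every HTML response because, unlike an external file, it cannot be cached separately by the browser.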
Intermediate & Advanced SEO | lcourse
-
Is it dangerous to use "Fetch as Google" too much in Webmaster Tools?
I saw some people freaking out about this on some forums and thought I would ask. Are you aware of any downside to using "Fetch as Google" often? Is it a bad thing to do when you create a new page or blog post, for example?
Intermediate & Advanced SEO | BlueLinkERP
-
Canonical URL's - Do they need to be on the "pointed at" page?
My understanding is that they are only required on the "pointing pages" however I've recently heard otherwise.
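For what it's worth, the rel=canonical on the "pointing" page is what does the work, but a self-referencing canonical on the "pointed-at" page is harmless and widely recommended, since it protects the page against parameter and tracking-URL duplicates. A sketch with a hypothetical URL:

```html
<!-- On a duplicate ("pointing") page, point at the preferred URL: -->
<link rel="canonical" href="https://www.example.com/widgets/">

<!-- On the preferred ("pointed-at") page, a self-referencing canonical: -->
<link rel="canonical" href="https://www.example.com/widgets/">
```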
Intermediate & Advanced SEO | DPSSeomonkey