Fetch as Googlebot: "Unreachable Page"
-
Hi,
We are suddenly getting an "Unreachable Page" error whenever any page of our site is fetched as Googlebot from Webmaster Tools. No DNS errors are shown in "Crawl Errors".
We have two web servers, web1 and web2, behind a software load balancer (HAProxy). The same network configuration has been working for over a year and never produced any Googlebot errors before the 21st of this month.
We tried to rule out errors in the sitemap, .htaccess, and robots.txt by taking the load balancer out of the path and pointing DNS directly at web1 and web2; Googlebot was able to access the pages properly with no errors. But as soon as the load balancer was made active again by pointing DNS back at it, the "Unreachable Page" error reappeared.
The website is fully accessible from a browser. Can you guide me on how to diagnose this? I've tried all sorts of combinations, even removing the firewall, with no success. Is there any way to get more detail than just the "Unreachable Page" message?
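For reference, here is roughly how we have been comparing the three paths from outside the network (a minimal sketch: the IP addresses and hostname are placeholders, and the user-agent string is only an approximation of what Googlebot actually sends):

```python
import requests

# Placeholder addresses: the HAProxy front end and the two backends.
PATHS = {
    "haproxy": "http://203.0.113.10/",
    "web1": "http://203.0.113.11/",
    "web2": "http://203.0.113.12/",
}

# Approximation of Googlebot's user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

for name, url in PATHS.items():
    try:
        r = requests.get(
            url,
            headers={"Host": "www.example.com",  # placeholder site hostname
                     "User-Agent": GOOGLEBOT_UA},
            timeout=10,
            allow_redirects=False,
        )
        print(f"{name}: HTTP {r.status_code}, {len(r.content)} bytes")
    except requests.RequestException as exc:
        print(f"{name}: FAILED ({exc})")
```

If the HAProxy line failed here while web1 and web2 succeeded, that would at least point at the balancer configuration rather than at Google.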
Regards,
shaz
-
It's a glitch, and Google knows about it. It's been happening on all the sites we maintain for about 4 days now.
Heard this morning that they are aware of it and it will be fixed soon.
-
Hi Shaz,
I have also started noticing this happening across a couple of my accounts. I think it may have something to do with the recent Google update (however, I could be wrong).
Hopefully this is just a glitch from Google. If anyone does have any info, I'd really like to know as well!
Matt.
Related Questions
-
Home page vs inner page?
Do you believe that the advantage of targeting a search term on the home page is now weaker than it used to be? As I understand it, CTR is a big ranking factor now, and as far as I can see, if two pages are otherwise equal on-page, the one with the better CTR will win out. The issue with the home page is that SERP stars cannot be used on it, so the CTR on a product page will be higher. I feel that even if you were able to get a home page ranking faster (one year instead of two), you would still lose out in the end because the product page wins on CTR. Do you think this is correct?
Intermediate & Advanced SEO | BobAnderson
-
Is Google seeing "all" of my homepage?
Hello all 🙂 Since launching my new website design at www.advanced-driving.co.uk, I am not convinced Google is seeing all the content on the page. I took a long extract of text and searched for it on Google, and nothing was found. Also, although I can see the new title tag in the search results for "advanced driving course", the snippet isn't showing. Is there any way I can check this? As I scroll down, I can see the URL change, i.e.:
www.advanced-driving.co.uk
then:
http://www.advanced-driving.co.uk/#da-page_in_widget-3
then:
http://www.advanced-driving.co.uk/#da-page_in_widget-4
then:
http://www.advanced-driving.co.uk/#da-page_in_widget-5
Is this right? Thanks in advance.
Intermediate & Advanced SEO | robert78
-
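A quick way to check whether that extract of text is in the HTML Google actually receives, rather than being injected by the theme's JavaScript, is to fetch the page without a browser and search the raw source. A minimal sketch; the snippet string is a placeholder to replace with an exact phrase from the page:

```python
import requests

URL = "http://www.advanced-driving.co.uk/"
SNIPPET = "an exact phrase copied from the homepage"  # placeholder

# Fetch the raw HTML the way a crawler would, without running JavaScript.
html = requests.get(URL, timeout=10).text

if SNIPPET in html:
    print("Snippet is present in the raw HTML, so it is crawlable.")
else:
    print("Snippet is missing from the raw HTML; it is probably added by "
          "JavaScript, and #da-page_in_widget-N fragment URLs are not "
          "crawled as separate pages.")
```
-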
Our client's web property recently switched over to secure pages (HTTPS); however, their non-secure pages (HTTP) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to HTTPS via a new SSL certificate. They have also implemented rel=canonical tags on most of their internal pages (pointing to the https versions). However, many of their non-secure pages are still being indexed by Google. We have access to their GWMT profiles for both the secure and non-secure versions. Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure if this is going to happen. We thought about requesting in GWMT that Google remove the non-secure pages, but that felt pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
-
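If the 301 redirects do go in, one way to confirm they behave as intended is to request a sample of the old HTTP URLs and check that each answers with a 301 pointing at its HTTPS twin. A minimal sketch; the URLs are placeholders:

```python
import requests

# Placeholder sample of the client's old non-secure URLs.
OLD_URLS = [
    "http://www.example.com/",
    "http://www.example.com/about/",
    "http://www.example.com/products/widget/",
]

for url in OLD_URLS:
    r = requests.get(url, allow_redirects=False, timeout=10)
    expected = url.replace("http://", "https://", 1)
    target = r.headers.get("Location", "")
    status = "OK" if (r.status_code == 301 and target == expected) else "CHECK"
    print(f"{status}: {url} -> {r.status_code} {target}")
```
-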
Your typical blog disclosure: "We received a free product but are not financially compensated."
Good afternoon and happy Friday! I've run into the following disclosure multiple times on different blogs. It seems to me that it would be a red flag and counterproductive for both the blogger and the brand sending the samples, since "free samples" fall under Google's link scheme guidelines. Am I correct? What are your thoughts on bloggers using this disclaimer, in regards to SEO? Disclosure: "Some of these products were samples provided to me to try. Opinions and the choice to review are 100% my own! I was not financially compensated for writing this blog post. This post contains affiliate links."
Intermediate & Advanced SEO | 90miLLA
-
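On the link-scheme point: the usual remedy is for the blogger to add rel="nofollow" to affiliate and sample-prompted links, not to drop the disclosure. A minimal sketch of how one might spot-check a review post for followed affiliate links, assuming the affiliate URLs share a recognizable substring (the URL and marker below are placeholders):

```python
import requests
from bs4 import BeautifulSoup

POST_URL = "https://example-blog.com/review-post/"  # placeholder
AFFILIATE_MARKER = "affiliate-network.example"  # placeholder substring

soup = BeautifulSoup(requests.get(POST_URL, timeout=10).text, "html.parser")

for link in soup.find_all("a", href=True):
    if AFFILIATE_MARKER in link["href"]:
        rel = link.get("rel") or []  # BeautifulSoup returns rel as a list
        if "nofollow" not in rel:
            print("Followed affiliate link (a link-scheme risk):", link["href"])
```
-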
Given the new image mismatch penalty, is watermarking considered "cloaking"?
Google has released a new penalty called "image mismatch", which penalizes sites that show Google images that are not the same as the ones served to users who visit the site. I agree with penalizing sites where the image is completely different from the one shown in image search, but lately I've seen lots of big sites using some kind of watermark or overlay that reads something like "To see the high quality version of this image, click here" in order to "force" the user to visit the site hosting the image. Considering the latest changes to Google's image search, which cost many sites their image search traffic, are these techniques covered by the new penalty? Or does it only apply to the first scenario, where the image is completely different? You can read more on this new penalty here.
Intermediate & Advanced SEO | FedeEinhorn
-
Our quilting site was hit by Panda/Penguin...should we start a second "traffic" site?
I built a website for my wife, who is a quilter, called LearnHowToMakeQuilts.com. However, it has been hit by Panda or Penguin (I'm not quite sure which), and I am scared to tell her to go ahead and keep building the site up. She really wants to post on her blog on LearnHowToMakeQuilts.com, but I'm afraid it will be in vain as far as Google's search engine is concerned; Yahoo and Bing still rank the site well. I don't want her to produce good content that will never rank if the whole site is penalized in some way. I over-optimized by linking heavily with the anchor text "how to make a quilt", our main keyword, mostly to the home page, and I think that is one of the main reasons we are incurring some kind of penalty.
First main question: From looking at the attached Google Analytics image, does anyone know whether it was Panda or Penguin that hit us? And what can be done about it? (We originally wanted to build a nice content website, but were lured by a get-rich-quick personality into making the home page a "squeeze page" and forcing all visitors through it to reach the really good content. Thus, our average time on site per person is terrible and pages per visit is low, at 1.2. We really want to improve that some day. She also has a local business website, CustomCareQuilts.com, which did not get hit.)
Second question: Should we start a second site rather than invest the time in trying to repair the damage from my bad link building and article marketing? We do need to keep the site up and running because it hosts her online quilting course for beginners making their first quilt. We host the videos through Amazon S3 and were selling at least one course every other day, but since the Google drop we are lucky to sell one course per month. If we start a second site, we could build it as a big content site that introduces people to LearnHowToMakeQuilts.com, which has Martha's quilting course. So, should we start a new, fresh site rather than repair the damage done by my over-optimizing? (We've already picked out a website name that would work really well with her personal Facebook page.) Or, a second option: should we use her local business website, CustomCareQuilts.com, instead? She created it in 2003 and has had it ever since, but it is only PR 1. Would this be an option?
Anyway, I'm looking for guidance on whether we should repair the damage, start a second fresh site, or use an existing site to create new content (to bring in new quilters who will eventually purchase her course). Brad & Martha Novacek
Intermediate & Advanced SEO | BradNovi
-
Pricing Page vs. No Pricing Page
There are many SEO sites out there that have an SEO pricing page; IMO, this is BS. An SEO company cannot give every client the same quote across different keywords. However, this is something we are currently debating. I don't want a pricing page, because it would be a page full of lies. My coworker thinks it is a good idea and that users look for a pricing page. Suggestions? If I had to build one (which I am arguing against), is it better to just explain why pricing can be tricky, or to BS visitors like most sites do?
Intermediate & Advanced SEO | SEODinosaur
-
Why are new pages not being indexed, while old pages (now blocked in robots.txt) remain in the index?
I have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week (I know Google has recrawled the site), and when I search for term X, it is still the old page that ranks, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to robots.txt a few months ago, yet I think they are all still appearing in the index. Does anyone have any ideas about why this is happening, and how I can get my new pages indexed?
Intermediate & Advanced SEO | corp0803
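One note on the robots.txt part of this: a page that is blocked in robots.txt can stay in the index, because Google is no longer allowed to fetch it and see that anything changed. The usual fix is to unblock the old URLs and let them 301 to the new ones (or serve a noindex meta tag). A minimal sketch of how to confirm what Googlebot may crawl, using placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at the live robots.txt.
rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()

OLD_URL = "http://www.example.com/old-page"  # blocked -> stays indexed, stale
NEW_URL = "http://www.example.com/new-page"  # must be crawlable to be indexed

for url in (OLD_URL, NEW_URL):
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```

Once the old URLs are crawlable again and redirecting, the stale entries should drop out on their own over the following crawls.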