Fetch as Google issues
-
Hi all,
Recently, well, a couple of months back, I finally got around to switching our sites over to HTTPS. In terms of rankings, all looks fine and we have not moved about much, only the usual fluctuations of a place or two on a daily basis in a competitive niche. All links have been updated, redirects are in place, the usual HTTPS domain migration stuff.
I am, however, troubled by one thing!
I cannot for love nor money get Google to fetch my site in GSC. No matter what I have tried, it continues to display "Temporarily unreachable". I have checked the robots.txt, and the site is on a new https:// profile in GSC.
Has anyone got a clue? I am stumped! Have I simply become blinded by looking too much? Site in question: caravanguard co uk.
Cheers and looking forward to your comments....
Tim
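For anyone hitting the same "Temporarily unreachable" result, it is worth confirming the basics from outside GSC first. A minimal sketch (Node 18+ with its built-in fetch, run as an ES module; the URL is an assumption based on the site named in the question):

```typescript
// Check that robots.txt and the homepage both answer quickly over HTTPS.
// "Temporarily unreachable" often just means Googlebot timed out or backed off.
const base = "https://www.caravanguard.co.uk"; // assumed from the question

for (const path of ["/robots.txt", "/"]) {
  const started = Date.now();
  const res = await fetch(base + path, { redirect: "follow" });
  console.log(path, res.status, `${Date.now() - started}ms`);
}
```

If robots.txt is slow or erroring, Googlebot may refuse to fetch anything else, which would produce exactly this symptom.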
-
It would appear that GSC has pretty much resolved itself. It may have simply been a glitch at the time.
-
Hey Tim, if DNS is working and there is nothing else to be done but wait for Google to get it fixed, just keep waiting, and one day you'll see a green sign when it starts working again. My site showed a red alert, then a yellow sign saying DNS error; I checked my DNS and everything was working.
-
From what I can gather the DNS is fine; our A records are only pointing to our website. If it was DNS, what would be missing?
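For reference, the A-record check described here can be scripted with Node's built-in resolver (hostname assumed from the thread):

```typescript
import { resolve4, resolveCname } from "node:dns/promises";

const host = "www.caravanguard.co.uk"; // assumed from the question

try {
  console.log("A records:", await resolve4(host));
} catch {
  // A www host often points at a CNAME rather than a direct A record.
  console.log("CNAME:", await resolveCname(host));
}
```

If the record resolves here, a "DNS error" flag in GSC is more likely a temporary glitch on Google's side.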
-
Hey Tim, a blocked resource is totally different from a DNS error; that shouldn't cause any issues. Check your DNS and make sure it is working. Again, if it's working, this is just something that should be corrected by Google soon. Have you tried fetching it again?
Thanks!
Antulio
-
SSL is fine and so is our DNS from what I can gather.
I do get one notification for a blocked resource, but this is an external element. Will this cause any issues?
-
Hi Tim,
Firstly, you might want to check whether your certificate is working properly through https://www.sslshopper.com/ssl-checker.html, then check for any DNS trouble through www.who.is. If everything is working, it should be something on Google's end; just keep fetching, giving it another try over the next day or couple of hours. It happened to mine and everything is working now.
Good luck!
Antulio
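The certificate check suggested above can also be scripted with Node's tls module (hostname again assumed), which shows the validity window and the names the certificate actually covers:

```typescript
import * as tls from "node:tls";

const host = "www.caravanguard.co.uk"; // assumed from the question

const socket = tls.connect(443, host, { servername: host }, () => {
  const cert = socket.getPeerCertificate();
  console.log("Subject CN:", cert.subject?.CN);
  console.log("Alt names:", cert.subjectaltname); // must cover the host
  console.log("Valid:", cert.valid_from, "to", cert.valid_to);
  socket.end();
});

// A broken chain or name mismatch surfaces here instead of in the callback.
socket.on("error", (err) => console.error("TLS failure:", err.message));
```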
Related Questions
-
Google crawl drop
The crawl requests for my company site, https://www.dhgate.com/, have dropped by nearly 95%, from 6,463,599 daily requests to 476,493, starting at 12:00am on 9 Oct (GMT+8). This dramatic drop shows not only in our GSC crawl stats report but also in our company's own log report. We have no idea what's going on. We want to know whether there has been a Google update about crawling, or whether this is an issue with our own site. If something is wrong with our site, in what aspects would you recommend we check, analyze, and optimize accordingly?
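One way to verify a drop like this independently of GSC (a hedged sketch, not DHgate's actual pipeline; the log path and combined-log format are assumptions) is to count Googlebot hits per day in the raw access log:

```typescript
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const counts = new Map<string, number>();
const rl = createInterface({ input: createReadStream("access.log") });

rl.on("line", (line) => {
  if (!line.includes("Googlebot")) return;
  const day = line.match(/\[(\d{2}\/\w{3}\/\d{4})/)?.[1]; // e.g. [09/Oct/2023
  if (day) counts.set(day, (counts.get(day) ?? 0) + 1);
});

rl.on("close", () => {
  for (const [day, n] of counts) console.log(day, n);
});
```

A cliff that shows up in both the server log and GSC points at Google genuinely throttling its crawl rather than a reporting problem.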
Technical SEO | | DHgate_20140 -
Fetch as Google temporarily lifting a penalty?
Hi, I was wondering if anyone has seen this behaviour before? I haven't! We have around 20 sites and each one has lost all of its rankings (not in the index at all) since the Medic update, apart from when specifying a location on the end of a keyword. I set to work trying to identify a common issue on each site, and began by improving speed issues in Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds. I did the same for a different site and got exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax. Unfortunately this relief only lasted between 6-12 hours, and then the rankings went again. To me it seems like the sites are all suffering from some kind of on-page penalty which is lifted until the page can be assessed again, and when it is, the penalty is reapplied. Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten a site, reduced overuse of keywords, and added over 2,000 words to the homepage. I clicked "Fetch as Google" and the site came back, for 6 hours... So then I gave the site a completely fresh redesign and again clicked "Fetch as Google", with the same result. Since doing all that, I have swapped over to HTTPS, 301 redirected, etc., and now the site is completely gone and won't come back after fetching as Google. Ugh! So before I dig myself even deeper, has anyone any ideas? Thanks.
Technical SEO | | semcheck11 -
What does Google PageSpeed measure?
What does the PageSpeed tool actually measure? Does it tell you whether a web server is fast or slow? Thanks in advance!
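For context, the PageSpeed Insights v5 API returns the two kinds of data the tool reports, which shows it measures page loading performance rather than raw web-server speed. A minimal sketch (the page URL is a placeholder):

```typescript
const api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
const page = "https://www.example.com/"; // placeholder

const res = await fetch(`${api}?url=${encodeURIComponent(page)}`);
const data = await res.json();

// Lab data: a synthetic Lighthouse run of how the page loads and renders.
console.log("Performance score:", data.lighthouseResult?.categories?.performance?.score);
// Field data: real-user metrics from the Chrome UX Report, where available.
console.log("Field metrics:", data.loadingExperience?.metrics);
```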
Technical SEO | | DanielMulderNL0 -
Disavow Issues
Hi, We have a client who was hit by Penguin about 18 months ago. We disavowed all the bad links about 10 months ago; however, this has not resulted in an uplift in traffic or rankings. The client is asking me whether it would be better to dump the domain and move the website to a fresh domain. Can you provide thoughts / experience on this please? Thanks.
Technical SEO | | EffectiveSEOUK0 -
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two subdomains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them. So the CDN-based content is out-ranking the root domain in a small number of cases. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appears to be contributing to this problem as well. As I understand it, these canonical tags are telling the search engines that each subdomain is the "home" of the content/URL. Thanks! Scott
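Blocking via robots.txt is the usual first step here. A minimal sketch of serving a blocking robots.txt only on the CDN hostnames (the hostnames are placeholders, and a plain Node server stands in for whatever actually fronts the CDN origin):

```typescript
import * as http from "node:http";

const CDN_HOSTS = new Set(["cdn1.example.com", "cdn2.example.com"]); // placeholders

const server = http.createServer((req, res) => {
  const host = (req.headers.host ?? "").split(":")[0];
  if (CDN_HOSTS.has(host) && req.url === "/robots.txt") {
    // Serve a blocking robots.txt on the CDN hostnames only.
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("User-agent: *\nDisallow: /\n");
    return;
  }
  // ...normal handling. Pages should also emit an ABSOLUTE canonical, e.g.
  // <link rel="canonical" href="https://www.example.com/page/" />
  res.writeHead(404);
  res.end();
});

server.listen(8080);
```

Note that robots.txt stops crawling but already-indexed URLs can linger, so switching the relative canonicals to absolute ones pointing at the root domain matters just as much.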
Technical SEO | | Scott-Thomas0 -
Subdomains Issue
Hi, We have created subdomains of our site to target various geos, for example uk.site.com and de.site.com, and all these subdomains have the same content as the main domain. Will it affect our SEO rankings? How can we solve this if it affects our rankings?
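One standard remedy for identical content across geo subdomains is hreflang annotations, so Google treats each subdomain as an alternate of the same content rather than a duplicate. A hedged sketch (the locales and subdomains are assumptions based on the examples in the question):

```typescript
const alternates: Record<string, string> = {
  "en-gb": "https://uk.site.com/",
  "de-de": "https://de.site.com/",
  "x-default": "https://www.site.com/",
};

const tags = Object.entries(alternates)
  .map(([lang, href]) => `<link rel="alternate" hreflang="${lang}" href="${href}" />`)
  .join("\n");

console.log(tags); // these tags belong in the <head> of every variant
```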
Technical SEO | | mikerbrt240 -
Does Google Read Javascript?
I would like to include a list of links in a select-type box which I would like Google to follow. In order to do this, I will be styling it with the help of JavaScript, which in turn changes the select box into a ul and the options into li's. Each li would contain a link, but if JavaScript is disabled it will fall back to a normal CSS-styled select box. My question is: would Google follow the links made by the JavaScript? Or would the bot just recognize the select box as a select box and not links? Thanks for any help!
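For what it's worth, the progressive-enhancement approach described above looks roughly like this (a sketch: the class names are made up, and the option values are assumed to hold the target URLs):

```typescript
function selectToLinkList(select: HTMLSelectElement): void {
  const ul = document.createElement("ul");
  ul.className = "styled-select"; // hypothetical class

  for (const option of Array.from(select.options)) {
    const li = document.createElement("li");
    const a = document.createElement("a");
    a.href = option.value;      // target URL assumed to live in the value
    a.textContent = option.text;
    li.appendChild(a);
    ul.appendChild(li);
  }
  select.replaceWith(ul);       // drop the no-JS fallback once JS has run
}

document.querySelectorAll<HTMLSelectElement>("select.js-nav") // hypothetical hook
  .forEach(selectToLinkList);
```

Since the links only exist after script execution, keeping a crawlable fallback (or plain HTML links elsewhere) is the safe bet.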
Technical SEO | | BrianJenkins0 -
Does Google support ETag?
Hello~ People! I have a question regarding ETags. I know Google supports the If-Modified-Since HTTP header, aka the last-modified header. I used an ETag instead of the last-modified header. It seems like Google does support it, yet here is my question. code.google suggests the following:
GData-Version: 2.0
ETag: "C0QBRXcycSp7ImA9WxRVFUk."
but I used an ETag as follows:
ETag: "10cd712-eaae-b279a480"
I didn't include "GData-Version: 2.0". Does this mean Google may not support my ETag?
Technical SEO | | Artience0
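For reference, the conditional-request mechanics the question is about look like this (a hedged sketch; the URL is a placeholder): the server sends an ETag, the client echoes it back in If-None-Match, and a 304 response means "not modified". The GData-Version header belongs to the Google Data APIs rather than ordinary page crawling.

```typescript
const resource = "https://www.example.com/page"; // placeholder

const first = await fetch(resource);
const etag = first.headers.get("etag"); // e.g. "10cd712-eaae-b279a480"

const second = await fetch(resource, {
  headers: etag ? { "If-None-Match": etag } : {},
});

console.log(second.status); // 304 if unchanged, so the body is not resent
```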