Fetch as Google issues
-
Hi all,
Recently, well, a couple of months back, I finally got around to switching our sites over to HTTPS. In terms of rankings etc., all looks fine and we have not moved about much, just the usual daily fluctuations of a place or two in a competitive niche. All links have been updated and redirects are in place - the usual HTTPS domain migration stuff.
I am, however, troubled by one thing!
I cannot for love nor money get Google to fetch my site in GSC. No matter what I have tried, it continues to display "Temporarily unreachable". I have checked the robots.txt, and the site is set up on a new https:// profile in GSC.
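For reference, the basic reachability check looks something like this (a rough sketch in Python 3, standard library only; the URL is a placeholder, not the actual site):

    import urllib.request

    # Fetch robots.txt over HTTPS with a Googlebot-like user agent
    # (placeholder URL) and print the status and the first bytes.
    req = urllib.request.Request(
        "https://www.example.co.uk/robots.txt",
        headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.status, resp.headers.get("Content-Type"))
        print(resp.read().decode("utf-8", errors="replace")[:500])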
Has anyone got a clue, as I am stumped! Have I simply become blinded by looking too much? Site in question: caravanguard co uk.
Cheers, and looking forward to your comments.
Tim
-
It would appear that GSC has pretty much resolved itself. It may have simply been a glitch at the time.
-
Hey Tim, if DNS is working and there's nothing left to do but wait for Google to fix it, just keep waiting - one day you'll see a green sign when it starts working again. My site showed a red alert, then a yellow sign saying "DNS error"; I checked my DNS and everything was working.
-
From what I can gather, the DNS is fine; our A records only point to our website. If it were DNS, what would be missing?
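For what it's worth, checking what the A records actually resolve to can be scripted in a few lines (a rough sketch with Python's standard library; the hostname is a placeholder):

    import socket

    # Resolve the site's A records (placeholder hostname).
    # socket.gaierror here means DNS is genuinely broken; printed
    # addresses mean the records resolve fine.
    try:
        infos = socket.getaddrinfo("www.example.co.uk", 443, proto=socket.IPPROTO_TCP)
        for family, _type, _proto, _canon, sockaddr in infos:
            if family == socket.AF_INET:
                print("A record ->", sockaddr[0])
    except socket.gaierror as err:
        print("DNS lookup failed:", err)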
-
Hey Tim, a blocked resource is totally different from a DNS error and shouldn't cause any issues. Check your DNS and make sure it's working; if it is, this is just something that should be corrected by Google soon. Have you tried fetching it again?
Thanks!
Antulio
-
SSL is fine and so is our DNS, from what I can gather.
I do get one notification for a blocked resource, but it is an external element. Will this cause any issues?
-
Hi Tim,
Firstly, you might want to check that your certificate is working properly through https://www.sslshopper.com/ssl-checker.html, then check for any DNS trouble through www.who.is. If everything is working, it is probably something on Google's end - just keep trying the fetch again the next day or after a couple of hours. It happened to mine and everything is working now.
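If you would rather script the certificate check than use the web tools, a minimal sketch along these lines (Python standard library; the hostname is a placeholder) prints the validity window a crawler would see:

    import socket
    import ssl

    # Open a TLS connection (placeholder hostname) and print the
    # certificate's validity window. A failed handshake here usually
    # means an incomplete chain or an expired certificate.
    host = "www.example.co.uk"
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print("Valid from:", cert["notBefore"])
            print("Expires on:", cert["notAfter"])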
Good luck!
Antulio
Related Questions
-
Recovering from a Google penalty
Hi there,

So about 3.5 weeks ago I noticed my website (www.authenticstyle.co.uk) had gone from ranking in second place for our main key phrase "web design dorset" to totally dropping off the SERPs for that particular search phrase - it's literally nowhere to be seen. Other pages of my website still rank, just not the homepage.

I then noticed an unread alert in my Google Search Console account saying that a staging site we were hosting on a subdomain (domvs.authenticstyle.co.uk) had hacked content - a couple of PDF files with weird file names. The strange thing is we'd taken this staging site down a few weeks earlier, BUT one of my staff had left an A record set up in our Cloudflare account pointing to that staging server - they'd forgotten to remove it when removing the staging site. I then removed the A record myself and submitted a reconsideration request in Google Search Console (which I still haven't received confirmation of) in the hope of everything sorting itself out.

Since then I've also grabbed a Moz Pro account to try and dig a little deeper, but without any success. We have a few warnings for old 404s, some missing meta descriptions on some pages, and some backlinks that have accumulated over time with highish spam ratings, but nothing major - nothing that would warrant a penalty as far as I can tell.

From what I can make out, we've been issued a penalty on our homepage only, but I don't understand why we would get penalised for hacked content if domvs.authenticstyle.co.uk no longer existed (would it just be due to that erroneous A record we forgot to remove?). I contacted a few freelance SEO experts, and one came back saying I'd done everything correctly and that I should see our site appearing again a few days after submitting the reconsideration request. It's been 3 weeks and nothing.

I'm at a huge loss as to how my site can recover from this. What would you recommend? I even tried getting our homepage to rank for a variation of "web design dorset", but it seems our homepage has been penalised for anything with "dorset" in the key phrase. Any pointers would be HUGELY appreciated.

Thanks in advance!
Will
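As an aside, confirming that an orphaned A record like that is really gone only takes a couple of lines (a rough sketch using Python's standard library; the hostname is the staging subdomain mentioned in the post):

    import socket

    # After deleting the Cloudflare A record, the old staging
    # hostname should stop resolving entirely.
    try:
        _, _, addresses = socket.gethostbyname_ex("domvs.authenticstyle.co.uk")
        print("Still resolving:", addresses)
    except socket.gaierror:
        print("The staging hostname no longer resolves.")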
Technical SEO | | wsmith7270 -
Issues with Magento layered navigation
Hi,

We use Magento v1.7 for our store. We have recently had an SEO audit, and we have uncovered 2 major issues which can be pinpointed to our layered navigation. We use the MANAdev layered navigation module, which has numerous options available to help with SEO. All our filtered URLs seem to be fine, i.e. https://www.tidy-books.co.uk/childrens-bookcases-shelves/colour/natural-finish-with-letters/letters/lowercase has the canonical URL correctly set up and the meta tags as noindex, follow, but Magento is churning out tons of 404 error pages like https://www.tidy-books.co.uk/childrens-bookcases-shelves/show/12/l/colour:24-4-9/letters:6-7, which Google is indexing. I'm at a loss as to how to solve this; any help would be great. Thank you.

This is from our SEO audit report:

The faceted navigation isn't handled correctly and causes two major issues:
● One of the faceted navigation filters causes a 404 error. This means that the error is appended to each sequence of the navigation options, multiplying the faulty URLs.
● The pages created by the faceted nav are all accessible to the search engines. This means that there are hundreds of duplicated category pages created by one of the parameters. The duplication issues can seriously hinder organic visibility.

The number of 404 errors and the duplicated pages created by faceted navigation make it almost impossible for a search engine crawler to finish the crawl. This means that the site might not be fully indexed, and newly introduced product pages or content won't be discovered for a very long time.
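For anyone wanting to reproduce the audit's finding, a quick spot-check of the faceted URLs' status codes and robots headers might look like this (a rough sketch with Python's standard library; the two URLs are the examples from the post):

    import urllib.error
    import urllib.request

    # Spot-check faceted-navigation URLs: print the status code plus
    # any X-Robots-Tag header that would affect indexing.
    urls = [
        "https://www.tidy-books.co.uk/childrens-bookcases-shelves/colour/natural-finish-with-letters/letters/lowercase",
        "https://www.tidy-books.co.uk/childrens-bookcases-shelves/show/12/l/colour:24-4-9/letters:6-7",
    ]
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                print(resp.status, resp.headers.get("X-Robots-Tag"), url)
        except urllib.error.HTTPError as err:
            print(err.code, "error", url)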
Technical SEO | | tidybooks0 -
Meta description issue on Google
Hello, I have a small issue on Google with our meta description tag not always being properly displayed. If you search for the term Globe Car (in two words), everything is displayed properly: http://screencast.com/t/YQCUkJnk. Now do the same search for the term GlobeCar (in one word), and the meta tag set on our homepage seems to be totally ignored; Google is displaying something seemingly pulled out of thin air: http://screencast.com/t/K0KeeRGSgspV. Anyone have an idea what would cause this? Thanks!
Technical SEO | | GlobeCar1 -
Pagination and Canonical Issue
Hi, I have a site which has city-wise pages and, within a given city, categories. The listed products can be listed in different categories, each of which has a separate URL. The site has a different URL, meta description, and title for each category. We want to rank these pages based on category as well. What is the best way to avoid duplicate content and canonical issues? Thanks,
Darshan
Technical SEO | | dsingh1079 -
HTTP and HTTPS issue in Google SERPs
Hi, I've noticed that Google is indexing some of my pages as regular http, like this: http://www.example.com/accounts/, and some pages as https, like this: https://www.example.com/platforms/.

When I performed a site audit in various SEO tools, around 450 pages were flagged as duplicated, shown as pairs of the same URL - once with http and once with https. On our site people can register, open an account, and later log in to our website with their login details. In our company I'm not the one responsible for the site's maintenance, and I would like to know if this is an issue - and if it is, what is causing it and how to fix it, so I can forward the solution to the person in charge.

Additionally, I would like to know in general what the real purpose of https vs. http is, and which method our website should prefer. Currently, when URLs are typed manually into the address bar, all the URLs load fine - with or without https written at the start of each URL.

I'm not allowed to expose our site's name, which is why I wrote example.com instead; I hope you can understand that. Thank you so much for your help, and I'm looking forward to reading your answers.
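One quick way to see what is causing pairs like this is to check whether the http:// version redirects to https:// at all (a rough sketch with Python's standard library; example.com as used in the post):

    import urllib.error
    import urllib.request

    # Request the http:// version without following redirects; a 301
    # pointing at the https:// URL is the usual fix for duplicate
    # http/https pairs.
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # do not follow; surface the raw status instead

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open("http://www.example.com/accounts/", timeout=10)
        print(resp.status, "- no redirect, so both versions stay reachable")
    except urllib.error.HTTPError as err:
        print(err.code, "->", err.headers.get("Location"))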
Technical SEO | | JonsonSwartz0 -
Google Webmasters Quality Issue Message
I am a consultant who works for the website www.skift.com. Today we received an automated message from Google Webmasters saying our site has quality issues. Since the message is very vague and obviously automated, I was hoping to get some insight into whether it is something to be very concerned about and what can be done to correct the issue.

From reviewing the Webmaster Quality Guidelines, the site is not in violation of any of them. I am wondering if this message is generated as a result of licensing content from Newscred, as I have other clients who license content from Newscred and are getting the same message from Google Webmasters.

Thanks in advance for any assistance.
Technical SEO | | electricpulp0 -
Having to type Google CAPTCHA all the time
Hi guys, Our office has about 15 computers, all on the same IP address, and about 10 of them actively search on Google. Recently we have been asked to type in a CAPTCHA almost every single time we search on Google, and we would like to know if you have any suggestions for resolving this. We do use the Firefox Rank Checker to check rankings once per week (around 400 keywords), but we use Hide My Ass to hide the IP. No malware or viruses have been detected on the computers in the network. Many thanks for your help in advance. David
Technical SEO | | sssrpm0 -
How to block Google robots from a subdomain
I have a subdomain that lets me preview the changes I put on my site. The live site URL is www.site.com; the working preview version is www.site.edit.com. The contents of both are almost identical. I want to block the preview version (www.site.edit.com) from Google's robots so that they don't penalize me for duplicated content. Is this the right way to do it:
User-Agent: *
Disallow: .edit.com/*
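Or, since robots.txt rules only apply to the host the file is served from (they can't reference other domains), would the preview subdomain need to serve its own file at www.site.edit.com/robots.txt, along these lines:
User-agent: *
Disallow: /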
Technical SEO | | Alexey_mindvalley0