Scary bug in search console: All our pages reported as being blocked by robots.txt after https migration
-
We just migrated to https and created 2 days ago a new property in search console for the https domain.
The Webmaster Tools account for the https domain now shows, for every page in our sitemap, the warning: "Sitemap contains urls which are blocked by robots.txt." The Search Console dashboard also shows a red warning triangle saying our root domain is blocked by robots.txt. However:

1) When I test the URLs in the Search Console robots.txt testing tool, all looks fine.
2) When I fetch as Google and render the page, it renders and indexes without problems (it would not if it were really blocked by robots.txt).
3) We temporarily emptied the robots.txt completely, submitted it in Search Console, and uploaded the sitemap again: same warnings, even though no robots.txt rules were online.
4) We ran a Screaming Frog crawl on the whole website and it indicates that no page is blocked by robots.txt.
5) We carefully reviewed the whole robots.txt and it does not contain any line that blocks relevant content on our site or our root domain. (The same robots.txt was online for the last decade on the http version without problems.)
6) In Bing Webmaster Tools I could upload the sitemap, and so far no error has been reported.
7) We resubmitted the sitemaps: same issue.
8) I already see our root domain with https in the Google SERPs.

The site is https://www.languagecourse.net. Since the site has significant traffic, if Google really interprets our site as blocked by robots.txt for any reason, we will be in serious trouble.

This is really scary. Even if it is just a bug in Search Console and does not affect crawling of the site, it would be great if someone from Google could look into the reason for this, since for a site owner it really can raise cortisol to unhealthy levels. Has anybody ever experienced the same problem? Does anybody have an idea where we could report this issue? -
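As a sanity check outside Search Console, the live robots.txt rules can be fed straight into Python's stdlib parser and tested against a sample of sitemap URLs, approximating what a compliant crawler would do. This is a sketch only: the rules and URLs below are placeholders, not the site's actual robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules, NOT the site's real robots.txt.
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A small sample of sitemap URLs (placeholders):
for url in [
    "https://www.languagecourse.net/",
    "https://www.languagecourse.net/admin/settings",
]:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, verdict)
```

If a script like this says every sitemap URL is allowed while Search Console still warns, that points at a stale cached robots.txt on Google's side rather than the file itself.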
Hi icourse, thanks for your question! You've received some thoughtful responses. Did any of them help you sort your issue out? If so, please mark one or more as a "Good Answer." Thanks!
Christy
-
I'd still speak with the hosting provider. It may be a firewall setting, not the CDN.
-
Hi Donna, we are using Cloudflare, which may have blocked your scan/bot.
We disabled Cloudflare temporarily, submitted a new robots.txt and a new sitemap, and we still get the same warnings. -
I recently updated a WordPress website from noindex to wanting it indexed, and I still had a warning in Search Console for a day or two; the homepage was initially indexed with the meta description saying "A description for this website ...", even though the actual Fetch and Render and the robots.txt test were both fine. If you are absolutely sure there isn't anything wrong, I would maybe give it a couple of days. In our case, I resubmitted the homepage to Google to speed up the process and that fixed it.
-
Have you checked with your website hosting provider? They may be blocking bots at a server level. I know when I tried to scan your site I got a connection timeout error.
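One quick way to test the server-level-blocking theory is to request the site twice with different User-Agent headers: if the crawler UA times out while the browser UA succeeds, something upstream (firewall or CDN) is filtering by user agent. A minimal stdlib sketch; the UA strings are examples only.

```python
from urllib.request import Request, urlopen

# Example user-agent strings (illustrative, not exhaustive):
UAS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def build_request(url, label):
    """Build a GET request carrying the chosen user-agent string."""
    return Request(url, headers={"User-Agent": UAS[label]})

req = build_request("https://www.languagecourse.net/robots.txt", "crawler")
print(req.get_header("User-agent"))
# To actually probe, call urlopen(req, timeout=10) for each label:
# a timeout only on the crawler UA would point at user-agent filtering.
```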
Related Questions
-
Huge Search Traffic Drop After Switching to HTTPS - No Recovery After Couple of Months
Hi, in November we switched our website (https://www.insidermonkey.com) from HTTP to HTTPS. Initially we noticed a slight search traffic loss, and later discovered it might be due to the HTTPS switch. A month later we added the https version in Search Console, and then saw an immediate huge drop (about 25-30%). We discovered the problem might be due to poor redirection: our redirects were 302s instead of 301s. To fix the problem, we implemented 301 redirects and submitted the sitemap containing links to the old site in the new (https) Search Console property. We have gone through the points listed on this page: https://support.google.com/webmasters/answer/6073543

- We fixed the redirects to 301
- Double-checked the sitemaps
- Made sure we had a properly installed SSL certificate (we now get an A+ from https://www.ssllabs.com/ssltest/analyze.html?d=www.insidermonkey.com)
- Made sure we have no mixed-content errors (we don't have any issues in Search Console)
- We only avoided implementing HSTS, in case we might want to switch back to HTTP

We had a small improvement in the following month, but our traffic did not fully recover. To test the possibility of switching back to HTTP, we switched only 2 articles in our CMS to HTTP. Our traffic got worse, not only for those but for the whole site. We then switched those 2 articles back to HTTPS and implemented HSTS. Our search traffic seems to be getting worse day by day with no sign of improving. In the link below you can find a screenshot of our weekly search traffic between 1 October and 1 March. We are down from 500K weekly visitors to a mere 167K last week. https://drive.google.com/open?id=1Y1TQbj_YtGG4NhLORbEWbvITUkGKUa0G Any ideas or suggestions? We are willing to get professional help as well. What is the way to find a proper consultant with relevant experience for such a problem?
Intermediate & Advanced SEO | | etakgoz -
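Since the 302-vs-301 mix-up was the suspected culprit, it is worth auditing every hop of the redirect chain, not just the first: a single leftover 302 mid-chain still signals a temporary move. A minimal sketch of that audit; the sample chain is invented.

```python
# A permanent move should use 301 (or 308) at every hop; any 302/307
# tells crawlers the move is temporary. The sample chain is made up.
PERMANENT = {301, 308}

def audit_chain(hops):
    """Return the hops whose status is not a permanent redirect."""
    return [(status, loc) for status, loc in hops if status not in PERMANENT]

chain = [
    (302, "https://www.insidermonkey.com/"),       # wrong: temporary
    (301, "https://www.insidermonkey.com/home/"),  # fine: permanent
]

print(audit_chain(chain))  # -> [(302, 'https://www.insidermonkey.com/')]
```

In practice the hop list would come from following each old URL with redirects disabled and recording every status/Location pair.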
News Errors In Google Search Console
Years ago, a site I'm working on published news as one form of content. It has since stopped publishing news, but still has a Q&A forum, blogs, articles... all kinds of stuff. Now it triggers "News Errors" in GWT under crawl errors. These errors are:

- "Article disproportionately short"
- "Article fragmented" on some Q&A forum pages
- "Article too long" on some longer Q&A forum pages
- "No sentences found"

Since there are thousands of these forum pages and the problem seems to be a news critique, I'm wondering what I should do about it. It seems to be holding these non-news pages to a news standard: https://support.google.com/news/publisher/answer/40787?hl=en For instance, is there a way, and would it be a good idea, to get the hell out of Google News, since we don't publish news anymore? Would there be possible negatives worth considering? What's baffling is that these are not designated news URLs. The ones we used to have were /news/title-of-the-story, per https://support.google.com/news/publisher/answer/2481373?hl=en&ref_topic=2481296 Or does this really not matter and should I just blow it off as a problem? The weird thing is that we recently went from http to https, and the Google News interface still has us as http and gives the option to add https, which I am reluctant to do since we aren't really in the news business anymore. What do you think I should do? Thanks!
Intermediate & Advanced SEO | | 94501 -
Site migration - 301 or 404 for pages no longer needed?
Hi, I am migrating from my old website to a new one on a different server, with a very different domain and URL structure. I know it's best to change as little as possible, but I just wasn't able to do that. Many of my pages can be redirected to new URLs with similar or the same content. My old site has around 400 pages. Many of these pages/URLs are no longer required on the new site - should I 404 these pages or 301 them to the homepage? I have looked through a lot of info online to work this out but can't seem to find a definitive answer. Thanks for this!! James
Intermediate & Advanced SEO | | Curran -
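For what it's worth, the usual advice is to 301 each old URL to its closest equivalent and let genuinely retired pages return 404/410, because bulk-redirecting everything to the homepage tends to be treated as a soft 404. A toy lookup illustrating that split; the paths are made up.

```python
# Map legacy paths with a real equivalent to a 301 target;
# retired pages get a deliberate 410 (gone). Paths are hypothetical.
REDIRECTS = {
    "/old-about.html": "/about/",
    "/old-courses.html": "/courses/",
}

def route(path):
    """Return (status, target) for an incoming legacy path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 410, None  # gone on purpose, not redirected to the homepage

print(route("/old-about.html"))  # -> (301, '/about/')
print(route("/old-news.html"))   # -> (410, None)
```

The same mapping would normally live in the web server's rewrite rules; the Python form just makes the decision logic explicit.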
HTTPS pages - To meta no-index or not to meta no-index?
I am working on a client's site at the moment and I noticed that both HTTP and HTTPS versions of certain pages are indexed by Google, and both show in the SERPs when you search for the content of those pages. I just wanted to get various opinions on whether the HTTPS pages should have a meta noindex tag (via an htaccess rule) or whether they should be left as is.
Intermediate & Advanced SEO | | Jamie.Stevens -
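The more common fix than noindexing either version is to consolidate them: a site-wide 301 from each http URL to its https twin (or a rel=canonical pointing at it), so the duplicate signals merge instead of one copy being dropped. A small helper computing the redirect target, as a sketch:

```python
from urllib.parse import urlsplit, urlunsplit

def https_twin(url):
    """Rewrite an http URL to its https equivalent; leave https URLs alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(https_twin("http://example.com/page?x=1"))  # -> https://example.com/page?x=1
print(https_twin("https://example.com/"))         # unchanged
```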
Why isn't my uneven link flow among index pages causing uneven search traffic?
I'm working with a site that has millions of pages. The link flow through index pages is atrocious, such that for the letter A (for example) the index page A/1.html has a page authority of 25, and the pages after it drop until A/70.html (the last index page listing pages that start with A) has a page authority of just 1. However, the pages linked to from the low-page-authority index pages (that is, the pages whose second letter is at the end of the alphabet) get just as much traffic as the pages linked to from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that the pages linked from the low-authority index pages are getting just as much traffic as those linked from the high-authority index pages. Why is this? Should I "fix" the bad link flow problem if traffic patterns indicate there's no problem? Is this hurting me in some other way? Thanks
Intermediate & Advanced SEO | | GilReich -
Block search bots on staging server
I want to block bots from all of our client sites on our staging server. Since robots.txt files can easily be copied over when moving a site to production, how can I block bots/crawlers from our staging server (at the server level), but still allow our clients to see/preview their site before launch?
Intermediate & Advanced SEO | | BlueView1301 -
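One server-level pattern, sketched here as plain Python rather than any specific web-server config: the staging vhost answers every response with an `X-Robots-Tag: noindex` header and turns away known crawler user agents, while human visitors (the clients) pass through. The UA substrings are illustrative only; HTTP Basic Auth on the staging vhost is the other common approach.

```python
# Illustrative crawler markers, not an exhaustive list:
BOT_MARKERS = ("googlebot", "bingbot", "yandex", "baiduspider", "duckduckbot")

def staging_response(user_agent):
    """Return (status, extra_headers) the staging server would send."""
    ua = user_agent.lower()
    if any(marker in ua for marker in BOT_MARKERS):
        return 403, {}  # turn crawlers away outright
    # Humans get the page, but with a belt-and-braces noindex header
    # that cannot be accidentally copied to production via robots.txt:
    return 200, {"X-Robots-Tag": "noindex, nofollow"}

print(staging_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(staging_response("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
```

Nothing in this lives in robots.txt, so a site copy to production carries no blocking rules with it.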
Previously ranking #1 in Google, web page has 301 / URL rewrite, indexed but not showing for keyword search?
Two web pages on my website previously ranked well in Google, with consistent top-3 placements for 6+ months, but when the site was modified, these two pages, which previously ended in .php, had their page names changed to the keyword to improve further (or so I thought). Since then the pages don't rank at all for that search term in Google. I used Google Webmaster Tools to remove the previous pages from the cache and search results, resubmitted a sitemap, and where possible fixed links to the new pages from other sites. On previous advice, I purchased links, web directories, social and articles etc. pointing to the new page, but so far nothing... It's been almost 5 months, and it's very frustrating as these two pages previously ranked well and as landing pages ended in conversions. This problem only appears in Google; the pages still rank well in Bing and Yahoo. Google has the page indexed if I do a search by the URL, but the page never shows under any search term it should, despite being heavily optimised for certain terms. I've spoken to my developers and they are stumped too; they've now added this code to the affected page(s) to see if it helps:

Header("HTTP/1.1 301 Moved Permanently");
$newurl = SITE_URL . $seo;
Header("Location: $newurl");

Can Google still index a web page but refuse to show it in search results? All other pages on my site rank well; just these two that were once called something different have caused issues. Any advice? Any ideas? Have I missed something? I'm at a loss...
Intermediate & Advanced SEO | | seanclc -
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:

User-agent: googlebot
Disallow: /community/photos/

Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
Intermediate & Advanced SEO | | HD_Leona
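Leona's idea, a restrictive Googlebot group plus a permissive Googlebot-Image group, can be tried out with Python's stdlib parser before going live. One caveat about the sketch: urllib.robotparser picks the first group whose name matches (unlike Google, which picks the most specific matching group regardless of order), so the image group is listed first here. The robots.txt content is a hypothetical proposal, not an existing file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the proposed split. The stdlib parser
# matches groups in file order (first match wins), so the more
# specific Googlebot-Image group comes first; Google itself uses
# most-specific-match regardless of order.
rules = """\
User-agent: Googlebot-Image
Allow: /

User-agent: Googlebot
Disallow: /community/photos/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

photo = "http://domain.com/community/photos/user/picture111111.aspx"
print("Googlebot:", parser.can_fetch("Googlebot", photo))              # blocked
print("Googlebot-Image:", parser.can_fetch("Googlebot-Image", photo))  # allowed
```

So yes, per-crawler groups can block the pages for web search while leaving the image crawler free, though whether images on blocked pages still surface well in Image search is a separate question.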