Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com, and we rank in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw that the site was blocked, so I went to Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file: http://www.posnation.com/robots.txt — Downloaded: 1 hour ago — Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
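For anyone who wants to verify a robots.txt block themselves rather than waiting on Webmaster Tools, here's a minimal sketch using Python's standard-library `urllib.robotparser`. The rule sets below are hypothetical examples of an accidental full block (the kind that can get copied over during a server move) and the corrected open file — not the actual posnation.com files:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the whole site,
# e.g. a staging config carried over during a server migration.
blocked_rules = [
    "User-agent: *",
    "Disallow: /",
]

# Corrected robots.txt that allows everything (empty Disallow).
open_rules = [
    "User-agent: *",
    "Disallow:",
]

def can_googlebot_fetch(rule_lines, url):
    """Return True if Googlebot may fetch the given URL under these rules."""
    parser = RobotFileParser()
    parser.parse(rule_lines)
    return parser.can_fetch("Googlebot", url)

print(can_googlebot_fetch(blocked_rules, "http://www.posnation.com/"))  # False
print(can_googlebot_fetch(open_rules, "http://www.posnation.com/"))     # True
```

Running this against the live file (via `parser.set_url(...)` and `parser.read()`) would show what the crawler currently sees, though Google may still be working from its own cached copy for a while.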
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I checked your robots.txt file at http://tool.motoricerca.info/robots-checker.phtml and it flagged the allow line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
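For what it's worth, `Allow: /` is usually redundant rather than harmful: a standards-following parser treats a file with `Allow: /` the same as one with an empty `Disallow:`. A quick sketch with Python's `urllib.robotparser` illustrating that the two hypothetical variants behave identically:

```python
from urllib.robotparser import RobotFileParser

# Two hypothetical variants of the same "allow everything" policy:
# one with the flagged "Allow: /" line, one without it.
with_allow = ["User-agent: *", "Allow: /"]
without_allow = ["User-agent: *", "Disallow:"]

def allowed(rule_lines, url, agent="Googlebot"):
    """Return True if the given agent may fetch url under these rules."""
    parser = RobotFileParser()
    parser.parse(rule_lines)
    return parser.can_fetch(agent, url)

for rules in (with_allow, without_allow):
    print(allowed(rules, "http://www.posnation.com/"))  # True both times
```

So removing the line shouldn't change crawl behavior for Googlebot; it just quiets validators that only recognize the original robots.txt draft.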
-
Hey Nick, thanks for your response. I did the first part, but on resubmitting the sitemap (sitemap.xml) it won't take, due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml
and robots.txt is not blocking it. Any ideas on what to do next?
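One likely explanation for this error is that Google is still checking against its cached copy of the old, blocking robots.txt; that usually clears on its own within a day or so. In the meantime you can at least sanity-check that the sitemap itself parses and lists the URLs you expect. A minimal sketch with the standard-library XML parser — the sample XML below is a hypothetical stand-in, not the real posnation.com sitemap:

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal sitemap, standing in for http://posnation.com/sitemap.xml
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.posnation.com/</loc></url>
  <url><loc>http://www.posnation.com/pos-systems</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(urls)
```

If the real file parses cleanly and the `<loc>` entries look right, the file itself isn't the problem, and waiting out the cached robots.txt before resubmitting is the likely fix.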
-
No, you're OK. It used to be that if your site went down for even a few hours and the spiders came around, you could get deindexed. Now they seem to understand that stuff happens, so thankfully you have a pretty long grace period before you get deindexed.
Good suggestions by Nick; you can also increase the Googlebot crawl rate on your site in GWMT to get Google to come around again more quickly.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods of time occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to make yourself feel like you have done what you can, resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as Googlebot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.
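If you want to preview locally what a Googlebot fetch would see (a rough stand-in for the Webmaster Tools feature, not a replacement for it), you can request your own page while identifying as Googlebot. A small sketch with `urllib.request`; the User-Agent string is Google's published one, and the fetch itself is left commented out so nothing hits the network here:

```python
import urllib.request

# Googlebot's published User-Agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("http://www.posnation.com/")
print(req.get_header("User-agent"))
# To actually fetch and check the response code:
#   with urllib.request.urlopen(req) as resp:
#       print(resp.status)  # expect 200 once the block is lifted
```

Note that some servers vary their response by User-Agent, which is exactly why checking as Googlebot (rather than as a browser) matters after a server move.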