Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com, and we are ranked in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw that the site was blocked, so I used Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file: http://www.posnation.com/robots.txt
Downloaded: 1 hour ago
Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I ran your robots.txt file through the checker at http://tool.motoricerca.info/robots-checker.phtml and it flagged the Allow line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
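For what it's worth, a robots.txt that blocks nothing doesn't need an Allow directive at all. A minimal sketch, assuming you want everything crawlable (the Sitemap line is optional but harmless):

    User-agent: *
    Disallow:

    Sitemap: http://posnation.com/sitemap.xml

An empty Disallow value means nothing is blocked, which achieves what "Allow: /" is trying to do without tripping stricter parsers.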
-
Hey Nick, thanks for your response... I did the first part, but when I resubmit the sitemap.xml it won't take, due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml
and robots.txt is not blocking it... any ideas on what to do next?
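One likely explanation: Google caches robots.txt (it typically re-fetches within about a day), so the sitemap submission may still be checked against the old, blocking copy. To confirm the live file itself isn't the problem, here is a minimal sketch using Python's standard-library robot parser, with the URLs from this thread:

    import urllib.robotparser

    # Fetch and parse the live robots.txt
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.posnation.com/robots.txt")
    rp.read()

    # Ask whether Googlebot (and crawlers generally) may fetch the sitemap URL
    for agent in ("Googlebot", "*"):
        allowed = rp.can_fetch(agent, "http://posnation.com/sitemap.xml")
        print(agent, "allowed" if allowed else "blocked")

If both lines print "allowed", the live file is fine and it's just a matter of waiting for Google's cached copy to expire before resubmitting.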
-
No, you're OK. It used to be that if your site went down for even a few hours while the spiders came around, you could get deindexed. Now they seem to understand that stuff happens, and thankfully you have a pretty long grace period before you get deindexed.
Good suggestions from Nick. You can also increase the Googlebot crawl rate for your site in GWMT to get Google to come around again more quickly.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to feel like you have done what you can, resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as GoogleBot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.
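If you'd rather not wait on the Webmaster Tools UI, Google has long accepted a plain HTTP sitemap ping as well. A quick sketch in Python (visiting the ping URL in a browser does the same thing):

    import urllib.parse
    import urllib.request

    sitemap = "http://posnation.com/sitemap.xml"
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")

    # A 200 means Google accepted the ping, not that the sitemap is error-free
    with urllib.request.urlopen(ping) as resp:
        print(resp.status, resp.reason)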
Related Questions
-
Does changing text content on a site affect SEO?
Hi, I have changed some H1s and H2s, changed and added paragraphs, fixed plagiarism and grammar, and added some pics with alt text. I just did this today, and I am ranking on the second page. Question 1: is it going to affect my 2 months of SEO effort? Question 2: do I have to submit the sitemap to Google again? Question 3: does changing content on the site frequently hurt SEO?
Algorithm Updates | Sam09schulz
-
Will Google penalize 2 sites for targeting "like" keyword phrases?
I own two websites: an HTML site that has been live for 20 years and an ecommerce site that has been live for 7 years. We sell custom printed (branded) tents for use at trade shows and other indoor and outdoor events. While our ecommerce site targets "trade show" tents, our HTML site targets "event" tents. I believe the keyword phrases are dissimilar enough that targeting "trade show tents" on one site and "event tents" on the other should not cause Google to penalize one or both sites for having similar content. The content is different on both sites. I'm wondering if anyone has experience with, or opinions on, my thoughts... either way. Thanks,
Terry Hepola
Algorithm Updates | terry_tradeshowstuff
-
Google indexing https sites by default now, where's the Moz blog about it!
Hello and good morning / happy Friday! Last night an article from, of all places, VentureBeat titled "Google Search starts indexing and letting users stream Android apps without matching web content" was sent to me, and as I read it I got a bit giddy, since we had just implemented a sitewide HTTPS cert rather than a cart-only SSL. I then quickly searched for other sources to see if this was indeed true, and the writing on the walls seems to indicate so.
Google Webmaster Blog: http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites. I wanted to hear about the eight key rules Google will factor in when ranking and indexing HTTPS pages from now on, and see what you all think about this. Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don't have any links to them. However, Google will only index an HTTPS URL if it meets these conditions:
- It doesn't contain insecure dependencies.
- It isn't blocked from crawling by robots.txt.
- It doesn't redirect users to or through an insecure HTTP page.
- It doesn't have a rel="canonical" link to the HTTP page.
- It doesn't contain a noindex robots meta tag.
- It doesn't have on-host outlinks to HTTP URLs.
- The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
- The server has a valid TLS certificate.
One rule that confuses me a bit is: "It doesn't redirect users to or through an insecure HTTP page." Does this mean that if you just moved over to HTTPS from HTTP your site won't pick up the HTTPS boost, since most sites have HTTP redirects to HTTPS? Thank you!
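A few of the conditions above can be spot-checked straight from a page's HTML. A rough, regex-based sketch (a proper crawler would parse the markup; example.com is a placeholder, and the canonical check assumes rel= appears before href=):

    import re
    import urllib.request

    def spot_check_https(url):
        """Rough hints for a few of the listed conditions, not a full audit."""
        # urlopen verifies the TLS certificate by default in modern Python,
        # so a handshake failure here already flags the last condition
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        return {
            "no noindex meta tag": not re.search(r"<meta[^>]*noindex", html, re.I),
            "no canonical to http": not re.search(
                r"<link[^>]*canonical[^>]*href=[\"']http://", html, re.I),
            # any http:// attribute value: covers insecure dependencies and
            # on-host outlinks to HTTP URLs in one blunt check
            "no http:// references": not re.search(r"=[\"']http://", html, re.I),
        }

    print(spot_check_https("https://www.example.com/"))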
Algorithm Updates | Deacyde
-
New site or subdomain
What are the pros and cons of launching a new product site as opposed to placing it under a subdomain of the company site? Will the new site be placed in the Google sandbox? The main goal is to provide credibility for the product by placing it under the company site, which has been live for over 10 years. It is not a consumer product; it is aimed more at dealers, so people would be pushed to the site or find it through the brochure.
Algorithm Updates | bakergraphix_yahoo.com
-
Images not getting indexed in Google image search ("site:hdwallpaperzones.com")
Hi, as mentioned in the title, my website's images are not getting indexed in Google image search. Out of 360 images, only 5 have been indexed in 3 days. Please help me out. Thanks.
Algorithm Updates | toxicpls
-
Post Penguin and Panda updates, what would be good SEO strategies for brand new sites?
Hi there. I have the luxury of launching a few sites after the Penguin and Panda updates, so I can start from scratch and hopefully do it right. I will get SEO companies to help me with this, so I just want to ask for advice on what would be a good strategy for a brand new site. My understanding of the new updates is this:
- Content and user experience are important: how long visitors spend, how many pages they view, etc.
- Social media is important. We intend to engage Facebook and Twitter a lot; in New Zealand not many people use Google+, so we will probably just concentrate on the first two.
- We will try to get people to share our website via social media; apparently that is important.
- We should concentrate only on high-quality backlinks with a good diverse set of alt tags, but concentrate on branding rather than keywords.
Am I correct to say that so far? If those are the principles, what would be the strategy to implement these goals? Links to any articles would also be great, please. Love learning. I just want to do this right and hopefully future-proof the sites against updates as much as possible. I guess quality content and links will most likely be safe. Thank you for your help.
Algorithm Updates | btrinh
-
Should I use canonical tags on my site?
I'm trying to keep this a generic example, so apologies if this is too vague. On my main website, we've always had a duplicate content issue. The main focus of our site is breaking down to specific, brick-and-mortar locations. We have to duplicate the description of a product/service for every geographic location (this is a legal requirement). So for example, you might have the parent "product/service" page targeting the term, and then hundreds of sub-pages with "product/service San Francisco", "product/service Austin", etc. These pages have identical content except that the geographic location is dynamically swapped out. There is also additional useful content like a Google map of the area, local resources, etc.

As I said, this was always seen as an SEO issue; specifically, you could see in the way Googlebot would crawl pages and how PageRank flowed through the site that having hundreds of pages with identical copy, just swapping out the geographic location, wasn't seen as good content. However, we still always received traffic and conversions for the long-tail geographic terms, so we left it.

Last year, with Panda, we noticed a drop in traffic and thought it was due to this duplicate issue, so I added canonical tags to all our geographic-specific product/service pages that pointed back to the parent page. That seemed to be received well by Google, and traffic was back to normal in short order.

However, recently what I notice a lot in our SERPs is that if I type in a geographic-specific term, i.e. "product/service san francisco", our deep page with the canonical tag is what Google is ranking. Google inserts its own title tag on the SERP and leaves the description blank, as it doesn't index the page due to the canonical tag. Essentially what I think it is rewarding is the site architecture, which organizes the content to the specific geo in the URL: site.com/service/location/san-francisco. Other than that there is no reason for it to rank that page.

Sorry if this is lengthy, thanks for reading all of that! Essentially my question is: should I keep the canonical tags on the site or take them off since Google insists on ranking the page? The potential upside of removing them is ranking higher (we're usually in the 3-6 spot on the result page) and also higher CTR, because we can get a description back on our result page. The counter-argument is that I'm already ranking, so leave it and focus on other things. Appreciate your thoughts on this!
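For readers following along, the setup described boils down to a tag like this on each geo page (a hypothetical sketch using the thread's placeholder URLs, assuming the parent page lives at site.com/service/):

    <!-- On site.com/service/location/san-francisco -->
    <link rel="canonical" href="http://site.com/service/" />

Removing that tag, or pointing it at the geo page itself, is what would let Google index the geo page and show its own description again.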
Algorithm Updates | edu-SEO
-
"No Follow", C Blocks and IP Addresses combined into one ultimate question?
I think the theme of this question should be "Is this worth my time?" Hello, MozCon readers and SEO gurus.

I'm not sure how other hosting networks are set up, but I'm with HostGator. I have a VPS level 5, which (I think) is like a mini personal server. I have 4 IP addresses, although it is a C block, as each address differs only in the last digit. I have used 3 of the 4 IP addresses I have been given. I have added my own sites (some high traffic, some start-ups) and I've hosted a few websites that I designed for high-paying customers. One-man show: design them, host them, and SEO them.

With the latest Penguin update, and having learned that linking between C-block sites is not a great idea, I have nofollowed all of the footer links on client sites pointing back to my portfolio site. I have also made sure that there are no links interlinking between any of my sites; I don't see them in Site Explorer, and I figure if they aren't helping, they may be hurting the rankings of those keywords.

OK, so... my question is: I have one IP address that I'm not using, and I have a popular high-traffic site sharing its IP with 5 other sites (all unrelated niches, but high quality). Is it worth moving the high-traffic site to its own IP address even though the switch could take up to 48 hours to take effect? My site would be down for at most 2 days (a day and a half if I switch the IPs at night). Is this really worth the stress of losing readers? Will moving the site off an IP shared with 5 other sites and onto its own IP help its rankings?

Thank you very much.

PS: I can't make it to MozCon this year, super bummed.
Algorithm Updates | MikePatch