SeoMoz Crawler Shuts Down The Website Completely
-
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot" (see the sample below).
Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to cancel the SEOmoz service if I can't resolve this immediately.
I guess my question is: is there a way to make sure the site doesn't shut down or time out like that because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it.
Here are the log lines that pointed to the cause:
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
-
After much research, and after implementing a ton of added scripts on my Apache server to track it, I confirmed the bots did cause the shutdown. So that this doesn't happen to you, or in case you ever have a problem of this nature, here is how I resolved it.
There is an excellent article on how to implement a script that restarts Apache immediately once all of its available threads are exhausted and it crashes. The script checks the Apache server status every 5 minutes and, in the event that it has crashed, automatically restarts it and sends you an email notification. A pretty good deal, I'd say, for risking only 5 minutes of downtime if anything major happens. In addition, I am running a cron job every morning at 1 a.m. to restart Apache. Please note that you need some knowledge of SSH commands to set this up. And OMG, I am talking like a geek... All the best to you...
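For anyone who wants a concrete starting point, here is a minimal sketch of that kind of watchdog in Python. The status URL, restart command, and cron schedule are assumptions to adapt to your own server; the script from the article may differ.

```python
# Hypothetical Apache watchdog, scheduled from cron every 5 minutes, e.g.:
#   */5 * * * * /usr/bin/python3 /usr/local/bin/apache_watchdog.py
# STATUS_URL and RESTART_CMD are placeholders for your environment.
import subprocess
import urllib.error
import urllib.request

STATUS_URL = "http://localhost/"                  # page to probe
RESTART_CMD = ["service", "apache2", "restart"]   # distribution-specific

def apache_is_up(url, timeout=10):
    """Return True if the server answers at all, even with an error status."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True    # we got an HTTP response, so Apache is alive
    except (urllib.error.URLError, OSError):
        return False   # refused or timed out: Apache is down

def watchdog():
    if not apache_is_up(STATUS_URL):
        try:
            subprocess.run(RESTART_CMD, check=False)
        except FileNotFoundError:
            pass       # `service` is not on this machine's PATH
        # an email notification (via smtplib, or piping to `mail`) would go here

if __name__ == "__main__":
    watchdog()
```

The email step is left as a comment because mail setup varies too much between hosts to sketch reliably.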
-
Wow Randy, what a story, man. The funny part is that one of the jobs I do is monitoring for things like that, but I would never go as far as actually shutting someone's site down, precisely because I know what that could do. It is great to know that you still preserved your rankings after 5 days; that makes me feel much better. I keep to a rule of one dedicated server per two (related) domains. In this case we are talking about a domain called babylifetime.com. I am about to embark on custom development for a site similar to squarespace.com, but with many more add-ons, so I need this thing to work properly. I think I have the organic SEO side down pretty well, but things like the issue in this thread are what keep me on my toes.
-
Googlebot would have to be indexing your site at the very moment it was down for anything to happen, and even if it's down for half a day, in my experience rankings are unaffected.
However, there's a small side effect. If visitors coming from a search engine hit a 404 or a server error and click the back button, or get the "Google can't find this" page, it can increase your bounce rate for that period. If the originating click starts at, say, Google and the visitor then goes straight back to Google, it tells Google that the page wasn't what they were looking for in relation to the term they used, or that it didn't load, or that there is a problem with it: basically any reason that can be tied to bounce rate.
As alarming as that may sound, I don't believe it would affect your rankings.
The easiest way to see whether Google noticed is to log in to your Google Webmaster Tools account and check for errors. If it lists errors such as 404 or "server unavailable" (I'm not sure that last one exists) for pages that you know are usually live and well, then you'll know they noticed.
But again, I don't believe it will have much effect on your rankings. I've read, in Google's own words, that they go back to sites that were unavailable or down and try to continue indexing them.
As for your server being down for 12 hours: that's a lengthy amount of time. I can't even imagine it. You might want to check your hosting capabilities. You should be back up and running in minutes, not hours.
Just to give you some peace of mind: I have a plethora of affiliate sites that make a small income for me. I once registered a domain name that a very large corporation didn't appreciate, because it had a trademarked word in it. Long story short, my domain info was set to private, so they legally got the server shut down. I didn't know for days, because everything was on auto-pilot and I wasn't checking the related email addresses. When that server was shut down, the 100+ other websites on it went down too, because that one (partially) trademarked domain was on the same server and hosting package. The sites were down for about 5 or 6 days while I sorted through the legal paperwork. After I made an agreement to give the big company the domain, minus the 20K in damages that they originally wanted, the hosting company turned the server and hosting package back on.
Not a single one of the domains lost ranking. Not even one spot! Today, they still rank in the top 2 or 3 for their biggest terms. So my words are truly from experience, and from a worst-case scenario at that. I think you'll be fine.
Finally, to clear the air. I didn't do anything bad, nor would I ever do anything bad with a domain name (other than keep it in my portfolio). The big company was upset that I got the domain before they did. All I had on the index page was their description of their product that was named in the domain. That was enough to be taken down for copyright and trademark infringement.
In the end, that company was actually very cool about it. And it's a Fortune 10 company! I was surprised!
-
EGOL, thanks for your reply.
A) My latest thought is also that unusual activity is blocking it. But then again, it is a dedicated server and should be capable of handling the load. We are talking about the SEOmoz bot and GoDaddy's highest-tier dedicated server, with nothing specifically installed that would interfere with Apache.
B) RAM, bandwidth, disk space, PHP memory and other limits are all under 20% of actual use.
-
I am willing to bet that the root issue is with the host and that one of these situations is occurring: A) the host is throttling your processor resources and shutting your domain down when unusual activity occurs on your site; or B) total activity on the server (your site and other sites) exceeds a certain level and the server limits the resources available for processing.
I would be looking for a new host.
-
Randy, thanks for the response. There is definitely something going on related directly to rogerbot on this server. I have different crawlers running at all times and nothing ever happens; this particular problem appears when the SEOmoz bots start doing their job (Fridays) and has been traced back to that specific bot. As for the crawl delay: I tried different values, up to 20, but the same problem persists.
At the moment I have a tech team reviewing the Apache server to pin down the specifics. I will post the findings here for others to see.
But it is weird, and now I don't know when the site will shut down. Driving me crazy, man!
As an additional question to this thread: when your site goes down for, let's say, 12 hours and you have many high-ranking organic Google listings, does that have a huge impact, and how much downtime is acceptable?
-
Jury,
I'm not sure if rogerBot is doing anything to your site, but I do know a way to slow down rogerBot, and any other robot or crawler that takes directions from the robots.txt file that should be on your site.
Basically, just add the two lines shown below to your robots.txt file. With this addition, you are telling the user agent (rogerBot) to wait 10 seconds between pages. You can change that number to anything you want; the more seconds you add, the slower it goes. This, of course, assumes rogerBot takes directions. I'm fairly sure it does!
NON-AGENT SPECIFIC EXAMPLE
User-Agent: *
Crawl-Delay: 10
EXAMPLE FOR ROGERBOT
User-Agent: rogerBot
Crawl-Delay: 10
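As a quick sanity check, Python's standard-library robots.txt parser will tell you what delay a compliant crawler would read from rules like these (the file contents below just combine the two examples above, with different numbers so the fallback is visible):

```python
# Parse example robots.txt rules and ask what delay each crawler would get.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-Agent: rogerBot
Crawl-Delay: 10

User-Agent: *
Crawl-Delay: 5
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
parser.modified()  # stamp the parse time; the query methods need it

print(parser.crawl_delay("rogerBot"))   # 10 (agent-specific entry)
print(parser.crawl_delay("Googlebot"))  # 5 (falls back to the * entry)
```

Keep in mind that Crawl-Delay is an unofficial directive: some crawlers (Googlebot, for one) ignore it, so it only helps with bots that choose to honor it.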
Good Luck,
Randy
-
Thanks Lewis... I will do that and see if they have any suggestions!
-
Hi Jury,
If you haven't already, I would recommend raising the issue through the help email address help@seomoz.org.
On the Q&A forum we can pass on thoughts or suggestions, but the support team at SEOmoz will be best placed to answer this.