Need third-party input: our web host blocked all bots, including Google and me, because they believe SEO is slowing down their server.
-
I would like some third-party input... partly for my sanity and partly for my client.
I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are apparently also the web host. (They actually run the servers... I don't know if the machines are sitting under someone's desk or in an actual data center.)
Their server has been slowed down by local and foreign bots, and they are under the impression that my SEO services are sending spam bots to crawl and slow down their site.
To fix the problem, they disallowed all bots. Everything: Google, Yahoo, Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit. (Almost 50% of their traffic is organic, over 50% is organic plus AdWords, and nearly all of it comes from Google.)
Their keyword rankings are taking a quick dive as well.
Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host?
I believe:
1.) You should never disallow ALL robots from your site as a solution for spam. In fact, most bad bots ignore robots.txt anyway. robots.txt is a way to limit where Google crawls (which is obviously a legitimate technique to use), not an anti-spam tool.
2.) On-site SEO work, link building, etc. are not responsible for foreign bots and scrapers putting a heavy load on the server.
3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening). Since almost half the traffic is organic, the client can expect to lose a large amount of revenue from purchases made by organic visitors as that traffic disappears.
Please give your input and thoughts. I really appreciate it!
-
Thanks so much for your response. Glad to hear that there was a fairly good ending to this, and thanks for following up!
-
Keri -
I was able to produce multiple reports that accomplished the following:
1.) Illustrate the quick/graphic drop in our Google rankings
2.) Illustrate that the majority of traffic comes from Organic Search
3.) Tie together the trend already underway: a dramatic drop in organic traffic as keywords slipped.
4.) Make it clear that this behavior would quickly result in a financial hit steep enough to count as an 'emergency'.
In this particular situation there are three parties: my client, their developers, and me, the SEO guy.
I found that helping my client understand the situation, the financial impact, and why we had to act helped spur things along.
We had a big meeting with everybody involved. This was a great opportunity to understand what the developers (who also serve as the host) were going through: facing attacks from bots, trying to keep the server alive, and so on. Ultimately it was revealed that they had a bug in their Magento code that was causing a lot of extra database hits, and that was the main root cause of their issues.
We were able to work out the following ground rules:
1.) NEVER block all bots, under any circumstances.
2.) The majority of our organic traffic is Google; Bing/Yahoo were a tiny fraction, and everything else didn't matter. I crafted a robots.txt that lets in all of the major bots I wanted and excludes the rest (see the sketch after this list). Ideally I'd let most of them in, since robots.txt only blocks the good bots anyway; the bad ones just ignore it. However, I wanted to compromise and help them with server load. (Plus, for us, Google is it.) I did make sure my robots.txt allowed in all Google services, etc.
3.) We set up a system to make sure everybody is in the loop when a dramatic decision about the website is made. (That's far better than me finding out a few days later that Google was blocked and the damage was already done.)
4.) We made it clear that SEO had nothing to do with the situation.
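For anyone who wants a starting point, here is a minimal sketch of the kind of robots.txt I'm describing. The user-agent tokens are the standard ones for Google, Bing, and Yahoo, but treat the exact list as illustrative rather than a copy of our production file:

# Let the major engines in with no restrictions
User-agent: Googlebot
User-agent: Googlebot-Image
User-agent: AdsBot-Google
User-agent: bingbot
User-agent: Slurp
Disallow:

# Everything else that actually obeys robots.txt stays out
User-agent: *
Disallow: /

Well-behaved crawlers follow the most specific group that matches their user agent, so the blanket block at the bottom doesn't override the named allowances above it.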
In the end, the developers are great people, but like everyone else... they almost need to see you in person and hear why they can't do stuff like that. In their world it makes total sense, because the server is overloaded. However, there won't be an overloaded server for long if you block out Google and all the traffic it sends.
We were able to recover most of our rankings, and our traffic returned close to normal. We aren't quite back to where we were, but we're getting there. The keywords snapped back fairly quickly but the organic traffic didn't, so something else may be going on. I'll throw in a screenshot of the incident below.
Thanks for checking up on it!
-
Hi Joshua, I'm looking through some older threads, and wondering if you're able to give us any type of update as to what happened in this case (and if you have any hair left!). I've had some battles with developers before too, and have sympathy for your position.
-
Thanks for your answers and help, everyone! I really appreciate all of the details. I see the power of this community and hope to be able to contribute in the future instead of only taking. Thanks again!!
-
Drop them as a client. They're paying you for SEO help but they obviously don't trust/like it. Not worth your time.
-
Quite clearly, this is bonkers.
If you block search engine spiders, how can they possibly index the content? The host is hoping the engines will stop crawling the content to check what is there (and stop burning server resources) but will still happily refer their users to these pages on good faith. It doesn't work that way.
Additionally, it is highly unlikely that bots from the major engines are causing a measurable impact - Google, for instance, states it will only crawl about one page every few seconds (1).
That said, there are a lot of parasites out there and crawlers that will eat up server time, so there may be some truth in what the host is saying. Still, that is no excuse for this hatchet job of a fix.
The other angle here is that Magento and e-commerce sites in general can often be a crawler's worst nightmare. As an example, product comparison systems can create thousands (I have seen millions) of crawlable URLs. A sensible spider has a crawl budget and will give up, but not all will. A simple crawl in Screaming Frog should give you an idea here (not that you'll be able to run one at the moment), and in many cases where these problems exist, this alone is enough to bring a server to its knees.
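As a concrete illustration of fencing those URL spaces off, here is the sort of robots.txt addition that tames Magento's compare and filter pages. The product_compare path is Magento's default; the query-string patterns are hypothetical examples, and note that the * wildcard is an extension honoured by Google and Bing rather than part of the original robots.txt standard:

# Keep crawlers out of compare/filter URL spaces (example patterns)
User-agent: *
Disallow: /catalog/product_compare/
Disallow: /*?dir=
Disallow: /*?limit=
Disallow: /*?mode=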
In my mind you have a few things to do here:
1. Convince the host that blocking all spiders is incorrect
Hopefully this thread and the references here are more than enough to do this. Beyond that, simply show them Fetch as Googlebot and the Crawl section in Webmaster Tools and you should be able to make your point quickly and easily.
2. Help the developer implement a more sensible list of what to block.
This article is a good start here:
http://searchenginewatch.com/article/2067357/Bye-bye-Crawler-Blocking-the-Parasites
Remember you can allow one (or more) robots and then disallow everything else:
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
Other options also exist, such as limiting the speed at which a crawler can crawl - well, requesting that it limits the speed at which it crawls.
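On that note, be aware that (to my knowledge) Googlebot ignores the Crawl-delay directive; its rate is set in Webmaster Tools instead. Bing and Yahoo do honour it, so a request like this can take some pressure off the server:

# Ask Bing to leave roughly 10 seconds between requests
User-agent: bingbot
Crawl-delay: 10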
Also remember that any truly parasitic bot or crawler will likely ignore robots.txt anyway, so you may need to implement some more advanced blocking at a firewall or server level.
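If the server is Apache, a user-agent deny list is one way to do that blocking at the server level. This is a rough sketch: the bot names are placeholders, and the real list should come out of the access logs:

# Flag requests whose User-Agent matches known offenders (placeholder names)
SetEnvIfNoCase User-Agent "BadBot|EvilScraper" block_me
<Limit GET POST HEAD>
    Order Allow,Deny
    Allow from all
    # Refuse anything flagged above
    Deny from env=block_me
</Limit>

Anything that spoofs a normal browser user agent will sail straight past this, of course, which is where IP-level blocking at the firewall comes in.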
3. Help the developer identify the cause of the resources problem
As hinted at above, if a crawl is causing problems there are likely issues somewhere. Whether this is as simple as straight-up server resources or is more down to problems with the site and its crawlable URLs needs to be determined, but let me give you some pointers.
- SEO Audit - at least a crawl/indexation audit. Let's see how many pages can be crawled, and how that stacks up against the number of products and categories. You may well find some easy wins here: sections of the site that can be blocked off, or URL parameters you tell Google not to crawl in Webmaster Tools. Nofollow directives on URLs can be your friend here as well, so you tackle it on both fronts.
- Magento Optimisation - it is easy with a system like Magento to create pages that put a heavy burden on the database, with hundreds of queries each. If these options are not really used (except by crawlers) then they can be audited and removed or improved.
- Server Resources - Magento can be a hungry beast
- Dig into the HTTP access logs to identify who and what is crawling, and from where, and come up with a list of what you need to block and how (a quick one-liner for this follows below).
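On that last point, assuming a standard combined-format Apache access log, a quick way to see which user agents are hitting the server hardest:

# Count requests per user agent, busiest first
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head -20

The sixth quote-delimited field in the combined log format is the user-agent string; cross-check the noisy ones against their source IPs before deciding what to block.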
Summary
Ultimately, blocking all spiders is daft and there is a good chance it won't resolve the issue anyway - unless it screws over the client's search visibility so badly that they don't get any traffic at all! There are likely real issues though, be that with the site itself or something else, so a good way to couch this to them is as their friend and helper - someone who will help them identify and resolve the issues. If it gets combative then it will only be harder to resolve.
Alternatively, you could move to another host. Part of me would suggest doing this anyway, as no host should be able to hold you to ransom like this. This one daft move could have ruined the client's visibility in what is a key time of year for most online businesses. Imagine if they did not have an SEO on board, or an automated crawl to highlight these issues?
There is certainly a worthwhile exercise here, as the site likely has some problems (or at least areas that can be improved upon) so optimisations can be made. But I would still consider jumping ship and moving to an SEO-savvy host in the long term if bridges can't be built.
Hope that helps!
Marcus

References
-
Yeah, what he said...
And when you call, let them know that it was their slow-ass server that caused you to find another host.
-
I agree with your assessment.
This hosting service is being run by noobs, stingy people, or both.
I would get a new host right away. ASAP. Your search rankings will die completely if you remain on this host.
In addition to what you have seen here, they probably have other practices that are deadly.
I would install my site on a new server, then change the DNS before informing the current host. Then call to cuss 'em out.