Need Third Party Input. Our web host blocked all bots, including Google and me, because they believe SEO is slowing down their server.
-
I would like some third party input... partly for my sanity and also for my client.
I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are also apparently the web host. (They actually run the servers... I do not know if the servers are sitting under someone's desk or are actually in a data center.)
Their server has been slowed down by local and foreign bots. They are under the impression my SEO services are sending spammer bots to crawl and slow down their site.
To fix the problem they disallowed all bots. Everything: Google, Yahoo, Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit. (Almost 50% of their traffic is organic, and organic plus AdWords is well over 50%, nearly all of it from Google.)
Their keyword rankings are taking a quick dive as well.
Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host?
I believe:
1.) You should never disallow ALL robots from your site as a solution for spam. As a matter of fact, most of the bad bots ignore robots.txt anyway. robots.txt is a way to limit where Google crawls (which is obviously a legitimate technique), not a spam defense.
2.) On-site SEO work, link building, etc. is not responsible for foreign bots and scrapers putting a heavy load on the server.
3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening). Since almost half the traffic is organic, the client can expect to lose a large sum of revenue from purchases made by organic visitors as that traffic disappears.
Please give your input and thoughts. I really appreciate it!
-
Thanks so much for your response. Glad to hear that there was a fairly good ending to this, and thanks for following up!
-
Keri -
I was able to produce multiple reports that accomplished the following:
1.) Illustrate the quick/graphic drop in our Google rankings
2.) Illustrate that the majority of traffic comes from Organic Search
3.) Tie together the trend that was already happening of a dramatic drop in organic traffic as keywords were slipping.
4.) Make it concrete that this behavior would quickly result in a financial hit steep enough to qualify as an 'emergency'.
In this particular situation there are three parties: my client, their developers, and me, the SEO guy.
I found that helping my client understand the situation, the financial impact, and why we had to act spurred things along.
We had a big meeting with everybody involved. This was a great opportunity to understand what the developers (who also serve as the host) were going through: facing attacks from bots, trying to keep the server alive, etc. Ultimately it was revealed that they had a bug in their Magento code that was causing a lot of extra DB hits, and that was the main root cause of their issues.
We were able to work out the following ground rules:
1.) NEVER block all bots under any circumstance
2.) The majority of our organic traffic is Google; Bing/Yahoo are a tiny fraction, and everything else doesn't matter. I crafted a good robots.txt that let in all of the major bots I wanted and excluded the rest. Ideally I'd like to let in almost all of them (since we're only blocking good bots; the bad ones will just ignore robots.txt anyway). However, I wanted to compromise and also help them with server traffic (plus, for us, Google is it). I did make sure my robots.txt allowed in all Google services.
3.) We set up a system to make sure everybody was in the loop when a dramatic decision regarding the website was made. (that's way better than me finding out a few days later that Google was blocked and damage has already been done)
4.) We really brought into light that SEO has/had nothing to do with the situation.
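For reference, a robots.txt implementing ground rule 2 might look roughly like this (a sketch only — the user-agent tokens shown are the commonly used ones for the major engines, but verify them against each engine's own documentation; an empty Disallow means "crawl everything"):

```
User-agent: Googlebot
Disallow:

User-agent: bingbot
Disallow:

User-agent: Slurp
Disallow:

User-agent: *
Disallow: /
```

Note that a bot only obeys the group that matches its own user agent, so the specific groups above exempt those bots from the catch-all block at the bottom.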
In the end the developers are great people but like everything else... they almost need to see you in person and hear why they can't do stuff like that. In their world it makes total sense because the server is overloaded. However, there won't be an overloaded server if you block out Google and all the traffic it sends.
We were able to recover most of our rankings and our traffic returned back to normal. We aren't quite back to where we were but getting there. The keywords snapped back fairly quickly but the organic traffic didn't so it might be something else. I actually will throw in a screenshot of the incident down below.
Thanks for checking up on it!
-
Hi Joshua, I'm looking through some older threads, and wondering if you're able to give us any type of update as to what happened in this case (and if you have any hair left!). I've had some battles with developers before too, and have sympathy for your position.
-
Thanks for your answers and help everyone! I really appreciate all of the details. I see the power of this community and hope to be able to contribute instead of only take in the future. Thanks again!!
-
Drop them as a client. They're paying you for SEO help but they obviously don't trust/like it. Not worth your time.
-
Quite clearly, this is bonkers.
If you block access from search engine spiders, how can they possibly index the content? You are effectively hoping that they will stop crawling the content (and stop burning server resources) but still happily refer their users to these pages in good faith.
Additionally, it is highly unlikely that bots from the major engines are causing a measurable impact - Google, for instance, states it will only crawl one page every few seconds (1).
That said, there are a lot of parasites out there and crawlers that will eat up server time, so there may also be some truth in what the host is saying. Even so, there is no excuse for this hatchet job of sorting things out.
The other angle here is that Magento and ecommerce sites can often be a crawler's worst nightmare. As an example, product comparison systems can often create thousands (I have seen millions) of crawlable URLs - now, a sensible spider has a crawl budget and will give up, but that's not to say all will. A simple crawl in Screaming Frog should give you an idea here (not that you will be able to do that while blocked), and in many cases where these problems exist this is enough to bring a server to its knees.
In my mind, you have a few things to do here:
1. Convince the host that blocking all spiders is incorrect
Hopefully this thread and the references here should be more than enough to do this. Beyond that, simply show them a Fetch as Googlebot and the Crawl section in Webmaster Tools and you should be able to make your point quickly and easily.
2. Help the developer implement a more sensible list of what to block.
This article is a good start here:
http://searchenginewatch.com/article/2067357/Bye-bye-Crawler-Blocking-the-ParasitesRemember you can allow one (or more) robots and then disallow everything else:
User-agent: Google Disallow: User-agent: * Disallow: /
Other options also exist such as limiting the speed at which a crawler can crawl - well, requesting that they limit the speed at which they crawl.
Also remember that any truly parasitic bot or crawler will likely ignore robots.txt anyway so you may need to implement some more advanced blocking at a firewall or server level.
3. Help the developer identify the cause of the resources problem
As hinted at above, if a crawl is causing problems there are likely issues somewhere. Whether this is as simple as straight up server resources or is more due to problems with the site and crawlable URLs needs to be determined but let me give you some pointers.
- SEO Audit - at least a crawl/indexation audit. Let's see how many pages we can crawl, and how that stacks up against the number of products/categories. You may well find some easy wins here: sections of the site that can be blocked off, or URL parameters you tell Google not to crawl in Webmaster Tools. Nofollow directives on URLs can be your friend here as well, so you tackle it on both fronts.
- Magento Optimisation - it is easy with a system like Magento to create pages that have a heavy burden on the database with hundreds of queries. If these options are not really used (only by crawlers) then they can be audited and removed / improved.
- Server Resources - Magento can be a hungry beast
- Dig into the http access logs to identify who and what is crawling and from where and come up with a list of what you need to block and how.
Summary
Ultimately, blocking all spiders is daft and there is a good chance it won't resolve the issue anyway - that is, unless it screws over the client's search visibility so badly that they don't get any traffic! There are likely issues, though, be that with the site itself or something else, so a good way to couch this to them is as their friend and helper - someone who will help them identify and resolve the issues. If it gets combative then it will only be harder to resolve.
Alternatively, you could move to another host. Part of me would suggest doing this anyway, as no host should be able to hold you to ransom like this. This one daft move could have potentially ruined the client's visibility in what is a key time of year for most online businesses. Imagine if they did not have an SEO on board, or an automated crawl to highlight these issues?
There is certainly a worthwhile exercise here, as the site likely has some problems (or at least areas that can be improved upon), so optimisations can be made, but I would still consider jumping ship and moving to an SEO-savvy host in the long term if bridges can't be built.
Hope that helps!
Marcus
-
Yeah, what he said...
And when you call, let them know that it was their slow-ass server that caused you to find another host.
-
I agree with your assessment.
This hosting service is being run by either noobs or stingy people or both.
I would get a new host right away. ASAP. Your rankings in search will die completely if you remain on this host.
In addition to what you have seen here they probably have other practices that are deadly.
I would install the site on a new server, then change the DNS before informing the current host. Then call to cuss 'em out.