Are Separate Servers for Humans vs. Bots with the Same Content Considered Cloaking?
-
Hi,
We are considering serving bots and humans from separate servers to keep crawlers from overloading our main server. Just wondering whether this is considered cloaking if the content remains exactly the same for both the bot and the human, only served from different servers.
And if this isn't considered cloaking, will it affect the way our site is crawled, or hurt rankings?
Thanks
-
The additional massive complexity, expense, upkeep and risk of trying to run a separate server just for bots is nowhere near worth it, in my opinion. (Don't forget, you'd also have to build a system to replicate the content between the two servers every time content or code is added or edited. That replication process could well use more resources than the bots do!)
I'd say you'd be much better off putting all those resources toward a more robust primary server and letting it do its job.
In addition, as Lesley says, you can tune Googlebot's crawl rate, and you can actually schedule Bing's crawl times in its Webmaster Tools. Though for me, I'd want the search engine bots to get in and index my site just as soon as they were willing.
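If you do want to throttle a specific crawler, the non-standard Crawl-delay directive in robots.txt is one lever: Bing honors it, while Googlebot ignores it (Google's crawl rate is adjusted in its Webmaster Tools/Search Console instead). A minimal sketch, with an illustrative delay value you'd tune to your own server:

```
# robots.txt
# Crawl-delay is non-standard: bingbot honors it, Googlebot ignores it.
# The value is seconds between fetches; 10 is purely illustrative.
User-agent: bingbot
Crawl-delay: 10
```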
Lastly, it's only a few minutes' work to source a ready-made blacklist of "bad bot" user agents that you can quickly insert into your .htaccess file to completely block a significant number of the most wasteful and unnecessary bots. You'll want to update such a blacklist every few months, as the worst offenders regularly change user agents to avoid just such blacklisting.
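As a sketch of that approach, assuming Apache with mod_rewrite enabled (the three agent names below are made-up placeholders, not a vetted blacklist):

```
# .htaccess: return 403 Forbidden to selected bots by User-Agent.
# The names here are illustrative only; substitute a maintained
# blacklist, since offenders change user agents regularly.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|WastefulSpider) [NC]
  RewriteRule .* - [F,L]
</IfModule>
```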
Does that make sense as an alternative?
Paul
-
I second what Jonathan says, but I would also like to add a couple of things. One thing I would keep in mind is reserve power on your server. If you are running the server close enough to its maximum traffic limit that a bot would matter, I would upgrade the whole server. All it takes is one nice spike from somewhere like Hacker News or Reddit to take your site offline, especially if you are running close to the red.
From my understanding, you can also adjust how and when Google will crawl your site: https://developers.google.com/search-appliance/documentation/50/help_mini/crawl_fullcrawlsched (note that page covers the Google Search Appliance; for the public Googlebot, the crawl-rate setting lives in Google's Webmaster Tools).
-
I've never known search engine bots to be particularly troublesome or to overload servers. However, there are a few things you could do:
1. Set up caching (a sketch follows this list)
2. Set up something like Cloudflare, which can also block other threats.
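For point 1, here's a minimal caching sketch, assuming Apache with mod_expires enabled. The lifetimes are placeholder values to tune; a fuller setup would also cache whole pages at the application or CDN layer:

```
# .htaccess: let browsers and CDNs cache static assets so that
# repeat visits (and some bot traffic) never reach the origin server.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```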
I can't imagine you are intending to block Google, Bing, etc., as I would definitely advise against cloaking the site from Google like that.
Of course, it is difficult to make any specific comment, as I have no idea of the extent of the problem you are suffering from. But something like caching or Cloudflare's security features will help a lot.