Hi,
Is there a way to prevent search engine crawlers such as Bing from killing my server?
It has happened twice this week.
It took me a while to figure out what was going on, but when Bing crawls one of my sites, the load average of the server goes through the roof. It's effectively a denial-of-service attack, because the crawler requests hundreds of pages at the same time.
The load eventually drops back to normal, but the BIND service doesn't recover and stays down.
I have to connect to the server and start BIND (named) manually.
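(In my case that means logging in and running something like

    service named start

or /etc/init.d/named start, depending on the distro.)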
I'm going to install monit on the server so named gets restarted automatically, but that only treats the symptom; the crawler problem will persist.
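For the monit side I was thinking of a check roughly like this (the pidfile and init script paths are guesses on my part and vary by distro):

    check process named with pidfile /var/run/named/named.pid
        start program = "/etc/init.d/named start"
        stop program = "/etc/init.d/named stop"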
Any ideas on how to deal with it?
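The only lead I've found so far is that bingbot is supposed to honor a Crawl-delay directive in robots.txt, something like this (the 10-second value is just a guess on my part):

    User-agent: bingbot
    Crawl-delay: 10

But I don't know whether that alone is enough to keep the load down.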
Thanks in advance.