Author Topic: BING killing my server  (Read 6729 times)

0 Members and 1 Guest are viewing this topic.

BING killing my server
« on: June 04, 2019, 03:51:39 AM »
Is there a way to prevent search engine crawlers such as BING from killing my server?
It has happened twice this week.
It took me a while to figure out what was going on, but when BING crawls one of my sites, the server's load average goes through the roof. It is effectively a denial-of-service attack, since it crawls hundreds of pages at the same time.
The server eventually returns to normal, but the BIND service doesn't recover and stays down.
I have to connect to the server and start BIND (named) manually.
I will install monit on the server so the service gets restarted automatically, but the underlying crawler problem will persist.
Any ideas on how to deal with it?
Thanks in advance.

Re: BING killing my server
« Reply #1 on: June 04, 2019, 11:47:52 AM »
They are killing web servers as well; there have been many issues recently because of that. One option is to block their IP ranges with CSF.

Check the current connections:
sh /scripts/net_show_connections
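If that CWP helper script is not available, a rough equivalent with standard tools is to count established connections per remote IP (a sketch; assumes netstat is installed, use ss on newer systems):

```shell
# Count remote IPs with the most open TCP/UDP connections to this server.
# Column 5 of netstat's numeric output is the foreign address:port pair;
# cut strips the port so connections from the same IP are grouped.
netstat -ntu | awk 'NR>2 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head
```

A crawler hammering the box will show up at the top of this list with an unusually high count.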

Then block the offending IP ranges (substitute a real IP or range for the placeholder):
csf -d <ip-or-range>
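To decide which range to deny, a quick sketch is to rank client IPs across all the domain logs; the csf line below is commented out because the range shown is only a placeholder, so substitute whatever the counts reveal (157.55.39.x is one range bingbot is commonly reported to crawl from):

```shell
# Rank the busiest client IPs across every Apache domain log ($1 is the
# client address in the combined log format).
awk '{print $1}' /usr/local/apache/domlogs/*.log | sort | uniq -c | sort -rn | head

# csf -d 157.55.39.0/24   # placeholder example: deny one /24 with CSF
```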

Re: BING killing my server
« Reply #2 on: May 23, 2021, 02:43:16 PM »
The best way to check the number of spiders/crawlers hitting your server is to count User-Agent strings in the Apache logs:

cat /usr/local/apache/domlogs/*.log | awk -F\" '{print $6}' | sort | uniq -c | sort -n
cat /usr/local/apache/domlogs/specific-domain.log | awk -F\" '{print $6}' | sort | uniq -c | sort -n
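What the -F\" split does: in the combined log format, the sixth field when the line is split on double quotes is the User-Agent string, so bingbot shows up by name in the counts. A small demonstration with a sample log line (the IP and timestamp are made up):

```shell
# A typical combined-format access log line.
line='1.2.3.4 - - [04/Jun/2019:03:51:39 +0200] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"'

# Splitting on the double quote makes $6 the User-Agent.
printf '%s\n' "$line" | awk -F\" '{print $6}'
```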

You can also empty all of your logs first, so the counts reflect only fresh traffic.

Beware, this command truncates every log to empty:
find /usr/local/apache/domlogs/ -type f -name "*.log" -exec truncate --size 0 {} \;
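A safer variant of the truncate above (a sketch, same domlogs path) is to keep a dated gzip copy of each log before emptying it, so the traffic history is not lost:

```shell
# For each domain log: save a dated compressed backup, then empty the live file.
for f in /usr/local/apache/domlogs/*.log; do
  gzip -c "$f" > "$f.$(date +%F).gz"   # dated compressed copy, e.g. site.log.2021-05-23.gz
  truncate --size 0 "$f"               # then reset the live log to zero bytes
done
```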

With this command you can first preview which logs the truncate will touch (the original -exec ls -if 0 form was broken, as it passed a literal "0" as a file name):
find /usr/local/apache/domlogs/ -type f -name "*.log" -exec ls -lh {} \;