Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - idovecer

Pages: 1 [2]
How to / How to prevent bad bots (web crawlers) with mod security
« on: October 10, 2021, 11:07:05 AM »
I'm using apache + mod_security (with Comodo WAF rules):

1. Install mod_security
How to install here >
Optional: select the Comodo WAF rules (I use these rules: CWPanel -> Security -> ModSecurity -> Select Comodo WAF)

2. Check which web crawlers are the most common on your server
Command to list the top 100 user agents in your apache logs:
#cat /usr/local/apache/domlogs/*.log | awk -F\" '{print $6}' | sort | uniq -c | sort -nr | head -100

Short wiki about web crawlers:

3. Add rules in modsecurity to prevent some web bots / web crawlers
Add the rules below to the file /usr/local/apache/modesecurity-cwaf/custom_user.conf (this is the custom user rules file if you are using the Comodo WAF rules)

Code: [Select]
SecRule REQUEST_HEADERS:User-Agent "@contains blexbot" "id:'1000000',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains semrushbot" "id:'1000001',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains ahrefsbot" "id:'1000002',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains dotbot" "id:'1000003',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains mj12bot" "id:'1000004',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains barkrowler" "id:'1000005',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
SecRule REQUEST_HEADERS:User-Agent "@contains megaindex" "id:'1000006',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked. '"
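
The seven rules above could also be collapsed into a single rule with a regex alternation, which keeps the id range tidy; a sketch (id 1000010 is an arbitrary unused id here, pick one that is free in your setup; the `t:lowercase` transformation runs before the match, so lowercase names suffice):

```
SecRule REQUEST_HEADERS:User-Agent "@rx (blexbot|semrushbot|ahrefsbot|dotbot|mj12bot|barkrowler|megaindex)" "id:'1000010',t:none,t:lowercase,deny,nolog,msg:'BAD BOT - Detected and Blocked.'"
```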

4. Reload apache
Reload apache to load the updated mod_security custom rules:
#systemctl reload httpd.service

5. Check one of your domain logs
Check a log to see if your rules are valid and working; blocked bots must now get a 403 response (403 Forbidden error).
Example: #less /usr/local/apache/domlogs/
Code: [Select]
 - - [10/Oct/2021:13:00:08 +0200] "GET /page/ HTTP/1.1" 403 199 "-" "Mozilla/5.0 (compatible; SemrushBot/7~bl; +"
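
You can also trigger the rules yourself instead of waiting for a real crawler to come by; a sketch using curl (example.com is a placeholder for one of your own domains):

```shell
# Spoofed bad-bot User-Agent: the WAF should answer 403
curl -s -o /dev/null -w "%{http_code}\n" -A "SemrushBot/7~bl" http://example.com/
# Ordinary browser User-Agent: the request should pass through normally
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0" http://example.com/
```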

Apache / Re: Vulnerability apache 2.4.49 || (NVD)CVE-2021-41773
« on: October 06, 2021, 01:46:26 PM »
Yes, I also got a similar notification from my VPS today.
In my opinion the best solution is also to wait for the CWP team to upgrade cwp-httpd to 2.4.50; I hope it will be soon, in a day or two.

Dovecot / Dovecot auto restart (watchdog)
« on: July 23, 2021, 07:03:01 AM »
Due to a backup, the server was unfortunately briefly left without free space, and at that point the dovecot service stopped during the night, at around 00:57.

The server had free space again soon after the backup, and in the morning at 08:00, when I checked, there were 20 GB of free space on the server, but of all the services only dovecot was not running.

Does CentOS Web Panel have some watchdog that monitors stopped services so that it can autostart them? And how come dovecot wasn't started?


Code: [Select]
# systemctl status dovecot.service
dovecot.service - Dovecot IMAP/POP3 email server
Loaded: loaded (/usr/lib/systemd/system/dovecot.service; enabled; vendor preset: disabled)
Active: failed (Result: exit-code) since Pet 2021-07-23 00:57:49 CEST; 7h ago

Jul 23 00:56:01 imap ( Error: open (/var/vmail/ failed: No space left on device
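
If no watchdog is in place, systemd itself can restart dovecot on failure via a drop-in override; a sketch (check first whether CWP ships its own service monitor, so the two don't fight over the service):

```
# /etc/systemd/system/dovecot.service.d/restart.conf
# (create with: systemctl edit dovecot.service, then run: systemctl daemon-reload)
[Service]
Restart=on-failure
RestartSec=30s
```

Note that Restart=on-failure covers the exit-code failure shown above, but will not restart the service after a clean manual stop.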

E-Mail / Re: CWP is duplicating domains lines in
« on: July 15, 2021, 07:43:20 AM »
Did it help?

I noticed that this file is regenerated every night at 4:00 am, obviously via cron.
But there may be some bug in that generation, because again there are duplicate lines in and because of that isn't created. So again the annoying warning in the maillog:

"warning: database /etc/postfix/ is older than source file /etc/postfix/"

Does anyone know what is creating this file?

E-Mail / Re: CWP is duplicating domains lines in
« on: July 13, 2021, 06:54:57 AM »
Yes, I also noticed that. I sent them a question, but they do not respond with a solution if you don't pay.
It isn't a problem to pay to solve problems, but I noticed that when I pay they only solve my current problem on the server (a temporary fix), not future problems, let alone fix the underlying bug.
So you don't know what the problem was or whether it is fixed, and you can't be sure the problem will not occur again in a few weeks or months.
That's why I tried to find a temporary fix myself.

Anyway, there are maybe some bugs in the SSL creation and also in the creation and those duplicate lines.

I fixed it today by manually checking which lines were duplicated in the files, and then in the web panel >
WebServer settings > SSL certificates I selected Admin services for the first duplicate entry in and turned on the FTP and CPANEL services. There is a bug here because cpanel isn't actually added, just ftp; no matter, that fixed all the duplicates in and the is created.

Anyway, just renewing the SSL didn't help on the first attempt. On the second I requested the admin services cpanel and ftp.
For now it is fine.

E-Mail / Re: Maillog get error /etc/postfix/
« on: July 11, 2021, 10:53:58 AM »
I have a similar problem, but generating the database isn't possible with this command:
postmap /etc/postfix/

because I'm getting this:
postmap: warning: /etc/postfix/ duplicate entry: ""
postmap: warning: /etc/postfix/ duplicate entry: ""
postmap: warning: /etc/postfix/ duplicate entry: ""
postmap: warning: /etc/postfix/ duplicate entry: ""

And it is right, this file really does have duplicate entries, but why?
It is generated at night, so I guess there is some bug that is generating the duplicate lines.

Is this a bug?
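
As a workaround until the generator bug is found, the duplicate lines can be stripped before running postmap; a sketch on a scratch file (the real map's name is truncated in the log above, so /tmp/virtual.example is only a stand-in; adapt the path and back up the original first):

```shell
# Build a scratch map containing a duplicated entry
printf 'a@example.com ok\na@example.com ok\nb@example.com ok\n' > /tmp/virtual.example
# Keep only the first occurrence of each line, preserving order
awk '!seen[$0]++' /tmp/virtual.example > /tmp/virtual.example.dedup
cat /tmp/virtual.example.dedup
```

On the real file you would move the dedup result back over the map and run postmap on it; since the nightly cron may reintroduce the duplicates, this only buys time.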

Maybe your localhost/server is not allowed to send email.

Add permit_mynetworks to smtpd_recipient_restrictions, for example:

smtpd_recipient_restrictions =
    permit_mynetworks
    check_sender_access hash:/etc/postfix/sender_access

and add to the /etc/postfix/mynetworks file

Reload postfix with:
systemctl reload postfix.service

Other / Re: BING killing my server
« on: May 23, 2021, 02:43:16 PM »
The best way to check the number of spiders/crawlers hitting your server is with these commands:

cat /usr/local/apache/domlogs/*.log | awk -F\" '{print $6}' | sort | uniq -c | sort -n
cat /usr/local/apache/domlogs/specific-domain.log | awk -F\" '{print $6}' | sort | uniq -c | sort -n

Also, you can first clean all of your logs to get a "new/fresh" result.

Beware, this command truncates all logs to empty:
find /usr/local/apache/domlogs/ -type f -name "*.log" -exec truncate --size 0 {} \;

With this command you can first check which log files will be truncated:
find /usr/local/apache/domlogs/ -type f -name "*.log" -exec ls -la {} \;

CentOS 7 Problems / Re: SSL and Email Problem Cpanel to CWP Migration
« on: February 14, 2021, 11:46:29 AM »
It is possibly working for you but not for
All you need to do is add an A record in your DNS.

Remove CNAME for www and add A record for your domain with your IP address.

www 14400 IN A %ip%

Request SSL again.
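
After changing the records, you can verify what resolvers actually see; a sketch with dig (example.com is a placeholder for your domain):

```shell
# Both lookups should return your server's IP as an A record; a CNAME in the
# www answer means the old record is still cached or still present in the zone
dig +short A example.com
dig +short A www.example.com
```

DNS changes can take up to the old record's TTL to propagate, so re-check after a while before requesting the SSL again.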
