hammering the server looking for a particular page on a customer's site that no longer exists.
In 2 days, we noticed 3 IPs that hit the same page and received a 404 error
over 740,000 times.
Running a query such as:
Code:
cat /usr/local/apache/logs/error_log | awk '{print $8}' | sort | uniq -c | sort -n
...
22353 xxx.xxx.x.x]
216689 xx.xxx.xx.xx]
501219 xx.xxx.xx.xxx]
The number on the left is the number of times the IP address on the right (obscured with x's) has hit that page.
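For what it's worth, a similar count can be pulled from the access_log instead, where the 404 status is explicit. This is only a rough sketch, assuming the standard combined log format (client IP in field 1, status code in field 9) and that the access_log sits alongside the error_log:
Code:
awk '$9 == 404 {print $1}' /usr/local/apache/logs/access_log | sort | uniq -c | sort -n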
I'm wondering if there is a rule that can be created in either csf (or even mod_security) that would watch for 404 errors and, if the number of 404s from a single IP address reaches, say, 1000, block that IP?
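In the meantime, the kind of thing I have in mind could probably be cobbled together as a cron job. This is only a rough sketch, assuming the combined log format again (IP in field 1, status in field 9), a hypothetical log path, and csf's command line, where csf -d adds an IP to the deny list:
Code:
#!/bin/sh
# Rough sketch: deny any IP that has generated more than THRESHOLD 404s.
# Assumes combined log format and that csf is installed and on the PATH.
LOG=/usr/local/apache/logs/access_log
THRESHOLD=1000

# Count 404s per client IP, then deny any IP over the threshold.
awk '$9 == 404 {print $1}' "$LOG" | sort | uniq -c | \
while read count ip; do
    if [ "$count" -gt "$THRESHOLD" ]; then
        # csf -d adds the IP to csf.deny and blocks it in the firewall
        csf -d "$ip" "More than $THRESHOLD 404 errors"
    fi
done
Obviously a proper lfd/mod_security rule that tracks this continuously would be better than re-scanning the log, which is why I'm asking.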
I'll also ask the people at gotroot.com to see if they can come up with something.
Thanks.
Peter