18,000 MySQL injection attempts per day: stopping the attempts

This question is not about protecting against SQL injection attacks. That has been answered many times on Stack Overflow, and I have implemented those methods. It is about stopping the attempts.

Recently, my site was hit by a huge number of injection attempts. Right now, I'm trapping them and returning a static page.

This is what my URL looks like:

/products/product.php?id=1 

Here's what the attack looks like:

 /products/product.php?id=-3000%27%20IN%20BOOLEAN%20MODE%29%20UNION%20ALL%20SELECT%2035%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C%27qopjq%27%7C%7C%27ijiJvkyBhO%27%7C%7C%27qhwnq%27%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35--%20 

I know for sure that this is not just a bad link or a mistyped URL, so I do not want to send these visitors to the overview page. I also do not want to waste any of my site's resources serving them even static pages.

I am considering just killing the page with die(). Is there anything wrong with this approach? Or is there an HTTP response code I can set with PHP that would be more appropriate?

Edit:

Based on a few comments below, I looked into returning a 404 Page Not Found. This Stack Overflow answer by icktoofay suggests sending a 404 and then die(); the bot thinks the page does not exist and may even go away, and no further resources are spent displaying a "page not found" message.

 header("HTTP/1.0 404 Not Found"); die(); 
+6
4 answers

The standard tool for filtering out probable exploitation attempts is mod_security.

It may take quite a bit of work to configure it to recognize legitimate requests for your application.
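For a flavor of what such a rule looks like, here is a minimal sketch assuming ModSecurity 2.x syntax (the rule ID and pattern are illustrative; a real deployment would normally start from the OWASP Core Rule Set rather than hand-written rules):

    <IfModule security2_module>
        SecRuleEngine On
        # Deny any request whose parameters contain a UNION ... SELECT payload,
        # answering with the same 404 discussed in the question.
        SecRule ARGS "@rx (?i)union\s+all\s+select" \
            "id:100001,phase:2,deny,status:404,log,msg:'SQL injection attempt'"
    </IfModule>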

Another common method is to block the IP addresses of malicious clients when they are detected.
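For example, with stock Apache 2.2 that can be as simple as the sketch below (the directory path and addresses are placeholders; detecting the offenders and maintaining the list is up to you):

    <Directory "/var/www/products">
        Order Allow,Deny
        Allow from all
        # IPs observed sending injection payloads:
        Deny from 203.0.113.45
        Deny from 198.51.100.0/24
    </Directory>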

+6

You can try to stop this traffic before it reaches your server by using hardware. Most devices that do packet inspection can be useful here. I use an F5 for this purpose (among other things); F5 has its own scripting language called iRules, which gives excellent control and customization.

+1

The post was unlocked, so I thought I'd share what I've been doing to reduce attacks from the same IP address. I still get half a dozen a day, but they usually come only once or twice from each IP address.

Note: to return a 404, all of this has to run before any HTML is sent. I'm using PHP and redirecting all errors to an error log file.

    <?php
    require_once('mysql_database.inc');  // I'm using a database, so mysql_real_escape_string works.

    // I don't use any special characters in my product ID, but injection attacks do.
    // This helps trap them.
    $productID = htmlspecialchars(
        isset($_GET['id']) ? mysql_real_escape_string($_GET['id']) : '55'
    );

    // Product IDs are all numeric, so it's an invalid request if it isn't a number.
    if ( !is_numeric($productID) ) {
        $url = $_SERVER['REQUEST_URI'];   // Track which page is under attack.
        $ref = $_SERVER['HTTP_REFERER'];  // Show the referrer in case I have a bad link on one of my own pages.
        $ip  = $_SERVER['REMOTE_ADDR'];   // See if they are coming from the same place each time.

        // Strip spaces just in case they typed the URL and have an extra space in it.
        $productID = preg_replace('/[\s]+/', '', $productID);

        if ( !is_numeric($productID) ) {
            error_log("Still a long string in products.php after replacement: URL is $url and IP is $ip & ref is $ref");
            header("HTTP/1.0 404 Not Found");
            die();
        }
    }

I also have many pages that display different content depending on the selected category. In those cases I have a series of if statements, such as if ($cat == 'Speech') {}. There is no database lookup, so there is no possibility of SQL injection, but I still want to stop the attacks rather than waste time rendering a default page for a bot. Typically a category is a short word, so I change the is_numeric test above to check the length of the string, e.g. if (strlen($cat) > 10). Since most attack strings are longer than 10 characters, this works very well.
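A minimal sketch of that length check, assuming a $cat query parameter as described (the 10-character limit and the log message are illustrative):

    <?php
    // Reject over-long category values before doing any other work.
    $cat = isset($_GET['cat']) ? $_GET['cat'] : '';

    // Legitimate categories are short words; injection payloads are long strings.
    if (strlen($cat) > 10) {
        error_log("Suspicious category value: URL is {$_SERVER['REQUEST_URI']} and IP is {$_SERVER['REMOTE_ADDR']}");
        header("HTTP/1.0 404 Not Found");
        die();
    }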

0

A very good question: +1 from me. And the answer is not simple.

PHP does not by itself persist data across different pages and different sessions, so you cannot throttle access by IP address unless you store the access information somewhere.

If you do not want to use a database connection for this, you can of course use the filesystem. I'm sure you already know how to do that, but here is an example you can look at:

    DL Script Archives: http://www.digi-dl.com/
    (click on "HomeGrown PHP Scripts", then on "IP/networking",
    then on "View Source" for the "IP Blocker with Time Limit" section)
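As one possible shape for that, here is a minimal sketch of file-based per-IP rate limiting with no database involved (the directory, window, and limit are assumptions for illustration):

    <?php
    // Hypothetical sketch: track request timestamps per IP in flat files.
    $dir     = '/tmp/ip_hits';   // must be writable by the web server
    $window  = 60;               // seconds
    $maxHits = 20;               // max requests per window

    if (!is_dir($dir)) {
        mkdir($dir, 0700, true);
    }

    $file = $dir . '/' . md5($_SERVER['REMOTE_ADDR']);
    $hits = is_file($file) ? unserialize(file_get_contents($file)) : array();

    // Keep only the hits inside the current window, then record this one.
    $now  = time();
    $hits = array_filter($hits, function ($t) use ($now, $window) {
        return $t > $now - $window;
    });
    $hits[] = $now;
    file_put_contents($file, serialize($hits), LOCK_EX);

    if (count($hits) > $maxHits) {
        header("HTTP/1.0 404 Not Found");
        die();
    }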

The best option would be mod_throttle. With it, you could limit each IP address to one access every five seconds by adding this directive to your Apache configuration file:

    <IfModule mod_throttle.c>
        ThrottlePolicy Request 1 5
    </IfModule>

But there is some bad news: the author of mod_throttle has abandoned the product:

 "Snert Apache modules currently CLOSED to the public until further notice. Questions as to why or requests for archives are ignored." 

Another Apache module, mod_limitipconn, is more commonly used these days. It does not allow arbitrary rate limits (such as "no more than ten requests in any 15 seconds"); all you can do is limit each IP address to a certain number of concurrent connections. Many webmasters argue that this is a good way to fight bot spam, but it is less flexible than mod_throttle.

You need different versions of mod_limitipconn, depending on which version of Apache you are using:

    mod_limitipconn.c for Apache 1.3:    http://dominia.org/djao/limitipconn.html
    mod_limitipconn.c (Apache 2.0 port): http://dominia.org/djao/limitipconn2.html
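Its configuration is short; a minimal example using the module's MaxConnPerIP directive (the limit of 3 is an arbitrary illustration):

    <IfModule mod_limitipconn.c>
        <Location />
            # No more than 3 simultaneous connections per client IP.
            MaxConnPerIP 3
        </Location>
    </IfModule>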

Finally, if your Apache server is hosted on a Linux machine, there is a solution that does not require recompiling anything: iptables firewall rules. This method is quite elegant and flexible enough to impose limits such as "no more than three connections from a given IP per minute". Here's how:

    Linux Noob forums - SSH rate limit per IP:
    http://www.linux-noob.com/forums/index.php?showtopic=1829
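As a rough sketch of that approach applied to HTTP rather than SSH, rules like these use the standard "recent" match to allow three new connections per source IP per minute and drop the rest (the port and the numbers are illustrative):

    # Track every new connection to port 80 per source IP.
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --set --name HTTP
    # Drop any source that opens a 4th new connection within 60 seconds.
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --update --seconds 60 --hitcount 4 --name HTTP -j DROP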

I realize that none of these options is perfect, but they illustrate what is possible. Perhaps using a local database will turn out to be the best approach after all. In any case, keep in mind that simply rate-limiting requests or capping bandwidth does not, by itself, solve the bot problem: the bots may take longer, but they will ultimately drain just as many resources as they would have without the slowdown. You have to actually reject their HTTP requests, not just delay or spread them out.

Good luck in the escalating battle between content and spam!

-1
