I have a hapi.js app, and while checking some logs I found entries from automated site scanners probing paths like /admin.php.
I found this article, How to block automated scanners when crawling your site, and thought it was great.
I am looking for guidance on the best strategy for creating honeypots in a hapi.js / Node.js application to identify suspicious requests, log them, and possibly ban IP addresses temporarily.
Do you have any general recommendations, or specific ones for Node and hapi, for implementing this?
My thoughts include:
- Create a honeypot route with a non-obvious name
- Add a robots.txt that disallows this route for search engines.
- Create content for the route (see the article and its discussion for some recommendations).
- Write to a dedicated log, or tag log entries, for easy tracking and later analysis.
- Add logic so that an IP address that hits the honeypot route more than a certain threshold (say, 5 times) is denied for X hours or permanently; a rough sketch of this is below.
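To make this concrete, here is a minimal sketch of what I have in mind, assuming a hapi v17-style async API (`@hapi/hapi`); the route path, threshold, and log tag are placeholders I made up, and the counters live in an in-memory Map, so they reset on restart:

```js
'use strict';

const Hapi = require('@hapi/hapi');

const HONEYPOT_PATH = '/wp-login.php'; // placeholder; pick a non-obvious name
const THRESHOLD = 5;                   // honeypot hits before banning
const hits = new Map();                // ip -> hit count (in-memory only)
const banned = new Set();              // ips currently banned

const init = async () => {
    const server = Hapi.server({ port: 3000 });

    // Well-behaved crawlers should never reach the honeypot
    server.route({
        method: 'GET',
        path: '/robots.txt',
        handler: (request, h) =>
            h.response(`User-agent: *\nDisallow: ${HONEYPOT_PATH}\n`).type('text/plain')
    });

    server.route({
        method: '*',
        path: HONEYPOT_PATH,
        handler: (request, h) => {
            const ip = request.info.remoteAddress;
            const count = (hits.get(ip) || 0) + 1;
            hits.set(ip, count);

            // Tag the entry so honeypot traffic is easy to filter later
            request.log(['honeypot'], { ip, count, ua: request.headers['user-agent'] });

            if (count >= THRESHOLD) {
                banned.add(ip);
            }

            // Serve something plausible-looking back to the scanner
            return h.response('<html><body>Admin login</body></html>').type('text/html');
        }
    });

    await server.start();
};

init();
```

Note that the `banned` set above is not enforced anywhere yet; that is what my first question below is about.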
A few questions I have:
- How can I deny an IP address using hapi.js? (My current rough idea is sketched after this list.)
- Are there any other recommendations for identifying automated scanners?
- Do you have specific suggestions for implementing the honeypot?
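For the first question, my current rough idea is an extension at the `onRequest` lifecycle point, which as far as I understand is the earliest place hapi lets you short-circuit a request (again hapi v17-style; `banIp`, `registerIpBan`, and the in-memory Map are names and structures I made up):

```js
'use strict';

const banned = new Map(); // ip -> ban expiry timestamp in ms

// Called from the honeypot handler once an IP crosses the threshold
const banIp = (ip, durationMs) => banned.set(ip, Date.now() + durationMs);

const registerIpBan = (server) => {
    server.ext('onRequest', (request, h) => {
        const ip = request.info.remoteAddress;
        const expiry = banned.get(ip);

        if (expiry !== undefined) {
            if (Date.now() < expiry) {
                // End the request lifecycle right away with a 403
                return h.response('Forbidden').code(403).takeover();
            }
            banned.delete(ip); // ban expired, let the request through
        }

        return h.continue;
    });
};

module.exports = { banIp, registerIpBan };
```

I am not sure this is idiomatic, or how it should behave behind a reverse proxy (where request.info.remoteAddress would be the proxy's address rather than the scanner's), so corrections are welcome.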
Thanks!