My program is a CLI network client (my first networking project), and so far I have been able to solve every problem I've run into. It decodes and writes 100% of the data correctly.
However, after running for an hour or two (the exact time has varied), it crashes and the Node process goes down. I suspect my heavy use of Node Buffers is the source of a leak, since they are allocated outside the V8 heap and, as far as I know, there is no way to control when they are garbage-collected. I use Buffers very heavily in my parsing code, on the chunks that the Node Socket "data" event hands me.
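To make that concrete, here is a minimal sketch of the kind of handler I mean; it is not my exact code, and the host, port, framing scheme, and identifiers like `handleFrame` are placeholders:

```js
const net = require('net');

// Sketch only: accumulate incoming Buffer chunks and slice frames out of them.
const socket = net.connect({ host: 'example.com', port: 9000 });

let pending = Buffer.alloc(0);

socket.on('data', (chunk) => {
  // Each 'data' event delivers a Buffer allocated outside the V8 heap.
  pending = Buffer.concat([pending, chunk]);

  // Hypothetical framing: the first byte gives the payload length.
  while (pending.length > 0 && pending.length >= 1 + pending[0]) {
    const frameLen = pending[0];
    const frame = pending.slice(1, 1 + frameLen); // slice() shares memory with `pending`
    handleFrame(frame);
    pending = pending.slice(1 + frameLen);
  }
});

function handleFrame(frame) {
  // The real decoding/writing happens here.
  console.log('frame', frame.toString('hex'));
}
```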
Nothing else is going on that could cause a leak. Also, memory usage seems to stay at roughly 10 MB for the first hour and a half or so, and then it starts to grow rapidly.
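For reference, this is roughly how the V8 heap can be watched separately from off-heap Buffer memory while the client runs; `process.memoryUsage()` is a standard Node API, and the interval and formatting here are just an example:

```js
// Periodically log memory so heap growth can be told apart from
// external (off-heap Buffer) growth.
setInterval(() => {
  const { rss, heapUsed, external } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';
  console.log(`rss=${mb(rss)} heapUsed=${mb(heapUsed)} external=${mb(external)}`);
}, 60 * 1000);
```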
My questions are: are there any ways to track down and fix Buffer leaks? Are there any quirks of Buffers that I should be aware of? Are there any obvious lines of action?