I am developing an application on a 64-bit Linux system. Looking at my process's memory statistics, I see a large amount of dirty heap memory. In the context of the heap, what does "dirty" mean? What causes it to accumulate, and what can be done to prevent it?
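(For reference, the figure I am looking at is the Private_Dirty counter that /proc/self/smaps reports for the [heap] mapping. Below is a minimal sketch of how it can be sampled from inside the process; the helper name is just for illustration, and allocations served from anonymous mmap regions, such as thread arenas, are not counted by it.)

```cpp
#include <cctype>
#include <fstream>
#include <sstream>
#include <string>

// Sums the Private_Dirty counters of the [heap] mapping, in kB.
long heap_private_dirty_kb() {
    std::ifstream smaps("/proc/self/smaps");
    std::string line;
    bool in_heap = false;
    long total_kb = 0;
    while (std::getline(smaps, line)) {
        if (!line.empty() && !std::isupper(static_cast<unsigned char>(line[0]))) {
            // Mapping header line (starts with a lowercase hex address range);
            // remember whether this mapping is the [heap] segment.
            in_heap = line.find("[heap]") != std::string::npos;
        } else if (in_heap && line.compare(0, 14, "Private_Dirty:") == 0) {
            std::istringstream fields(line.substr(14));
            long kb = 0;
            fields >> kb;       // smaps reports the value in kB
            total_kb += kb;
        }
    }
    return total_kb;
}
```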
EDIT
Let me explain in more detail what my application does.
My application runs two threads: the first thread pushes jobs onto a queue, and the second thread pops and executes them. Specifically, the first thread selects the compressed pages that should be queued, and the second thread decompresses them, performs its work on them, and frees them. All queue operations are thread-safe.
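Since my real code is too large to post, here is a stripped-down sketch of the structure; Job, JobQueue, and all member names are placeholders, not my actual types:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Placeholder job type: a compressed page plus whatever the worker needs.
struct Job {
    std::vector<char> compressed_page;
};

class JobQueue {
public:
    void push(Job job) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(job));
        }
        cv_.notify_one();
    }

    // Blocks until a job is available; returns false once the queue
    // has been closed and fully drained.
    bool pop(Job& job) {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty() || closed_; });
        if (queue_.empty()) return false;
        job = std::move(queue_.front());
        queue_.pop();
        return true;
    }

    void close() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            closed_ = true;
        }
        cv_.notify_all();
    }

private:
    std::queue<Job> queue_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool closed_ = false;
};
```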
So I ran a test of this setup: I queued 100,000,000 jobs and let them all execute. Up to a point, memory usage keeps increasing. Then, when the enqueuing finishes and only the dequeuing work remains, memory usage inexplicably fails to decrease. Finally, once every job has been dequeued and executed, all of the memory is freed. So the apparent leak seems tied to the dequeuing phase, since everything is released once it completes, but I cannot find anything wrong in my code.
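The test driver looks roughly like this, continuing the sketch above (the decompression and processing details are omitted):

```cpp
#include <cstdio>

int main() {
    JobQueue queue;

    // Consumer thread: pops jobs until the queue is closed and drained.
    std::thread worker([&queue] {
        Job job;
        while (queue.pop(job)) {
            // decompress job.compressed_page and do the real work here;
            // the job's buffers are freed when `job` is reassigned
        }
    });

    // Producer: enqueue the test's 100,000,000 jobs.
    for (long i = 0; i < 100000000L; ++i) {
        queue.push(Job{/* compressed page bytes */});
    }
    queue.close();

    worker.join();
    std::printf("all jobs done\n");  // memory only drops back around here
    return 0;
}
```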
I know it would be better to post my actual code here, but it is too large. From what I have described, can anyone suggest what could be causing this?