gdb memory usage when opening huge core dumps

When opening a core dump with gdb, will gdb try to load the full core dump into memory?

I found a 35 GB core dump on one of our staging systems. Our operating procedures call for creating a backtrace with gdb. I am afraid that gdb will try to load the full core dump into memory and render the staging system unusable by consuming all available memory.

We use gdb 7.0.1 on a RedHat EL 5 / 64-bit installation.
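
For reference, the backtrace step does not require an interactive session. A minimal sketch of a non-interactive run, assuming a hypothetical executable ./myapp and dump file core.12345 (substitute your own paths):

    # print the backtrace and exit without entering an interactive prompt
    gdb --batch -ex "bt" -ex "thread apply all bt" ./myapp core.12345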

+4
1 answer

When opening a core dump with gdb, will gdb try to load the full core dump into memory?

No.

I am afraid that gdb will try to load the full core dump into memory and render the staging system unusable by consuming all available memory.

Even without loading the entire core dump, GDB will still consume some memory, so it may affect your staging system. If you cannot afford that impact, you need to change your "operating procedures", e.g. move the core dump to another system and analyze it there. Beware: if your executable uses dynamic linking, the dynamic libraries on the analysis system must match exactly the ones in use when the core dump was created.
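
If you do move the core dump to another host, gdb there must be told where copies of the original libraries live; otherwise it resolves shared-library symbols against the analysis host's own (possibly different) libraries. A minimal sketch, assuming hypothetical paths /cores/myapp, /cores/core.12345, and a copy of the staging system's library tree under /cores/sysroot:

    gdb
    (gdb) set sysroot /cores/sysroot    # resolve shared libraries from the copied tree, not the local one
    (gdb) file /cores/myapp             # the executable that produced the dump
    (gdb) core-file /cores/core.12345   # load the core dump
    (gdb) bt                            # backtrace of the faulting thread

Setting sysroot before loading the core ensures the copied libraries are used from the start.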

+3

Source: https://habr.com/ru/post/1389254/

