What are some methods or tools for profiling code bloat in C/C++ applications?

I have a C++ library that generates much larger code than I would expect from what it does. From less than 50 thousand lines of source, I get shared objects that are almost 4 MB and static archives pushing 9 MB. This is problematic because binary distributions of the library are quite large, and, even worse, even simple applications linking against it typically gain 500 to 1000 KB in code size. Compiling the library with flags such as -Os helps somewhat, but not much.

I have also experimented with GCC's -frepo option (although all the documentation I have seen suggests that on Linux collect2 will merge duplicate template instantiations anyway) and with explicit instantiation of templates that seemed "likely" to be duplicated a lot, with no real effect either way. Of course, I say "seemed likely" because, as with any kind of profiling, blind guessing like this is almost always wrong.
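For reference, the explicit instantiation attempts looked roughly like this (the types and file names here are placeholders, not my real code):

    // bigtemplates.cpp -- compiled once into the library; Widget is illustrative only.
    #include <map>
    #include <string>
    #include <vector>

    struct Widget { int id; double value; };

    // Explicit instantiation definitions: force these specializations to be
    // emitted in this one translation unit...
    template class std::vector<Widget>;
    template class std::map<std::string, Widget>;

    // ...while other translation units only declare them, e.g.
    // extern template class std::vector<Widget>;   // C++0x / GCC extension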

Are there any tools that make code-size profiling easier, or any other way I can figure out what is taking up so much space, or, more generally, any other things I should try? Something that works under Linux would be ideal, but I'll take what I can get.

+5
c++ gcc linux code-size
Oct 13 '09 at 17:09
3 answers

If you want to know what ended up in your executable, ask your tools. Turn on the ld linker's --print-map (or -M) option to produce a map file showing what it placed in memory and where. Doing this for a statically linked build is probably the most informative.

If you do not invoke ld directly but only go through the gcc command line, you can pass linker-specific options through from the gcc command line by prefixing them with -Wl,
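For example (file and output names here are placeholders):

    # Ask the linker to write a map file while linking through g++.
    g++ main.o libfoo.a -Wl,-Map,output.map -o app

    # Or print the map to standard output and capture it:
    g++ main.o libfoo.a -Wl,--print-map -o app > map.txt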

+7
Oct 13 '09 at 17:56

On Linux, the linker does merge multiple instantiations of the same template.

Make sure you are not measuring debug binaries (debug information may take up more than 75% of the final binary size).

One method to reduce the final binary size is to compile with -ffunction-sections and -fdata-sections, then link with -Wl,--gc-sections.
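A minimal sketch of that (library and file names are placeholders):

    # Put each function and data item in its own section...
    g++ -Os -ffunction-sections -fdata-sections -fPIC -c foo.cpp bar.cpp

    # ...then let the linker discard the sections nothing references.
    g++ foo.o bar.o -Wl,--gc-sections -shared -o libfoo.so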

An even larger reduction (we saw 25%) may be possible if you use the development version of gold (the new ELF-only linker, part of binutils) and link with -Wl,--icf
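Note that in released versions of gold the flag takes an argument, so assuming gold is installed alongside the default linker the invocation would look something like this (file names are placeholders):

    # Identical Code Folding with gold: --icf=all folds any identical functions,
    # --icf=safe is more conservative. -fuse-ld=gold needs a reasonably new GCC.
    g++ foo.o bar.o -fuse-ld=gold -Wl,--icf=all -shared -o libfoo.so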

Another useful technique is to reduce the set of symbols that are "exported" by your shared libraries (everything is exported by default), either via __attribute__((visibility(...))) or with a linker version script. Details here (see "Export Control").
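A rough sketch of the attribute-based approach (the macro and function names are made up for illustration):

    // libfoo_api.h -- hypothetical public header.
    // Build the library with -fvisibility=hidden so symbols are hidden by
    // default, then mark only the public entry points as visible.
    #define LIBFOO_API __attribute__((visibility("default")))

    LIBFOO_API int foo_do_work(int input);   // exported from the .so

    int foo_internal_helper(int input);      // stays hidden, not exported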

+2
Oct 14 '09 at 3:12

One very crude but very quick approach is to look at the sizes of your object files. Not all of the code in the object files will make it into the final binary, so there may be a few false positives, but it can give a good feel for where the hot spots are. Once you have found the largest object files, you can dig into them with tools such as objdump and nm.
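For instance (the paths are placeholders):

    # Largest object files first.
    ls -lS build/*.o | head

    # Largest symbols within one object file, with demangled C++ names.
    nm --size-sort --demangle build/widget.o | tail -20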

+1
Oct 13 '09 at 20:51


