GCC's demangling API cannot demangle its own exported symbols

I am trying to use GCC's abi::__cxa_demangle to demangle the symbols exported from an object file produced by g++. However, I always get the error

mangled_name is not a valid name under the C++ ABI mangling rules

This is what I call the function:

    std::string demangled(std::string const& sym) {
        std::unique_ptr<char, void(*)(void*)> name{
            abi::__cxa_demangle(sym.c_str(), nullptr, nullptr, nullptr),
            std::free};
        return {name.get()};
    }

(Error handling omitted; it is present in the full online demo.)
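For reference, a minimal sketch of the same helper with the status argument checked, assuming only <cxxabi.h> and the standard library (the name demangled_checked is made up here); status -2 is what corresponds to the "not a valid name" error quoted above:

    #include <cxxabi.h>
    #include <cstdlib>
    #include <memory>
    #include <string>

    std::string demangled_checked(std::string const& sym) {
        int status = 0;
        std::unique_ptr<char, void(*)(void*)> name{
            abi::__cxa_demangle(sym.c_str(), nullptr, nullptr, &status),
            std::free};
        // status: 0 = ok, -1 = allocation failure,
        //        -2 = not a valid mangled name, -3 = invalid argument
        if (status != 0 || !name)
            return sym;  // fall back to the raw symbol on failure
        return name.get();
    }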

The symbols I tested come from this small piece of code:

    namespace foo {
        template <typename T>
        struct bar {
            bar() { }
        };
    }

    void baz(int x) { }

    template struct foo::bar<int>;

via g++ -c test.cpp; nm test.o | cut -d ' ' -f3:

    EH_frame1
    __Z3bazi
    __ZN3foo3barIiEC1Ev
    __ZN3foo3barIiEC2Ev

I'm not quite sure what the point of the GCC API is if it cannot demangle these symbols. It can, however, successfully demangle C++ typeid names. For instance, in the test code typeid(foo::bar<int>*).name() gives PN3foo3barIiEE, which in turn is demangled correctly by the function above.
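To illustrate that working case, a small sketch (the exact string returned by typeid(...).name() is implementation-defined; the value in the comment is what GCC produces, matching the question, and it relies on foo::bar and demangled() defined above):

    #include <iostream>
    #include <typeinfo>

    int main() {
        char const* mangled = typeid(foo::bar<int>*).name();
        std::cout << mangled << '\n';             // PN3foo3barIiEE on GCC
        std::cout << demangled(mangled) << '\n';  // foo::bar<int>*
    }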

Am I doing something wrong? How can I demangle the symbols exported from a GCC object file?

1 answer

Your symbols have one underscore too many at the front. I'm not sure why, but if you run them through c++filt.js it reports the same thing: they are not valid. Itanium ABI symbols don't start with two underscores, but with one. In this case I'd say it is the nm output that is wrong, not the demangle function. The Itanium ABI spells this out, and I know that Clang emits only a single underscore.
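If the extra underscore is indeed the problem (on Mach-O targets the assembler-level names that nm prints carry one more leading underscore than the Itanium-mangled name), a minimal sketch of a workaround is to strip it before demangling. The helper name here is made up and it reuses demangled() from the question:

    // Drop the extra leading underscore that nm shows on Mach-O,
    // so "__Z3bazi" becomes the valid Itanium name "_Z3bazi".
    std::string demangle_nm_symbol(std::string const& sym) {
        if (sym.compare(0, 3, "__Z") == 0)
            return demangled(sym.substr(1));
        return demangled(sym);
    }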

You know, it really says something that I can almost read Itanium ABI mangled names by now. Too much time spent reading Clang's LLVM IR output.


Source: https://habr.com/ru/post/949127/
