Consider the following example:
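(The original code did not survive in this copy of the question; what follows is a minimal sketch consistent with the outputs and the update below. The file names, the helpers use1/use2, and the call order in main are my guesses; the essential structure — two translation units defining a class with the same name, UsedClass, whose inline member functions differ — is what the outputs imply.)

    // used1.h: first definition of UsedClass; member functions defined
    // inside the class body are implicitly inline
    #include <cstdio>

    class UsedClass
    {
    public:
        UsedClass() { std::printf("UsedClass 1 (%p) ", static_cast<void*>(this)); }
        void doit() { std::printf("doit hit\n"); }
    };

    // used2.h: a second, different definition of the same class name;
    // together with used1.h this violates the ODR
    #include <cstdio>

    class UsedClass
    {
    public:
        UsedClass() { std::printf("UsedClass 2 (%p) ", static_cast<void*>(this)); }
        void doit() { std::printf("doit hit\n"); }
    };

    // use1.cpp
    #include "used1.h"
    void use1() { UsedClass c; c.doit(); }

    // use2.cpp
    #include "used2.h"
    void use2() { UsedClass c; c.doit(); }

    // main.cpp
    void use1();
    void use2();
    int main() { use1(); use2(); return 0; }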
The code compiles and links without errors, but the output looks strange to me:
gcc (Red Hat 4.6.1-9) on Fedora x86_64, without optimization [EG1]:
UsedClass 1 (0x7fff0be4a6ff) doit hit
UsedClass 1 (0x7fff0be4a72e) doit hit
the same as [EG1], but with -O2 enabled [EG2]:
UsedClass 2 (0x7fffcef79fcf) doit hit
UsedClass 1 (0x7fffcef79fff) doit hit
msvc2005 (14.00.50727.762) on Windows XP 32-bit, without optimization [EG3]:
UsedClass 1 (0012FF5B) doit hit
UsedClass 1 (0012FF67) doit hit
the same as [EG3], but with /O2 (or /Ox) enabled [EG4]:
UsedClass 1 (0012FF73) doit hit
UsedClass 1 (0012FF7F) doit hit
I expected either a linker error (on the assumption that the ODR is violated) or the result in [EG2] (the code is inlined, nothing is exported from the translation units, and the ODR is preserved). So my questions are:
- Why are the outputs [EG1], [EG3], and [EG4] possible?
- Why do I get different results from different compilers, or even from the same compiler? This makes me think the standard leaves the behavior in this case undefined.
Thank you for any suggestions, comments, and interpretations of the standard.
Update
I would like to understand the compilers' behavior; more precisely, why no error is reported even though the ODR is violated. My hypothesis is that because all member functions in both definitions of UsedClass are inline (so C++03 §3.2 permits multiple definitions across translation units), the linker reports nothing; but in that case the outputs [EG1], [EG3], and [EG4] still seem strange.
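For comparison, a minimal sketch of why inline seems to be the key factor: if the same member function is instead defined out of line in two translation units, so that it is a non-inline function with external linkage, the linker does report the duplicate.

    // used1.cpp
    struct UsedClass { void doit(); };
    void UsedClass::doit() { }   // strong external symbol, definition #1

    // used2.cpp
    struct UsedClass { void doit(); };
    void UsedClass::doit() { }   // definition #2: "multiple definition of
                                 // UsedClass::doit()" at link time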