Passing bool between C and C++

When designing data structures to be passed through a C API that connects C and C++ code, is it safe to use bool? That is, if I have a struct like this:

 struct foo {
     int bar;
     bool baz;
 };

Is it guaranteed that the size and value representation of baz, as well as its position inside foo, are interpreted the same way by C (where it is a _Bool) and by C++?

We plan to do this on a single platform (GCC for Debian 8 on a Beaglebone), with the C and C++ code compiled by the same version of GCC (as C99 and C++11, respectively). General comments are also welcome.

+46
c++ c
Oct 13 '16 at 12:00
4 answers

C's and C++'s bool are different types, but as long as you stick to the same compiler (in your case, gcc), it should be safe; this is a reasonable and common scenario.

In C++, bool has always been a keyword. C did not have one until C99, which introduced the keyword _Bool (since people had typedef'd or #define'd bool as int or char in C89 code, adding bool directly as a keyword would have broken existing code); there is a stdbool.h header which, in C, #defines bool to _Bool. Take a look for yourself; GCC's implementation is as follows:

 /*
  * ISO C Standard: 7.16 Boolean type and values <stdbool.h>
  */
 #ifndef _STDBOOL_H
 #define _STDBOOL_H
 
 #ifndef __cplusplus
 
 #define bool  _Bool
 #define true  1
 #define false 0
 
 #else /* __cplusplus */
 
 /* Supporting <stdbool.h> in C++ is a GCC extension.  */
 #define _Bool bool
 #define bool  bool
 #define false false
 #define true  true
 
 #endif /* __cplusplus */
 
 /* Signal that all the definitions are present.  */
 #define __bool_true_false_are_defined 1
 
 #endif /* stdbool.h */

This suggests that, at least in GCC, the two types are meant to be compatible (both in size and in alignment, so that struct layout stays the same).
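If you want more than a hunch, the layout can be checked at build time. A minimal sketch (assuming C11 for _Static_assert; on the C++11 side the equivalent checks use static_assert) — the expected sizes and offsets are properties of the ABIs discussed here, not language guarantees:

```c
/* Compile-time layout checks on the C side (C11 _Static_assert;
 * in C++11 the same checks can be written with static_assert).
 * The expected values hold on the Itanium/ARM ABIs discussed here,
 * not by language guarantee. */
#include <stdbool.h>
#include <stddef.h>   /* offsetof */

struct foo { int bar; bool baz; };

_Static_assert(sizeof(bool) == 1, "bool expected to be one byte");
_Static_assert(offsetof(struct foo, baz) == sizeof(int),
               "baz expected immediately after bar, no extra padding");
```

If either assumption is wrong for a given target, the build fails instead of the two sides silently disagreeing at runtime.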

It is also worth noting that the Itanium C++ ABI, which is used by GCC and most other compilers (except Visual Studio, as Matti M. noted in the comments below) on many platforms, specifies that _Bool and bool follow the same rules. This is a strong guarantee. A third hint comes from the Objective-C reference manual, which states that for Objective-C and Objective-C++, which follow the conventions of C and C++ respectively, bool and _Bool are equivalent. So I would say that, although the standards do not guarantee it, you can safely assume that yes, they are equivalent.

Edit:

If the standards do not guarantee compatibility between _Bool and bool (in size, alignment, and padding), what does?

When we say that these things are "architecture dependent", we actually mean that they are ABI dependent. Each compiler implements one or more ABIs, and two compilers (or two versions of the same compiler) are compatible if they implement the same ABI. Since calling C code from C++ is expected and ubiquitous, every C++ ABI I have ever heard of extends the local C ABI.

Since the OP asked about the Beaglebone, we should check the ARM ABI, in particular the GNU ARM EABI used by Debian. As Justin Time notes in the comments, the ARM ABI does state that the C++ ABI extends the C one, and that _Bool and bool are compatible: both have size 1 and alignment 1, represented as an unsigned byte. So the answer to the question is that on the Beaglebone, yes, _Bool and bool are compatible.

+45
Oct 13 '16 at 12:19

The language standards say nothing about this (I would be glad to be proven wrong about that; I could not find anything), so it cannot be safe if we restrict ourselves to the language standards alone. But if you are picky about which architectures you support, you can consult their ABI documentation to make sure it is safe.

For example, the amd64 ABI document has a footnote for the _Bool type which states:

This type is called bool in C ++.

I cannot interpret that in any way other than that they are compatible.

Also, just thinking about it: of course it will work. Compilers generate code that matches the ABI, and matches the behavior of the dominant compiler for the platform (where that behavior goes beyond the ABI). One of the great things about C++ is that it can link against libraries written in C, and as far as libraries are concerned they could have been compiled by any compiler on the same platform (which is why we have ABI documents in the first place). Could there be some slight incompatibility at some point? Sure, but that is something you are better off handling by filing a bug report with the compiler vendor than by working around it in your code. I doubt bool is something compiler writers would get wrong.
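This is also why the usual pattern for a C API consumed from C++ works at all: one header, compiled by both compilers, with extern "C" suppressing C++ name mangling so either side can link the other's object files. A hypothetical sketch (foo_api.h and foo_is_set are illustrative names, not from the question):

```c
/* foo_api.h -- a hypothetical shared header; the same declarations
 * compile as C and as C++, and extern "C" keeps the C++ compiler
 * from mangling the function name, so either side can link it. */
#ifndef FOO_API_H
#define FOO_API_H

#include <stdbool.h>   /* maps bool to _Bool in C; harmless in C++ */

#ifdef __cplusplus
extern "C" {
#endif

struct foo { int bar; bool baz; };

bool foo_is_set(const struct foo *f);   /* illustrative accessor */

#ifdef __cplusplus
}
#endif

#endif /* FOO_API_H */

/* Implementation, normally in a separate .c file compiled as C: */
bool foo_is_set(const struct foo *f) { return f->baz; }
```

The whole scheme only holds together if both compilers agree on the layout of struct foo, which is exactly what the ABI documents pin down.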

+15
Oct 13 '16 at 12:19

The only thing the C standard says about _Bool is:

An object declared as type _Bool is large enough to store the values 0 and 1.

This would mean that _Bool is sizeof(char) or larger (so true/false are guaranteed to be representable).

The exact size is implementation-defined, as Michael said in the comments. You are better off just running some tests on their sizes with the relevant compiler; if they match, and you stick to the same compiler, I would consider it safe.

+6
Oct 13 '16 at 12:20

As Gill Bates says, you have the problem that sizeof(bool) is compiler-dependent. There is no guarantee that even the same compiler will always treat it the same way. Compilers often have options to shorten enumerations when the values given for the enum fit in fewer bits, and something similar would not be surprising for bool. A compiler would even be within its rights (according to the C standard) to represent it as a single bit in a bit field, if it wanted to.

I personally experienced this when working with the TI OMAP-L138, a processor which combines a 32-bit ARM core and a 32-bit DSP core on the same device, with shared memory accessible to both. The ARM core represented bool as an int (32 bits here), while the DSP represented bool as a char (8 bits). To solve the problem, I defined my own type, bool32_t, for use with the shared-memory interface, knowing that a 32-bit value would work for both sides. Of course I could have defined it as an 8-bit value, but I reckoned there was less performance impact if I kept it at the core's native integer size.
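A sketch of that workaround (bool32_t is the name described above; the struct and macros are illustrative, not the actual project code):

```c
/* A fixed-width boolean for a shared-memory interface, so that two
 * cores with different native bool sizes agree on the layout. */
#include <stdint.h>

typedef int32_t bool32_t;            /* 0 = false, nonzero = true */
#define BOOL32_FALSE ((bool32_t)0)
#define BOOL32_TRUE  ((bool32_t)1)

struct shared_msg {                  /* illustrative shared record */
    int32_t  payload;
    bool32_t ready;                  /* always 4 bytes, on either core */
};
```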

If you do the same as I did, you can 100% guarantee binary compatibility between your C and C++ code. If you do not, you cannot. It really is that simple. With the same compiler your chances are very good, but there is no guarantee, and changing compiler options could suddenly break you again.

On a related subject, your int should also be int16_t, int32_t or another fixed-size integer type. (You must include stdint.h for these type definitions.) On the same platform it is unlikely to differ between C and C++, but using plain int at all is a firmware code smell. The exception is where you genuinely don't care how big an int is, but it should be clear that interfaces and structures must have it exactly defined. It is all too easy for programmers to make assumptions (which are often wrong!) about its size, and the results are typically catastrophic when they are wrong; even worse, they often don't show up in testing, where they could easily be found and fixed.
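Applied to the struct from the question, that advice looks something like this (a sketch; using uint8_t as the boolean stand-in is one common convention, not the only option):

```c
/* The interface struct with every field width pinned down, so the
 * layout no longer depends on what "int" or "bool" mean to a
 * particular compiler. */
#include <stdint.h>

struct foo_wire {
    int32_t bar;     /* exactly 32 bits everywhere */
    uint8_t baz;     /* 1-byte boolean stand-in: 0 or 1 */
};
```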

+5
Oct 13 '16 at 13:10


