Getting bool from C to C++ and back


C++ Problem Overview


When designing data structures that are to be passed through a C API connecting C and C++ code, is it safe to use bool? That is, if I have a struct like this:

struct foo {
  int bar;
  bool baz;
};

is it guaranteed that the size and meaning of baz as well as its position within foo are interpreted in the same way by C (where it's a _Bool) and by C++?

We are considering doing this on a single platform (GCC for Debian 8 on a Beaglebone) with both C and C++ code compiled by the same GCC version (as C99 and C++11, respectively). General comments are welcome as well, though.
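
For concreteness, such a struct usually lives in a header that is included from both languages. A minimal sketch of what that header might look like (the file name foo_api.h and the function process_foo are invented for illustration, not part of the original question):

/* foo_api.h: a minimal sketch of a shared header (names are illustrative) */
#ifndef FOO_API_H
#define FOO_API_H

#include <stdbool.h>   /* gives C99 the bool macro; a no-op for C++ in GCC */

#ifdef __cplusplus
extern "C" {
#endif

struct foo {
  int bar;
  bool baz;   /* _Bool when compiled as C, the built-in bool as C++ */
};

void process_foo(const struct foo *f);   /* hypothetical C-side function */

#ifdef __cplusplus
}
#endif

#endif /* FOO_API_H */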

C++ Solutions


Solution 1 - C++

C's and C++'s bool types are different but, as long as you stick to the same compiler (in your case, GCC), it should be safe, as this is a reasonably common scenario.

In C++, bool has always been a keyword. C didn't have one until C99, which introduced the keyword _Bool (people used to typedef or #define bool as int or char in C89 code, so adding bool itself as a keyword would have broken existing code). The header stdbool.h should, in C, #define bool to _Bool. Take a look at yours; GCC's implementation looks like this:

/*
 * ISO C Standard:  7.16  Boolean type and values  <stdbool.h>
 */

#ifndef _STDBOOL_H
#define _STDBOOL_H

#ifndef __cplusplus

#define bool        _Bool
#define true        1
#define false        0

#else /* __cplusplus */

/* Supporting <stdbool.h> in C++ is a GCC extension.  */
#define _Bool        bool
#define bool        bool
#define false        false
#define true        true

#endif /* __cplusplus */

/* Signal that all the definitions are present.  */
#define __bool_true_false_are_defined        1

#endif        /* stdbool.h */

This leads us to believe that, at least in GCC, the two types are compatible (in both size and alignment, so the struct layout remains the same).
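
If you would rather have the compiler verify these assumptions than trust the header, compile-time assertions can check size and layout on both sides. A sketch, assuming C11 for _Static_assert on the C side (the question targets C99, so there this check needs C11 mode or a GCC extension):

/* layout_check: compile once as C and once as C++; both must accept it */
#include <stdbool.h>
#include <stddef.h>   /* offsetof */

struct foo {
  int bar;
  bool baz;
};

#ifdef __cplusplus
static_assert(sizeof(bool) == 1, "bool expected to be one byte");
static_assert(offsetof(foo, baz) == sizeof(int), "unexpected padding before baz");
#else
_Static_assert(sizeof(bool) == 1, "bool expected to be one byte");
_Static_assert(offsetof(struct foo, baz) == sizeof(int), "unexpected padding before baz");
#endif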

Also worth noting: the Itanium C++ ABI, which is used by GCC and most other compilers (except Visual Studio, as noted by Matthieu M. in the comments below) on many platforms, specifies that _Bool and bool follow the same rules. This is a strong guarantee. A third hint comes from Objective-C's reference manual, which says that in Objective-C and Objective-C++, which respect C's and C++'s conventions respectively, bool and _Bool are equivalent. So I'd say that, though the standards do not guarantee it, you can assume that yes, they are equivalent.

Edit:

If the standard does not guarantee that _Bool and bool will be compatible (in size, alignment, and padding), what does?

When we say those things are "architecture dependent", we actually mean they are ABI dependent. Every compiler implements one or more ABIs, and two compilers (or versions of the same compiler) are compatible if they implement the same ABI. Since calling C code from C++ is expected and ubiquitous, every C++ ABI I've ever heard of extends the local C ABI.

Since the OP asked about the Beaglebone, we must check the ARM ABI, more specifically the GNU ARM EABI used by Debian. As noted by Justin Time in the comments, the ARM ABI indeed declares C++'s ABI to extend C's, and declares _Bool and bool compatible, both having size 1 and alignment 1 and representing the machine's unsigned byte. So the answer to the question is: on the Beaglebone, yes, _Bool and bool are compatible.

Solution 2 - C++

The language standards say nothing about this (I'm happy to be proven wrong; I couldn't find anything), so it can't be called safe if we limit ourselves to the language standards alone. But if you're picky about which architectures you support, you can find their ABI documentation and check whether it will be safe.

For example, the amd64 ABI document has a footnote for the _Bool type that says:

> This type is called bool in C++.

I can't interpret this in any way other than that the two will be compatible.

Also, just musing about this: of course it will work. Compilers generate code that follows both an ABI and the behavior of the dominant compiler for the platform (where that behavior goes beyond the ABI). A big thing about C++ is that it can link to libraries written in C, and a thing about libraries is that they can be compiled by any compiler on the same platform (this is why we have ABI documents in the first place). Can there be some minor incompatibility at some point? Sure, but that's something you'd better solve with a bug report to the compiler maker rather than work around in your code. I doubt bool is something compiler makers would screw up.
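
To make the linking scenario concrete, here is a sketch of the two translation units involved; the file and function names are invented for illustration:

/* check.c, compiled as C99: baz has type _Bool here */
#include <stdbool.h>

struct foo { int bar; bool baz; };

bool foo_is_valid(const struct foo *f) {
    return f->bar >= 0 && f->baz;
}

// main.cpp, compiled as C++11 and linked against check.c's object file.
// extern "C" turns off C++ name mangling so the linker finds foo_is_valid.
#include <cstdio>

struct foo { int bar; bool baz; };   // baz is the built-in C++ bool here

extern "C" bool foo_is_valid(const foo *f);

int main() {
    foo f{1, true};
    std::printf("%d\n", foo_is_valid(&f) ? 1 : 0);   // bool crosses the ABI boundary
    return 0;
}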

Solution 3 - C++

The only thing the C standard says about _Bool is:

> An object declared as type _Bool is large enough to store the values 0 and 1.

This means that _Bool is at least as large as a char, i.e. sizeof(_Bool) >= 1 (so true and false are guaranteed to be storable).

The exact size is implementation-defined, though, as Michael said in the comments. You're better off performing some tests on the sizes with the relevant compiler; if they match and you stick with that same compiler, I'd consider it safe.
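
A quick empirical check along those lines: compile the same snippet once with the C compiler and once with the C++ compiler, and compare the output:

/* size_probe: valid as both C99 and C++11 */
#include <stdbool.h>
#include <stdio.h>

struct foo { int bar; bool baz; };

int main(void) {
    printf("sizeof(bool)       = %u\n", (unsigned)sizeof(bool));
    printf("sizeof(struct foo) = %u\n", (unsigned)sizeof(struct foo));
    return 0;
}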

Solution 4 - C++

As Gill Bates says above, you do have a problem that sizeof(bool) is compiler-dependent in C. There's no guarantee that the same compiler will treat it the same in C and C++, or that they would be the same on different architectures. The compiler would even be within its rights (according to the C standard) to represent this as an individual bit in a bitfield if it wanted.

I've personally experienced this when working with the TI OMAP-L138 processor which combines a 32-bit ARM core and a 32-bit DSP core on the same device, with some shared memory accessible by both. The ARM core represented bool as an int (32-bit here), whereas the DSP represented bool as char (8-bit). To solve this, I defined my own type bool32_t for use with the shared memory interface, knowing that a 32-bit value would work for both sides. Of course I could have defined it as an 8-bit value, but I considered it less likely to affect performance if I kept it as the native integer size.
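
The pattern looked roughly like this (a sketch of the idea, not the original source; the constant names are invented):

/* shared_types.h: fixed-size boolean for the shared-memory interface */
#include <stdint.h>

typedef int32_t bool32_t;   /* 32 bits on both the ARM and the DSP core */
#define BOOL32_FALSE ((bool32_t)0)
#define BOOL32_TRUE  ((bool32_t)1)

struct shared_msg {
    int32_t  id;
    bool32_t ready;   /* same size and layout as seen from either core */
};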

If you do the same as I did then you can 100% guarantee binary compatibility between your C and C++ code. If you don't then you can't. It's really as simple as that. With the same compiler, your odds are very good - but there is no guarantee, and changing compiler options can easily screw you over in unexpected ways.

On a related subject, your int should also be replaced with int16_t, int32_t, or another fixed-size integer type. (Include stdint.h for these type definitions.) On the same platform it is highly unlikely that this will differ between C and C++, but using int in firmware is a code smell. The exception is in places where you genuinely don't care how long an int is, but interfaces and structures must have their sizes well-defined. It is too easy for programmers to make assumptions about its size (which are frequently incorrect!), and the results are generally catastrophic when it goes wrong; worse, such bugs often don't show up in testing, where you could easily find and fix them.
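
Applied to the struct from the question, that advice gives something like this (the field widths are chosen for illustration):

/* foo with fully specified field sizes */
#include <stdint.h>

struct foo {
    int32_t bar;   /* explicitly 32 bits on every platform */
    uint8_t baz;   /* boolean flag with a guaranteed one-byte representation */
};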

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: sbi (View Question on Stackoverflow)
Solution 1 - C++: paulotorrens (View Answer on Stackoverflow)
Solution 2 - C++: Art (View Answer on Stackoverflow)
Solution 3 - C++: Hatted Rooster (View Answer on Stackoverflow)
Solution 4 - C++: Graham (View Answer on Stackoverflow)