BOOL with 64-bit on iOS

iOS, Objective-C

iOS Problem Overview


When I use BOOL for 32-bit, I get:

BOOL b1 = 8960; // b1 == NO
bool b2 = 8960; // b2 == true

But for 64-bit, I get:

BOOL b1 = 8960; // b1 == YES
bool b2 = 8960; // b2 == true

What has changed about BOOL from 32-bit to 64-bit?

iOS Solutions


Solution 1 - iOS

@TimBodeit is right, but his answer doesn't explain why ...

BOOL b1 = 8960; // b1 == NO

... evaluates to NO on 32-bit iOS but to YES on 64-bit iOS. Let's start from the beginning.

ObjC BOOL definition

#if (TARGET_OS_IPHONE && __LP64__)  ||  (__ARM_ARCH_7K__ >= 2)
#define OBJC_BOOL_IS_BOOL 1
typedef bool BOOL;
#else
#define OBJC_BOOL_IS_CHAR 1
typedef signed char BOOL; 
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C" 
// even if -funsigned-char is used.
#endif

For 64-bit iOS or ARMv7k (Apple Watch), BOOL is defined as bool; everywhere else it's signed char.
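If you want to check which branch applies to your build, a minimal sketch (the printout wording is mine) can test the OBJC_BOOL_IS_BOOL macro from the definition above:

#import <objc/objc.h>
#include <stdio.h>

int main(void) {
#if OBJC_BOOL_IS_BOOL
    printf("BOOL is bool\n");         // 64-bit iOS, ARMv7k
#else
    printf("BOOL is signed char\n");  // 32-bit iOS and older targets
#endif
    return 0;
}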

ObjC BOOL YES and NO

Read Objective-C Literals, where you can find:

> Previously, the BOOL type was simply a typedef for signed char, and YES and NO were macros that expand to (BOOL)1 and (BOOL)0 respectively. To support @YES and @NO expressions, these macros are now defined using new language keywords in <objc/objc.h>:

#if __has_feature(objc_bool)
#define YES __objc_yes
#define NO  __objc_no
#else
#define YES ((BOOL)1)
#define NO  ((BOOL)0)
#endif

> The compiler implicitly converts __objc_yes and __objc_no to (BOOL)1 and (BOOL)0. The keywords are used to disambiguate BOOL and integer literals.
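To see what this buys you in practice, here is a minimal sketch (the variable names are mine) showing YES as a plain BOOL value next to @YES as a boxed NSNumber literal:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        BOOL flag = YES;         // expands to __objc_yes, i.e. (BOOL)1
        NSNumber *boxed = @YES;  // boolean literal, same as [NSNumber numberWithBool:YES]
        NSLog(@"%d %@", flag, boxed);  // prints: 1 1
    }
    return 0;
}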

bool definition

bool is a macro defined in stdbool.h that expands to _Bool, a boolean type introduced in C99. It can store only two values, 0 or 1. Nothing else. More precisely, stdbool.h defines four macros:

/* Don't define bool, true, and false in C++, except as a GNU extension. */
#ifndef __cplusplus
#define bool _Bool
#define true 1
#define false 0
#elif defined(__GNUC__) && !defined(__STRICT_ANSI__)
/* Define _Bool, bool, false, true as a GNU extension. */
#define _Bool bool
#define bool  bool
#define false false
#define true  true
#endif

#define __bool_true_false_are_defined 1

_Bool

_Bool was introduced in C99 and it can hold the values 0 or 1. What's important is:

> When a value is demoted to a _Bool, the result is 0 if the value equals 0, and 1 otherwise.
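You can watch this demotion rule in action in plain C; here's a small sketch (the values are arbitrary):

#include <stdbool.h>
#include <stdio.h>

int main(void) {
    bool b1 = 8960;  // nonzero, so it demotes to 1
    bool b2 = 0;     // equals 0, so it stays 0
    printf("%d %d\n", b1, b2);  // prints: 1 0
    return 0;
}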

Now we know where this mess comes from and we can better understand what's going on.

64-bit iOS || ARMv7k

BOOL -> bool -> _Bool (values 0 or 1)

Demoting 8960 to _Bool gives 1, because the value doesn't equal 0 (see the _Bool section above).

32-bit iOS

BOOL -> signed char (values -128 to 127).

If you store int values in the range -128 to 127 in a signed char, the value is unchanged per C99 6.3.1.3. Otherwise the result is implementation-defined (quoting C99):

> Otherwise, the new type is signed and the value cannot be represented in it; either the result is implementation-defined or an implementation-defined signal is raised.

That means clang gets to decide. In short, with the default settings clang wraps the value around (int -> signed char):

  • -129 becomes 127,
  • -130 becomes 126,
  • -131 becomes 125,
  • ...

And in the opposite direction:

  • 128 becomes -128,
  • 129 becomes -127,
  • 130 becomes -126,
  • ...

Because the wrap-around happens modulo 256, a signed char can end up holding 0 as well: for example, 256 (int) becomes 0 (signed char). And when your value 8960 is wrapped around ...

  • 8960 becomes 0,
  • 8961 becomes 1,
  • 8959 becomes -1,
  • ...

... it becomes 0 when stored in a signed char (8960 is a multiple of 256: 8960 % 256 == 0), and thus it's NO. The same applies to 256, 512, and every other multiple of 256.
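The same arithmetic is easy to verify in plain C. A small sketch (keep in mind the conversion is implementation-defined; this shows clang's default wrap-around behavior):

#include <stdio.h>

int main(void) {
    signed char c1 = (signed char)8960;  // 8960 % 256 == 0  ->  0
    signed char c2 = (signed char)8961;  //                      1
    signed char c3 = (signed char)8959;  //                     -1
    printf("%d %d %d\n", c1, c2, c3);    // prints: 0 1 -1
    return 0;
}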

I strongly recommend sticking to YES and NO with BOOL and not relying on fancy C features like using an int as a condition in an if statement. That's the reason Swift has Bool, true, and false, and doesn't let you use Int values where a Bool is expected: just to avoid this mess.
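For example, a comparison like b == YES (a hypothetical sketch; the value 2 stands in for any non-0/1 value) behaves differently across architectures:

#import <Foundation/Foundation.h>

int main(void) {
    BOOL b = 2;  // 32-bit: signed char stores 2; 64-bit: demotes to 1 (YES)

    if (b)        NSLog(@"taken on both architectures");
    if (b == YES) NSLog(@"taken on 64-bit only, because 2 != 1");
    return 0;
}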

Solution 2 - iOS

Under 32-bit, BOOL is a signed char, whereas under 64-bit it is a bool.


Definition of BOOL from objc.h:

/// Type to represent a boolean value.
#if (TARGET_OS_IPHONE && __LP64__)  ||  TARGET_OS_WATCH
#define OBJC_BOOL_IS_BOOL 1
typedef bool BOOL;
#else
#define OBJC_BOOL_IS_CHAR 1
typedef signed char BOOL; 
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C" 
// even if -funsigned-char is used.
#endif

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type     | Original Author | Original Content on Stackoverflow
Question         | weijia.wang     | View Question on Stackoverflow
Solution 1 - iOS | zrzka           | View Answer on Stackoverflow
Solution 2 - iOS | Tim Bodeit      | View Answer on Stackoverflow