Why does everybody typedef over standard C types?

Tags: C++, C, Stdint

C++ Problem Overview


If you want to use Qt, you have to embrace quint8, quint16 and so forth.

If you want to use GLib, you have to welcome guint8, guint16 and so forth.

On Linux there are u32, s16 and so forth.

uC/OS defines SINT32, UINT16 and so forth.

And if you have to use some combination of those things, you had better be prepared for trouble, because on your machine u32 will be typedef'd over long, quint32 will be typedef'd over int, and the compiler will complain.
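A minimal sketch of that kind of clash, assuming a target where both int and long are 32 bits wide (the typedefs below are illustrative stand-ins, not the real Qt or kernel definitions):

    typedef unsigned long u32;      // one library's idea of a 32-bit type
    typedef unsigned int  quint32;  // another library's idea of a 32-bit type

    void store(quint32 *dst, quint32 value) { *dst = value; }

    int main() {
        u32 counter = 0;
        // store(&counter, 42);  // error: cannot convert u32* (unsigned long*) to quint32* (unsigned int*)
        quint32 tmp = counter;   // the copy/convert dance that mixing the libraries forces on you
        store(&tmp, 42);
        counter = tmp;
        return 0;
    }

Both types hold exactly the same values, but to the type system they are unrelated, so every boundary between the two libraries needs a conversion.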

Why does everybody do this, if there is <stdint.h>? Is this some kind of tradition for libraries?

C++ Solutions


Solution 1 - C++

stdint.h didn't exist back when these libraries were being developed. So each library made its own typedefs.
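As a rough sketch, a pre-1999 library header might have carried something like the following; the names and the limits.h test are illustrative only, and real libraries usually keyed the choice off compiler and OS macros instead:

    #ifndef MYLIB_TYPES_H
    #define MYLIB_TYPES_H

    #include <limits.h>

    typedef unsigned char  myu8;    /* assumes an 8-bit char  */
    typedef unsigned short myu16;   /* assumes a 16-bit short */

    #if UINT_MAX == 0xFFFFFFFF
    typedef unsigned int   myu32;   /* int is 32 bits on this target */
    #else
    typedef unsigned long  myu32;   /* otherwise fall back to long   */
    #endif

    #endif /* MYLIB_TYPES_H */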

Solution 2 - C++

For the older libraries, this is needed because the header in question (stdint.h) didn't exist.

There is still a problem, however: the exact-width types (uint64_t and the others) are an optional feature of the standard. A conforming implementation might not ship with them, which is why libraries may still provide their own even today.
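When an exact-width type does exist, the header is required to define the matching limit macro, so its presence can be tested at preprocessing time. A hedged sketch of the fallback a library might use (mylib_u64 is an invented name):

    #include <cstdint>

    #if defined(UINT64_MAX)
    typedef std::uint64_t mylib_u64;        // exact 64-bit type is available
    #else
    typedef std::uint_least64_t mylib_u64;  // always present, at least 64 bits wide
    #endif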

Solution 3 - C++

stdint.h has been standardised since 1999. More likely, many applications define (effectively alias) their own types to maintain partial independence from the underlying machine architecture.

They give developers confidence that the types used in their application match their project-specific assumptions about behavior, assumptions that may not be guaranteed by either the language standard or the compiler implementation.
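A minimal sketch of how such a project-level alias can carry its assumptions with it, using the invented name app_id_t; the static_asserts fail the build on any platform where the project's expectations do not hold:

    #include <cstdint>
    #include <type_traits>

    using app_id_t = std::uint32_t;   // project-specific alias, illustrative only

    static_assert(sizeof(app_id_t) == 4, "app_id_t must be exactly 4 bytes");
    static_assert(std::is_unsigned<app_id_t>::value, "app_id_t must be unsigned");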

The practice is mirrored in the object-oriented Façade design pattern, and it is much abused by developers who invariably write wrapper classes for all imported libraries.

When compilers were much less standardized and machine architectures could vary from 16-bit and 18-bit word lengths up to 36-bit mainframes, this was much more of a consideration. The practice is much less relevant now, in a world converging on 32-bit ARM embedded systems, but it remains a concern for low-end microcontrollers with odd memory maps.

Solution 4 - C++

So you have the power to typedef char to int.

One "coding horror" mentioned that one companies header had a point where a programmer wanted a boolean value, and a char was the logical native type for the job, and so wrote typedef bool char. Then later on someone found an integer to be the most logical choice, and wrote typedef bool int. The result, ages before Unicode, was virtually typedef char int.

Quite a lot of forward-thinking, forward compatibility, I think.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Amomum | View Question on Stackoverflow
Solution 1 - C++ | Edward Karak | View Answer on Stackoverflow
Solution 2 - C++ | Ven | View Answer on Stackoverflow
Solution 3 - C++ | Pekka | View Answer on Stackoverflow
Solution 4 - C++ | Christos Hayward | View Answer on Stackoverflow