Why does memset take an int instead of a char?

Tags: C, Memset

C Problem Overview


Why does memset take an int as the second argument instead of a char, whereas wmemset takes a wchar_t instead of something like long or long long?

C Solutions


Solution 1 - C

memset predates (by quite a bit) the addition of function prototypes to C. Without a prototype, you can't pass a char to a function -- when/if you try, it'll be promoted to int when you pass it, and what the function receives is an int.

It's also worth noting that in C, (but not in C++) a character literal like 'a' does not have type char -- it has type int, so what you pass will usually start out as an int anyway. Essentially the only way for it to start as a char and get promoted is if you pass a char variable.

In theory, memset could probably be modified so it receives a char instead of an int, but there's unlikely to be any benefit, and a pretty decent possibility of breaking some old code or other. With an unknown but potentially fairly high cost, and almost no chance of any real benefit, I'd say the chances of it being changed to receive a char fall right on the line between "slim" and "none".

Edit (responding to the comments): The CHAR_BIT least significant bits of the int are used as the value to write to the target.

Solution 2 - C

Probably the same reason why the functions in <ctype.h> take ints and not chars.

On most platforms, a char is too small to be pushed on the stack by itself, so one usually pushes the type closest to the machine's word size, i.e. int.

As the link in @Gui13's comment points out, doing that also increases performance.

Solution 3 - C

See Frédéric's answer (Solution 2): it's for performance reasons.

On my side, I tried this code:

#include <stdio.h>
#include <string.h>

int main(int argc, const char * argv[])
{
    char c = 0x00;

    printf("Before: c = 0x%02x\n", c);
    memset(&c, 0xABCDEF54, 1);   /* only the low byte, 0x54, is stored */
    printf("After:  c = 0x%02x\n", c);

    return 0;
}

And it gives me this on a 64-bit Mac:

Before: c = 0x00
After:  c = 0x54

So as you can see, only the low-order byte gets written. This is not actually an endianness effect: the standard requires memset to convert its int argument to unsigned char, so only the least significant byte is ever used, on any architecture.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: user541686 (View Question on Stackoverflow)
Solution 1 - C: Jerry Coffin (View Answer on Stackoverflow)
Solution 2 - C: Frédéric Hamidi (View Answer on Stackoverflow)
Solution 3 - C: Gui13 (View Answer on Stackoverflow)