The number of characters needed clearly depends on the value: if time_stamp_for_file_name is 0, you only need 2 bytes. If in doubt, you can use snprintf, which tells you how much space you need:
    #include <stdio.h>
    #include <stdlib.h>

    int len = snprintf(0, 0, "%ld", (long)time_stamp_for_file_name) + 1;
    char *tmp = malloc(len);
    if (tmp == 0) {
        /* handle allocation failure */
    }
    snprintf(tmp, len, "%ld", (long)time_stamp_for_file_name);
Beware of implementations where snprintf returns -1 for insufficient space rather than the required space.
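If you have to cope with such an implementation, one option is to fall back to a fixed size when snprintf reports failure. A minimal sketch of that variant, reusing the variables from the snippet above (the fallback size of 32 is an arbitrary choice, not from the question):

    int len = snprintf(0, 0, "%ld", (long)time_stamp_for_file_name) + 1;
    if (len <= 0) {
        len = 32;   /* arbitrary fallback, comfortably enough for any long */
    }
    char *tmp = malloc(len);
    if (tmp != 0) {
        snprintf(tmp, len, "%ld", (long)time_stamp_for_file_name);
    }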
As Paul P says, you can define a fixed upper bound based on the size of long on your implementation, which lets you avoid dynamic allocation entirely. For instance:
    #define LONG_LEN (((sizeof(long)*CHAR_BIT)/3)+2)
(this relies on the fact that the base-2 logarithm of 10 is greater than 3, so dividing the bit count by 3 over-estimates the number of decimal digits). The +2 gives you 1 for the minus sign and 1 to cover the integer division rounding down. You will need one more for the nul terminator.
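Used together with a stack buffer, that might look like this (a sketch; time_stamp_for_file_name is the variable from the question):

    #include <limits.h>   /* CHAR_BIT */
    #include <stdio.h>

    #define LONG_LEN (((sizeof(long)*CHAR_BIT)/3)+2)

    char tmp[LONG_LEN + 1];   /* +1 for the nul terminator */
    sprintf(tmp, "%ld", (long)time_stamp_for_file_name);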
Or:
    #include <limits.h>   /* LONG_MIN */
    #include <stdio.h>

    #define STRINGIFY(ARG) #ARG
    #define EXPAND_AND_STRINGIFY(ARG) STRINGIFY(ARG)
    #define VERBOSE_LONG EXPAND_AND_STRINGIFY(LONG_MIN)
    #define LONG_LEN sizeof(VERBOSE_LONG)

    char tmp[LONG_LEN];
    sprintf(tmp, "%ld", (long)time_stamp_for_file_name);
VERBOSE_LONG may be a slightly longer string than you really need: on my compiler it is (-2147483647L-1). I'm not sure whether LONG_MIN is allowed to expand to something like a hexadecimal literal or a compiler intrinsic, but if it can, the expansion might be too short and this trick won't work. It's simple enough to unit-test, though.
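A quick sanity check might format LONG_MIN, the longest value %ld can produce, and assert that it fits. A sketch, assuming the LONG_LEN macro defined above (the function name is illustrative):

    #include <assert.h>
    #include <limits.h>
    #include <stdio.h>

    static void test_long_len(void)
    {
        char buf[LONG_LEN];
        int n = snprintf(buf, sizeof buf, "%ld", LONG_MIN);
        assert(n >= 0 && (size_t)n < sizeof buf);   /* value fits, nul included */
    }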
If you want a tight upper bound to cover all the possibilities within the standard, up to a certain limit, you could try something like this:
    #if LONG_MAX <= 2147483647L
        #define LONG_LEN 11
    #else
        #if LONG_MAX <= 4294967295L
            #define LONG_LEN 11
        #else
            #if LONG_MAX <= 8589934591L
            ... etc, add more clauses as new architectures are invented with bigger longs
            #endif
        #endif
    #endif
But I doubt it is worth it: it's better to simply define it in some portability header and manually configure it for new platforms.
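Such a header might look roughly like this (a sketch only; the header name and the 64-bit case are illustrative, and the values count characters without the terminating nul, so add one when sizing the buffer):

    /* long_len.h -- hypothetical portability header */
    #ifndef LONG_LEN_H
    #define LONG_LEN_H

    #include <limits.h>

    #if LONG_MAX <= 2147483647L
        #define LONG_LEN 11   /* "-2147483648" */
    #else
        #define LONG_LEN 20   /* "-9223372036854775808"; assumes long is at most 64 bits */
    #endif

    #endif /* LONG_LEN_H */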