I want to concatenate a piece of text, for example "The answer is ", with a signed integer, to give the output "The answer is 42".
I know how long the piece of text is (14 characters) but I don't know how many characters the string representation of the number will be.
I assume the worst-case scenario: the largest signed 16-bit integer has 5 digits, plus one extra character in case it is negative. Is the following code the correct way to do it?
#include <stdio.h>
#include <stdlib.h>
int main(void)
{
    char *message;
    message = malloc(14 * sizeof(char) + (sizeof(int) * 5) + 1);
    sprintf(message, "The answer is %d", 42);
    puts(message);
    free(message);
}
I think the correct formula to get the maximum length of the decimal representation of an integer would be floor(log10(INT_MAX)) + 1; you could also abuse the preprocessor by stringifying INT_MAX, which gives you that length as a compile-time constant and also avoids the heap allocation. You may want to use snprintf for better security, although with this method it shouldn't be necessary.
Another trick along the same lines would be to write a small helper function that formats the number into a dummy buffer and returns the length. If the compiler is smart enough, it could completely sweep the dummy variable away from the compiled code; otherwise it may be wise to declare that variable static, so it isn't recreated every time the function is called.
A safe approximation for signed int is (number of digits, including the potential minus sign):

(CHAR_BIT * sizeof(int) - 1) / 3 + 2

The equivalent for unsigned is:

(CHAR_BIT * sizeof(int)) / 3 + 1

These calculate the number of digits; add one to each of them to account for the terminator, if allocating space for a null-terminated string.

This will slightly overestimate the space required for very long types (and will also overestimate in the unusual case where int has padding bits), but it is a good approximation and has the advantage of being a compile-time constant. CHAR_BIT is provided by <limits.h>.
malloc((14 + 6 + 1) * sizeof(char));

Note: sizeof(int) gives you the size of the type in bytes - sizeof(int) == 4 if int is 32 bits, 8 if it is 64 bits.
Not quite: you only need a number of characters, so sizeof(int) is not required. However, for easily maintainable and portable code, you should have something like a TEXT macro for the message text and a CHARS_PER_INT constant for the worst-case number, allocating sizeof(TEXT) + CHARS_PER_INT + 1 bytes. This has a number of advantages:

- If you ever change the text, the size passed to malloc adjusts automatically.
- sizeof(TEXT) + CHARS_PER_INT + 1 is calculated at compile time; a solution involving strlen would have a runtime cost.
- CHARS_PER_INT can be set to 6, enough for the worst 16-bit case, -32768 (six characters long).

You'll notice I still have a +1 on the end - that's because you need space for the string's null terminator.
Since the digits will be represented as chars, you have to use sizeof(char) instead of sizeof(int).
One way of doing it (not necessarily recommended) that gives you the exact size of the number in characters is to use the stdio functions themselves.

For example, if you print the number (somewhere, for whatever reason) before you allocate your memory, you can use the %n format specifier with printf. %n doesn't print anything; rather, you supply it with a pointer to int, and printf fills that in with how many characters have been written so far.

Another example is snprintf, if you have it available. You pass it the maximum number of characters you want it to write to your string, and it returns the number of characters it should have written, not counting the final nul (or a negative value on error). So, using a 1-byte dummy string, snprintf can tell you exactly how many characters your number is.

A big advantage of using these functions is that if you decide to change the format of your number (leading zeros, padding spaces, octal output, long longs, whatever), you will not overrun your memory.
If you have GNU extensions to stdio, you may want to consider using asprintf. This is exactly like sprintf, except it does the memory allocation for you! No assembly required. (Although you do need to free the result yourself.) But you shouldn't rely on it to be portable.