Allocating memory for a char array to concatenate

Posted 2019-07-23 19:54

I want to concatenate a piece of text, for example "The answer is ", with a signed integer, to give the output "The answer is 42".

I know how long the piece of text is (14 characters) but I don't know how many characters the string representation of the number will be.

I assume the worst-case scenario: the largest signed 16-bit integer has 5 digits, plus one extra character in case it is negative. Is the following code the correct way to do it?

#include <stdio.h>
#include <stdlib.h>

int main()
{
    char *message;

    message = malloc(14*sizeof(char)+(sizeof(int)*5)+1);

    sprintf(message, "The answer is %d", 42);

    puts(message);

    free(message);
}

6 Answers
Juvenile、少年°
Answer #2 · 2019-07-23 20:13

I think that the correct formula to get the maximum length of the decimal representation of an integer would be floor(log10(INT_MAX)) + 1; you could also abuse the preprocessor in this way:

#include <limits.h>
#define TOSTRING_(x) #x
#define TOSTRING(x) TOSTRING_(x)
/* ... */
#define YOUR_MESSAGE "The answer is "
char message[]=YOUR_MESSAGE "+" TOSTRING(INT_MAX);
sprintf(message+sizeof(YOUR_MESSAGE)-1,"%d", 42); /* -1: write over the '+' placeholder, not past the terminator */

This also avoids the heap allocation. You may want to use snprintf for better security, although with this method it shouldn't be necessary.

Another trick like that would be to create a function like this:

size_t GetIntMaxLength(void)
{
    const char dummy[]=TOSTRING(INT_MAX);
    return sizeof(dummy)+1; /* sizeof already counts the '\0'; +1 leaves room for a '-' sign */
}

If the compiler is smart enough it can optimize the dummy variable away from the compiled code entirely; otherwise it may be wise to declare it static to avoid reinitializing it every time the function is called.

戒情不戒烟
Answer #3 · 2019-07-23 20:15

A safe approximation for signed int is (number of digits including the potential - sign):

(CHAR_BIT * sizeof(int) + 1) / 3 + 1

The equivalent for unsigned is:

(CHAR_BIT * sizeof(unsigned) + 2) / 3

This calculates the number of digits - add one to both of them to account for the terminator, if allocating space for a null-terminated string.

This will slightly overestimate the space required for very long types (and will also overestimate in the unusual case where int has padding bits), but is a good approximation and has the advantage that it is a compile-time constant. CHAR_BIT is provided by <limits.h>.

我想做一个坏孩纸
Answer #4 · 2019-07-23 20:17

malloc((14 + 6 + 1) * sizeof(char));

  • 14 chars for the text
  • 6 for the digits + sign
  • 1 for the '\0'

Note: sizeof(int) gives you the size of the type in bytes: sizeof(int) == 4 if the int is 32 bits, 8 if it is 64 bits.

可以哭但决不认输i
Answer #5 · 2019-07-23 20:20

Not quite: you are allocating characters, not ints, so sizeof(int) does not belong in the calculation.

However, for easily maintainable and portable code, you should have something like:

#define TEXT "The answer is "
#undef CHARS_PER_INT
#if INT_MAX == 32767
    #define CHARS_PER_INT 6
#endif
#if INT_MAX == 2147483647
    #define CHARS_PER_INT 11
#endif
#ifndef CHARS_PER_INT
    #error Suspect system, I have no idea how many chars to allocate for an int.
#endif

int main (void) {
    char *message;

    message = malloc(sizeof(TEXT)+CHARS_PER_INT+1);
    sprintf(message, TEXT "%d", 42);
    puts(message);
    free(message);
    return 0;
}

This has a number of advantages:

  • If you change the string, you change one thing and one thing only. The argument to malloc adjusts automatically.
  • The expression sizeof(TEXT)+CHARS_PER_INT+1 is calculated at compile time. A solution involving strlen would have a runtime cost.
  • If you try to compile your code on a system with an int size this code doesn't anticipate, you'll be told about it (go fix the code).
  • You should actually allocate an extra character for the number since the biggest 16-bit number (in terms of character count) is -32768 (six characters long). You'll notice I still have a +1 on the end - that's because you need space for the string null terminator.
祖国的老花朵
Answer #6 · 2019-07-23 20:21

Use:

malloc(14*sizeof(char)   /* for the 14-char text */
       +(sizeof(char)*5) /* for the digits of the largest number */
       +1                /* for the sign of the number */
       +1                /* for the '\0' terminator */
      );

Since the digits will be represented as chars, you have to use sizeof(char), not sizeof(int).

甜甜的少女心
Answer #7 · 2019-07-23 20:21

One way of doing it (not necessarily recommended) that gives you the exact size of the number in characters is to use the stdio functions themselves.

For example, if you print the number somewhere (for whatever reason) before you allocate your memory, you can use the %n format specifier with printf. %n doesn't print anything; rather, you supply it with a pointer to int, and printf fills it in with the number of characters written so far.

Another example is snprintf, if you have it available. You pass it the maximum number of characters you want it to write to your string, and it returns the number of characters it would have written, not counting the final nul (or a negative value on error). So, calling it with a size of zero or a 1-byte dummy buffer, snprintf can tell you exactly how many characters your number needs.

A big advantage to using these functions is that if you decide to change the format of your number (leading 0's, padding spaces, octal output, long longs, whatever) you will not overrun your memory.

If you have GNU extensions to stdio, you may want to consider using asprintf. This is exactly like sprintf, except it does the memory allocation for you! No assembly required. (Although you do need to free it yourself.) But you shouldn't rely on it to be portable.
