I'm trying to read an unknown number of integers with this piece of code:
while (1) {
    int c = getchar ();
    if (c == EOF)
        break;
    else if (isdigit (c))
        current = current * 10 + (c - '0');
    else {
        total += current;
        current = 0;
    }
}
I know what current = current * 10 + (c - '0'); does, but I don't know why c - '0' is there. Could you explain it to me? Thank you in advance.
c - '0' is a rudimentary way of turning a single ASCII digit into an integer. For example, if c is equal to '9', then its integer value is 0x39. The ASCII value for '0' is 0x30, so 0x39 - 0x30 == 0x09, which is equal to the integer value 9.
Here is the ASCII table for digits:
chr hex dec
'0' 0x30 48
'1' 0x31 49
'2' 0x32 50
'3' 0x33 51
'4' 0x34 52
'5' 0x35 53
'6' 0x36 54
'7' 0x37 55
'8' 0x38 56
'9' 0x39 57
The character '0' does not have ASCII value 0. Fortunately, however, the character '1' does have ASCII value '0' + 1, and so forth through '2', '3', '4', '5', '6', '7', '8', and '9'. Therefore (pseudo-code) 'n' - '0' == n holds for each digit.
The digits are represented in ASCII by increasing values: '0' is 48, '1' is 49, and so on. Since characters in C are just integers in disguise, you can subtract the ASCII value of '0' from any other digit character to obtain the numerical value of the digit. For example, '1' - '0' is the same as 49 - 48, which is 1.
C specifies the 10 decimal digits, not surprisingly, as
0 1 2 3 4 5 6 7 8 9
C11 §5.2.1 further states: "In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous."
Thus when assessing a string for characters that are digits, the language guarantees that subtracting '0' from a decimal digit char will result in its integer value:
if (isdigit (c)) {
    int value = c - '0';
    /* use value ... */
}
This is not dependent on char using ASCII.