How do you convert a character typed by the user to its corresponding decimal value? I'm just having trouble getting started.
The program has to do the following:
1. The program accepts a character from the keyboard.
2. If the character is a digit ('0' through '9'):
   a) Convert the character to its corresponding decimal value. In other words, '0' becomes 0, '1' becomes 1, ..., '9' becomes 9. Call that value R (for "run length").
   b) Wait for another character (using GETC).
   c) Print R copies of that character to the console.
   d) Go back to Step 1.
3. Else, if the character is Enter/Return (ASCII #10): print a linefeed (ASCII #10) to the console and go back to Step 1.
4. Else, if the character is anything else, halt the program.
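To make the target behavior concrete, here is a rough C sketch of the control flow I'm aiming for (getchar/putchar are just stand-ins for GETC and console output; this is the logic, not the assembly itself):

    #include <stdio.h>

    int main(void) {
        for (;;) {
            int c = getchar();                 /* Step 1: read a character (GETC) */
            if (c >= '0' && c <= '9') {
                int r = c - '0';               /* 2a: digit -> run length R */
                int ch = getchar();            /* 2b: wait for the character to repeat */
                for (int i = 0; i < r; i++)    /* 2c: print R copies of it */
                    putchar(ch);
            } else if (c == '\n') {            /* 3: Enter/Return (ASCII 10) */
                putchar('\n');
            } else {
                break;                         /* 4: anything else halts */
            }
        }
        return 0;
    }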
You convert a decimal digit character to its numeric value by subtracting '0' (= 0x30) from it. For the hex digits 'A' to 'F': if the character is greater than '@' (0x40), subtract 0x37 instead, so 'A' (0x41) becomes 0x0A. For the hex digits 'a' to 'f': if the value is still bigger than 15 after that, subtract a further 0x20, since lowercase letters sit 0x20 above their uppercase counterparts. Or you can use a table for the mapping; 256 bytes is not a very big table.
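A minimal C sketch of that branch sequence, assuming the input is already known to be a valid hex digit (the name hex_val is mine, not from any library):

    /* Map a character known to be a hex digit ('0'-'9', 'A'-'F', 'a'-'f')
       to its numeric value 0-15, using the subtraction trick above. */
    int hex_val(int c) {
        if (c > '@')        /* letters: 'A' (0x41) and above */
            c -= 0x37;      /* 'A' -> 0x0A, but 'a' -> 0x2A for now */
        else
            c -= '0';       /* '0' (0x30) -> 0, ..., '9' -> 9 */
        if (c > 15)         /* still too big: it was a lowercase letter */
            c -= 0x20;      /* 'a' -> 0x0A, ..., 'f' -> 0x0F */
        return c;
    }

The lookup-table alternative trades those 256 bytes for zero branches, which is especially natural in assembly: one add to index into the table and one load to fetch the value.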