Why and how does this line work:
let guess: f64 = "42".parse().expect("Not a number!");
But this does not?
let guess: i32 = "42.0".parse().expect("Not a number!");
Resulting in:
thread 'main' panicked at 'Not a number!: ParseIntError { kind: InvalidDigit }'
What is the correct way to parse a "float" &str into an integer?
Update:
I found this to work:
let guess: i32 = "42.0".parse::<f64>().expect("Not a number!") as i32;
However, I don't understand the mechanics of how it works, or whether it is the correct way to do it.
What you're calling is effectively let guess: i32 = "42.0".parse::<i32>(), because the i32 annotation on guess tells type inference which type to parse into. However, "42.0" is not a valid representation of an i32, so the parse fails with an InvalidDigit error.
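As a minimal illustration of that difference (the example values here are mine, not from the original question): an i32 only parses from an optional sign followed by decimal digits, so any string containing a '.' is rejected.

// "42" contains only digits, so it matches the format i32 expects and parses successfully.
let ok: Result<i32, _> = "42".parse();
assert_eq!(ok, Ok(42));

// "42.0" contains a '.', which i32 parsing rejects with an InvalidDigit error.
let err: Result<i32, _> = "42.0".parse();
assert!(err.is_err());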
According to the documentation:

Because parse is so general, it can cause problems with type inference. As such, parse is one of the few times you'll see the syntax affectionately known as the 'turbofish': ::<>. This helps the inference algorithm understand specifically which type you're trying to parse into.
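In other words, the turbofish and an explicit type annotation are two ways of giving parse the same information; a small sketch of the equivalence (variable names are mine):

// The annotation on the binding lets inference pick the target type...
let a: f64 = "42.0".parse().expect("Not a number!");
// ...while the turbofish names the target type directly on the call.
let b = "42.0".parse::<f64>().expect("Not a number!");
assert_eq!(a, b);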
The correct solution, which you've already found, is indeed to hint to the parser that the string is the representation of a float:
"42.0".parse::<f64>()