For an exercise I'm doing for Exercism (the minesweeper task), I need to convert a usize to a char in order to insert it into a std::string::String.
To describe the problem in minimal lines of code:
let mut s = String::from(" ");
let mine_count: usize = 5; // This is returned from a method and will be a value between 1 and 8.
s.insert(0, _______); // So I get: "5 "
At the underscores, the way I'm currently doing this is:
mine_count.to_string().chars().nth(0).unwrap(); // For example: '2'
Or see the full example in the Rust Playground. Somehow this doesn't strike me as elegant.
I've also tried:
mine_count as char; // where mine_count is of type u8
However, when adding mine_count to a std::string::String, it turns up as, for example, \u{2} and not simply '2':
let mine_count: u8 = 8;
s.insert(0, mine_count as char);
println!("{:?}", s);
The output:
"\u{8} "
Reproduced here.
Are there other ways to achieve the goal of converting an integer in the range 1..8 to a single character (char)?
Use a lookup table:
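The code that went with this suggestion isn't reproduced above; a minimal sketch of the lookup-table idea, reusing the question's variables (the LOOKUP name is my own), could look like this:
fn main() {
    // Table indexed by the digit itself: LOOKUP[5] == '5', and so on.
    const LOOKUP: [char; 9] = ['0', '1', '2', '3', '4', '5', '6', '7', '8'];

    let mine_count: usize = 5;
    let mut s = String::from(" ");
    s.insert(0, LOOKUP[mine_count]); // indexing with an out-of-range value would panic
    assert_eq!(s, "5 ");
}
The table keeps the conversion trivial, though an out-of-range mine_count panics instead of producing an error value.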
This is the difference between a char containing the scalar value 2 and a char containing the actual character '2'. The first few Unicode code points, as in the ASCII text encoding, are reserved for control characters and do not portray anything visible. It appeared as \u{2} in this context because you printed the string with debug formatting ({:?}). If you print the same string with plain formatting ({}), the output will contain something that wasn't meant to be printed, and so it might either show a placeholder character or not appear at all (reproducible here).
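For illustration, here is a small sketch that reuses the question's u8 example and prints the same string with both formatters:
fn main() {
    let mine_count: u8 = 8;
    let mut s = String::from(" ");
    s.insert(0, mine_count as char); // this is U+0008 (backspace), a control character, not '8'

    println!("{:?}", s); // debug formatting escapes it: "\u{8} "
    println!("{}", s);   // plain formatting emits the raw control character, which has no visible glyph
}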
In order to represent a single-digit number as the respective character: (1) first make sure that mine_count is within the intended limits, either through recoverable errors or hard assertions; (2) then transform the number by translating it into the numeric digit character domain (Playground).
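As a sketch of step (1), assuming the 1 to 8 range from the question (the function names here are illustrative, not from the original answer):
// Recoverable variant: report an out-of-range value instead of panicking.
fn checked_mine_count(mine_count: usize) -> Option<usize> {
    if (1..=8).contains(&mine_count) {
        Some(mine_count)
    } else {
        None
    }
}

// Hard-assertion variant: treat an out-of-range value as a programming error.
fn assert_mine_count(mine_count: usize) {
    assert!((1..=8).contains(&mine_count), "mine_count must be between 1 and 8");
}

fn main() {
    assert_eq!(checked_mine_count(5), Some(5));
    assert_eq!(checked_mine_count(9), None);
    assert_mine_count(5); // assert_mine_count(9) would panic
}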
I suggest using char::from_digit together with the cast necessary to call it (as u32):
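The snippet that accompanied this suggestion isn't shown above; a minimal sketch of the approach, combining both steps, might look like this:
fn main() {
    let mine_count: usize = 5;

    // Step (1): enforce the expected range.
    assert!((1..=8).contains(&mine_count), "mine_count must be between 1 and 8");

    // Step (2): char::from_digit takes a u32, hence the cast; it returns None
    // when the value is not a valid digit in the given radix.
    let digit = char::from_digit(mine_count as u32, 10).expect("value is a single digit");

    let mut s = String::from(" ");
    s.insert(0, digit);
    assert_eq!(s, "5 ");
}
The radix argument is explicit because from_digit also handles bases other than 10.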