I can do this:
let a: [f32; 3] = [0.0, 1.0, 2.0];
But why doesn't this work?
let a: [f32; _] = [0.0, 1.0, 2.0];
It seems to me that the length is redundant and trivial to infer. Is there a way to avoid having to specify it explicitly? (And without having to append f32 to all the literals.)
_ can only be used in two contexts: in patterns, to match a value to ignore, and as a placeholder for a type. In array types, the length is not a type, but an expression, and _ cannot be used in expressions.

What you can do, though, is append f32 to only one of the literals and omit the type annotation completely. Since all the items of an array must have the same type, the compiler will infer the correct element type for the whole array.
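For example, something like this should compile, with both the element type and the length inferred:

let a = [0.0f32, 1.0, 2.0]; // inferred as [f32; 3]
assert_eq!(a.len(), 3);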