Why is Swift Decimal Returning Number from String

Published 2019-08-20 06:05

Question:

I am working with Swift's Decimal type, trying to ensure that a user-entered String is a valid Decimal.

I have two String values in my Playground file, each containing a letter: one has the letter at the start, the other at the end. I initialize a Decimal from each value, and only one initialization fails: the Decimal initialized from the value with the letter at the beginning.

Why does the Decimal initialized from the value with the letter at the end return a valid Decimal? I expected nil to be returned.

Attached is a screenshot from my Playground file.

Answer 1:

It works this way because Decimal's string initializer scans numeric characters from the start of the string and stops at the first character that is not part of a number. A letter therefore acts as a terminator for the scan rather than invalidating the whole string. So in your example:

12a = 12   ( a is the terminator in position 3 )
a12 = nil  ( a is the terminator in position 1 )
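The behavior above can be sketched in a Playground (the constant names here are illustrative, not from the question):

```swift
import Foundation

// Decimal(string:) parses from the left and stops at the first
// non-numeric character, so a trailing letter is simply ignored.
let trailing = Decimal(string: "12a")  // parses "12", stops at "a"
let leading  = Decimal(string: "a12")  // stops immediately at "a"

print(trailing as Any)  // Optional(12)
print(leading as Any)   // nil
```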

If you want the conversion to fail whenever the string contains a letter anywhere, you could use Float (or Double) instead; their String initializers require the entire string to be a valid number.
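As a quick sketch of that stricter behavior, Float's failable String initializer rejects both inputs:

```swift
// Float(_:) must consume the entire string, so any stray
// letter makes the initializer return nil.
let trailing = Float("12a")  // nil — letter at the end still fails
let leading  = Float("a12")  // nil
let clean    = Float("12")   // 12.0

print(trailing as Any)  // nil
print(leading as Any)   // nil
print(clean as Any)     // Optional(12.0)
```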