Hex String to Character in PURE Swift

Published 2020-03-02 03:46

Question:

I need a way to convert a string containing the textual representation of a hexadecimal value into the Character corresponding to that value.

Ideally, something along these lines:

let hexString: String = "2C"
let char: Character = fromHexString(hexString)
println(char)   // prints -> ","

I've tried the "\u{n}" syntax, where n is an Int or a String variable, and neither worked.

This could be used to loop over an array of hexStrings like so:

var hexArray = ["2F", "24", "40", "2A"]
var charArray = [Character]()
charArray = map(hexArray) { charArray.append(Character($0)) }
charArray.description // prints -> "[/, $, @, *]"

Answer 1:

A couple of things about your code:

var charArray = [Character]()
charArray = map(hexArray) { charArray.append(Character($0)) }

You don't need to create an array and then assign the result of the map; you can just assign the result directly and avoid creating an unnecessary array.

charArray = map(hexArray) { charArray.append(Character($0)) }

Here you can use hexArray.map instead of map(hexArray). Also, when you use a map function, you are conceptually mapping the elements of the receiver array to a new set of values, and the result of the mapping is the new "mapped" array. That means you don't need to call charArray.append inside the map closure.

Anyway, here is a working example:

let hexArray = ["2F", "24", "40", "2A"]
var charArray = hexArray.map { char -> Character in
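    // strtoul (from the C standard library) parses the string as an unsigned long in base 16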
    let code = Int(strtoul(char, nil, 16))
    return Character(UnicodeScalar(code))
}
println(charArray) // -> [/, $, @, *]

EDIT: This is another implementation that doesn't need Foundation:

func hexToScalar(char: String) -> UnicodeScalar {
    var total = 0
    for scalar in char.uppercaseString.unicodeScalars {
        if !(scalar >= "A" && scalar <= "F" || scalar >= "0" && scalar <= "9") {
            assertionFailure("Input is wrong")
        }

        if scalar >= "A" {
            // Int(scalar.value) converts UInt32 to Int so the arithmetic type-checks
            total = 16 * total + 10 + Int(scalar.value) - 65 /* 'A' */
        } else {
            total = 16 * total + Int(scalar.value) - 48 /* '0' */
        }
    }
    return UnicodeScalar(total)
}

let hexArray = ["2F", "24", "40", "2A"]
var charArray = hexArray.map { Character(hexToScalar($0)) }
println(charArray)

EDIT 2: Yet another option:

func hexToScalar(char: String) -> UnicodeScalar {
    let map = [ "0": 0, "1": 1, "2": 2, "3": 3, "4": 4, "5": 5, "6": 6, "7": 7, "8": 8, "9": 9,
        "A": 10, "B": 11, "C": 12, "D": 13, "E": 14, "F": 15 ]

    let total = reduce(char.uppercaseString.unicodeScalars, 0, { $0 * 16 + (map[String($1)] ?? 0xff) })
    if total > 0xFF {
        assertionFailure("Input char was wrong")
    }
    return UnicodeScalar(total)
}
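
This version drops in exactly the same way as the first one:

let hexArray = ["2F", "24", "40", "2A"]
var charArray = hexArray.map { Character(hexToScalar($0)) }
println(charArray) // -> [/, $, @, *]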

Final edit: explanation

Given that the ASCII table has all the digits together (0123456789), we can convert any digit character to its integer value if we know the ASCII value of '0'.

Because:

'0': 48
'1': 49
...
'9': 57

So if, for example, you need to convert '9' to 9, you could do:

asciiValue('9') - asciiValue('0') => 57 - 48 = 9

And you can do the same from 'A' to 'F':

'A': 65
'B': 66
...
'F': 70

Now we can do the same as before; for example, for 'F' we'd do:

asciiValue('F') - asciiValue('A') => 70 - 65 = 5

Note that we need to add 10 to this number to get the decimal value, since 'A' represents 10. Then (going back to the code): if the scalar is between A-F we need to do:

10 + asciiValue(<letter>) - asciiValue('A')

which is the same as: 10 + scalar.value - 65

And if it's between 0-9:

asciiValue(<letter>) - asciiValue('0')

which is the same as: scalar.value - 48

For example: '2F'

'2' is 2 and 'F' is 15 (by the previous example). Since hex is base 16, we'd need to do:

(16^1 * 2) + (16^0 * 15) = 32 + 15 = 47
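
Putting that arithmetic into code, here is a minimal sketch in current Swift syntax (digitValue is a hypothetical helper name, used only for illustration):

// Converts one hex digit scalar to its value, exactly as described above
func digitValue(_ scalar: UnicodeScalar) -> Int {
    if scalar >= "A" {
        return 10 + Int(scalar.value) - 65  // 'A' is 65 in ASCII
    }
    return Int(scalar.value) - 48           // '0' is 48 in ASCII
}

// "2F": (16 * 2) + 15 = 47, the scalar value of "/"
let value = "2F".unicodeScalars.reduce(0) { $0 * 16 + digitValue($1) }
print(value) // 47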



Answer 2:

Here you go:

var string = String(UnicodeScalar(Int("2C", radix: 16)!))
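
Note that in Swift 5 the UnicodeScalar(Int) initializer is failable (see Answer 3 below), so the same one-liner needs a second unwrap:

let string = String(UnicodeScalar(Int("2C", radix: 16)!)!) // ","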

BTW, you can include hex values in string literals like this:

var string = "\u{2c}"


Answer 3:

With Swift 5, you have to convert your string into an integer (using the init(_:radix:) initializer), create a Unicode scalar from this integer (using init(_:)), then create a character from this Unicode scalar (using init(_:)).

The Swift 5 Playground sample code below shows how to proceed:

let validHexString: String = "2C"
let validUnicodeScalarValue = Int(validHexString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
let character = Character(validUnicodeScalar)
print(character) // prints: ","

If you want to perform this operation for the elements inside an array, you can use the sample code below:

let hexArray = ["2F", "24", "40", "2A"]
let characterArray = hexArray.map({ (hexString) -> Character in
    let unicodeScalarValue = Int(hexString, radix: 16)!
    let validUnicodeScalar = Unicode.Scalar(unicodeScalarValue)!
    return Character(validUnicodeScalar)
})
print(characterArray) // prints: ["/", "$", "@", "*"]

Alternative with no force unwraps:

let hexArray = ["2F", "24", "40", "2A"]
let characterArray = hexArray.compactMap({ (hexString) -> Character? in
    guard let unicodeScalarValue = Int(hexString, radix: 16),
        let unicodeScalar = Unicode.Scalar(unicodeScalarValue) else {
            return nil
    }
    return Character(unicodeScalar)
})
print(characterArray) // prints: ["/", "$", "@", "*"]


Answer 4:

Another simple way based on ICU transforms:

extension String {
  func transformingFromHex() -> String? {
    return "&#x\(self);".applyingTransform(.toXMLHex, reverse: true)
  }
}

Usage:

"2C".transformingFromHex()

Results in: ","
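
Since the transform also runs in the forward direction, the round trip is easy to check:

",".applyingTransform(.toXMLHex, reverse: false)  // Optional("&#x2C;")
"2C".transformingFromHex()                        // Optional(",")

Note that transformingFromHex() returns String? because applyingTransform(_:reverse:) is failable.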