How to convert an Int to Hex String in Swift

Published 2020-01-28 02:31

In Obj-C I used to convert an unsigned integer n to a hex string with

 NSString *st = [NSString stringWithFormat:@"%2X", n];

I tried for a long time to translate this into Swift language, but unsuccessfully.

5 Answers
我只想做你的唯一
Answer 2 · 2020-01-28 02:52

In Swift there is a specific init method on String for exactly this:

let hex = String(0xF, radix: 16, uppercase: false)
print("hex=\(hex)") // prints: hex=f
做个烂人
Answer 3 · 2020-01-28 02:53

You can use:

let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"

In Swift 3, importing Foundation is not required, at least not in a project. String should have all the functionality of NSString.

做个烂人
Answer 4 · 2020-01-28 02:54

With Swift 5, you may choose one of the three following methods, according to your needs.


#1. Using String's init(_:radix:uppercase:) initializer

Swift String has a init(_:radix:uppercase:) initializer with the following declaration:

init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger

Creates a string representing the given value in base 10, or some other specified base.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:

let string1 = String(2, radix: 16)
print(string1) // prints: "2"

let string2 = String(211, radix: 16)
print(string2) // prints: "d3"

let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
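One thing init(_:radix:uppercase:) does not offer is zero padding. If you need fixed-width output without importing Foundation, a minimal sketch is below; the hexString(_:width:) helper is my own illustration, not a standard library API:

```swift
// Pad a hex string to a minimum width with leading zeros.
// The helper name and `width` parameter are illustrative, not standard library API.
func hexString(_ value: Int, width: Int) -> String {
    let hex = String(value, radix: 16, uppercase: true)
    return String(repeating: "0", count: max(0, width - hex.count)) + hex
}

print(hexString(1, width: 2))    // prints: "01"
print(hexString(211, width: 2))  // prints: "D3"
print(hexString(4096, width: 2)) // prints: "1000" (wider results are not clipped)
```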

#2. Using String's init(format:_:) initializer

Foundation provides String with an init(format:_:) initializer, which has the following declaration:

init(format: String, _ arguments: CVarArg...)

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.

Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:

Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):

import Foundation

let string1 = String(format:"%X", 2)
print(string1) // prints: "2"

let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"

let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"

let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"

#3. Using String's init(format:arguments:) initializer

Foundation provides String with an init(format:arguments:) initializer, which has the following declaration:

init(format: String, arguments: [CVarArg])

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user’s default locale.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):

import Foundation

let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"

let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"

let string3 = String(format:"%02X",  arguments: [211])
print(string3) // prints: "D3"

let string4 = String(format: "%02X, %02X, %02X",  arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"
We Are One
Answer 5 · 2020-01-28 02:54

The answers above work fine for values within the range of a 32-bit Int, but larger values will roll over and print the wrong result.

You need to use the ll length modifier for values greater than a 32-bit Int:

%x = Unsigned 32-bit integer (unsigned int)

ll = Length modifier specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.

let decimalValue: UInt64 = 9_000_000_000
let hexString = String(format: "%llX", decimalValue) // "218711A00"
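To see why the modifier matters, the sketch below (assuming a 64-bit Apple platform, where %X reads only 32 bits of its argument) compares the two specifiers on a value just past UInt32.max:

```swift
import Foundation

let big: UInt64 = 0x1_0000_0000  // UInt32.max + 1

// %X treats the argument as a 32-bit unsigned int, so the high bits are lost.
print(String(format: "%X", big))   // truncated to the low 32 bits

// %llX reads the full unsigned long long argument.
print(String(format: "%llX", big)) // prints: "100000000"
```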
chillily
Answer 6 · 2020-01-28 03:07

You can now do:

let n = 14
var st = String(format:"%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
0E is the hexadecimal representation of 14

Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0s if necessary. (Without the 0, the result would be padded with leading spaces.) Of course, if the result is longer than two characters, it will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
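The padding behaviors described above can be seen side by side in a small sketch (Foundation is assumed to be imported):

```swift
import Foundation

print(String(format: "%4X", 14))    // prints: "   E" (padded with spaces to width 4)
print(String(format: "%04X", 14))   // prints: "000E" (padded with zeros to width 4)
print(String(format: "%2X", 64206)) // prints: "FACE" (wider than 2, so nothing is clipped)
```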

This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.

Use uppercase X if you want A...F and lowercase x if you want a...f:

String(format: "%x %X", 64206, 64206)  // "face FACE"

If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:

let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615

Original Answer

You can still use NSString to do this. The format is:

var st = NSString(format:"%2X", n)

This makes st an NSString, so things like += do not work. If you want to be able to append to the string with +=, make st into a String like this:

var st = NSString(format:"%2X", n) as String

or

var st = String(NSString(format:"%2X", n))

or

var st: String = NSString(format:"%2X", n) // relies on implicit bridging, which only early Swift versions allow

Then you can do:

let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"