How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.
As of Swift 4
It's just:
test1.count
because in Swift 4, String is once again a Collection of its Characters, so count is available directly on the string.
(Thanks to Martin R)
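A minimal sketch of the Swift 4+ form, using the variable from the question:

```swift
let test1: String = "Scott"

// In Swift 4+, String conforms to Collection, so count
// (the number of Characters, i.e. grapheme clusters) is
// available directly on the string.
print(test1.count)  // prints "5"
```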
As of Swift 2:
With Swift 2, Apple has changed global functions to protocol extensions, extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the global count function:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints \"unusualMenagerie has 40 characters\"
right from the Apple Swift Guide
(note: for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
For your variable, it would be:
let length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16Count
For Swift 2.0 and 3.0, use test1.characters.count. But there are a few things you should know, so read on.
Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count. But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
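A short sketch of the two counts disagreeing, written in current Swift 4+ syntax (where characters.count has become just count); the NSString bridge assumes Foundation is available:

```swift
import Foundation

// A thumbs-up emoji (U+1F44D) is a single Character (one extended
// grapheme cluster), but it lies outside the Basic Multilingual Plane,
// so its UTF-16 representation needs a surrogate pair.
let thumbsUp = "\u{1F44D}"
print(thumbsUp.count)                 // 1 grapheme cluster
print(thumbsUp.utf16.count)           // 2 UTF-16 code units
print((thumbsUp as NSString).length)  // 2 -- NSString counts UTF-16 units
```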
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode scalars. This means that different characters—and different representations of the same character—can require different amounts of memory to store. Because of this, characters in Swift do not each take up the same amount of memory within a string’s representation. As a result, the number of characters in a string cannot be calculated without iterating through the string to determine its extended grapheme cluster boundaries. If you are working with particularly long string values, be aware that the characters property must iterate over the Unicode scalars in the entire string in order to determine the characters for that string.
The count of the characters returned by the characters property is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string’s UTF-16 representation and not the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "
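The snippet above is cut off; a comparable sketch in Swift 4+ syntax, using a combining accent to illustrate the "different representations of the same character" point from the language reference quote:

```swift
// "é" as a single precomposed scalar (U+00E9)...
let precomposed = "\u{E9}"
// ...versus "e" followed by a combining acute accent (U+0301).
let combined = "e\u{301}"

// Both are one Character to Swift, and they compare equal
// because String equality uses canonical equivalence:
print(precomposed.count)        // 1
print(combined.count)           // 1
print(precomposed == combined)  // true

// But their UTF-16 representations differ in length:
print(precomposed.utf16.count)  // 1
print(combined.utf16.count)     // 2
```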