Swift String count() vs NSString.length not equal

Why do these two lines give me different results?

var str = "Hello 😘"        // the square is an emoji
count(str)                  // returns 7
(str as NSString).length    // returns 8


2 answers

This is because Swift uses extended grapheme clusters. Swift sees the emoji as a single Character, while the NSString method counts it as two UTF-16 code units (a surrogate pair), even though together they represent one visible character.
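A minimal sketch illustrating the difference, written in modern Swift (where the question's count(str) is spelled str.count):

import Foundation

let str = "Hello 😘"

// Swift counts extended grapheme clusters: the emoji is one Character.
print(str.count)                  // 7

// NSString counts UTF-16 code units: the emoji is a surrogate pair (two units).
print((str as NSString).length)   // 8
print(str.utf16.count)            // 8 -- the same number via Swift's utf16 view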


I think the documentation says it best:

The character count returned by the count(_:) function is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string's UTF-16 representation, not the number of Unicode extended grapheme clusters within the string. To reflect this fact, the length property of NSString is called utf16Count when it is accessed on a Swift String value.
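To make the quote concrete, here is a small sketch of why the counts diverge. The emoji U+1F618 lies outside the Basic Multilingual Plane, so UTF-16 encodes it as two 16-bit code units, which is exactly what NSString.length counts:

import Foundation

let str = "Hello 😘"

// Each Character is one extended grapheme cluster.
print(Array(str))   // ["H", "e", "l", "l", "o", " ", "😘"] -> 7 elements

// The emoji is stored as a surrogate pair of two UTF-16 code units.
print(Array("😘".utf16).map { String($0, radix: 16) })  // ["d83d", "de18"]

// NSString.length is defined in terms of UTF-16 code units,
// the same thing Swift exposes as the utf16 view.
print((str as NSString).length)  // 8
print(str.utf16.count)           // 8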


Source: https://habr.com/ru/post/985816/
