Why do these two lines give me different results?
var str = "Hello 😘" // the square is an emoji
count(str)               // returns 7
(str as NSString).length // returns 8
This is because Swift strings are built on extended grapheme clusters. Swift treats the emoji as a single Character, but NSString's length counts UTF-16 code units: the emoji is encoded as a surrogate pair, two 16-bit code units that combine to represent one visible character.
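As a quick check, here is a minimal sketch in current Swift syntax (the question uses Swift 1.x's count(str); in later versions this became str.characters.count and then simply str.count):

import Foundation

let str = "Hello 😘"

// Swift's Character is one extended grapheme cluster,
// so the emoji counts as a single character.
print(str.count)                 // 7

// NSString.length counts UTF-16 code units; the emoji is
// a surrogate pair, i.e. two code units.
print((str as NSString).length)  // 8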
I think the documentation explains it best:
The character count returned by count(_:) is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string's UTF-16 representation, and not the number of Unicode extended grapheme clusters within the string. To reflect this fact, the length property from NSString is called utf16Count when it is accessed on a Swift String value.
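Note that utf16Count from that Swift 1.x-era quote was later renamed; a rough sketch of the equivalent counts in current Swift, using the string's different views, looks like this:

import Foundation

let str = "Hello 😘"

print(str.count)                // 7  - extended grapheme clusters (Characters)
print(str.unicodeScalars.count) // 7  - Unicode scalars (the emoji is one scalar, U+1F618)
print(str.utf16.count)          // 8  - 16-bit code units; the modern counterpart of utf16Count
print(str.utf8.count)           // 10 - UTF-8 bytes (the emoji takes four bytes)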
Source: https://habr.com/ru/post/985816/