Inserting Unicode Hyphen-minus into a string causes an error

I am trying to insert the Unicode hyphen-minus character into a text string, but I get an "Invalid universal character" error with the following:

U+002D (HYPHEN-MINUS)

[textViewContent insertString:@"\u002D" atIndex:cursorPosition.location]; 

However, these work fine:

U+2212 (MINUS SIGN)

 [textViewContent insertString:@"\u2212" atIndex:cursorPosition.location]; 

U+2010 (HYPHEN)

 [textViewContent insertString:@"\u2010" atIndex:cursorPosition.location]; 

I have already read through several existing Unicode discussions, but have not found anything that explains why the first example fails while the others succeed. Any insight is much appreciated.

1 answer

Universal character names have some restrictions on their use. In C99 and C++98, you were not allowed to use one that referred to a character in the basic character set (which includes U+002D).

C++11 relaxed this requirement: inside a string or character literal, you are allowed to use a UCN that refers to a basic character. Depending on the compiler version you are using, compiling your code as Objective-C++ with C++11 support may make it legal.

However, since this character is part of ASCII and the basic character set, why not just write it literally?

 @"-" 
Source: https://habr.com/ru/post/1437418/
