This Stack Overflow question deals with 16-bit Unicode characters. I would like a solution that also supports 32-bit characters, i.e. code points above U+FFFF. See this link for a list of the different Unicode charts. For example, one range of characters that requires 32 bits is the Musical Symbols block.
The answer to the question above does not work, because it casts the System.Int32 value to System.Char, which is a 16-bit type.
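A quick PowerShell illustration of that size limit (the specific code point U+1D11E is just an example I picked, not from the linked answer):

    # [char] is System.Char, a 16-bit UTF-16 code unit, so its maximum value is 0xFFFF.
    [int][char]::MaxValue                               # 65535
    # A supplementary code point such as U+1D11E therefore cannot become a single [char];
    # in PowerShell this cast fails at runtime.
    try { [char]0x1D11E } catch { "Cast failed: $_" }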
Edit: To be clear, I don't really care about having the 32-bit Unicode character as a character value; I just want to store the character in a string variable.
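If relying on the built-in .NET helper is acceptable, a shorter alternative (assuming PowerShell, and using U+1D11E as the example code point) is:

    # Char.ConvertFromUtf32 builds the surrogate-pair string directly, no manual math needed.
    $stringValue = [char]::ConvertFromUtf32(0x1D11E)    # U+1D11E MUSICAL SYMBOL G CLEF
    $stringValue.Length                                 # 2 -- stored as a surrogate pair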
Edit #2: I wrote a PowerShell snippet that uses the information from the accepted answer and its comments. I would have put it in another comment, but comments cannot be multi-line.
    $inputValue = '1D11E'                                             # code point to encode, as hex text
    $hexValue = [int]"0x$inputValue" - 0x10000                        # offset above the BMP
    $highSurrogate = [int][math]::Floor($hexValue / 0x400) + 0xD800   # top 10 bits; Floor, because a bare [int] cast rounds
    $lowSurrogate = $hexValue % 0x400 + 0xDC00                        # bottom 10 bits
    $stringValue = [char]$highSurrogate + [char]$lowSurrogate         # two-char string holding U+1D11E
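For reference, a quick sanity check of the snippet (expected values worked out from the surrogate-pair formula, not taken from the original post):

    # U+1D11E should split into high surrogate 0xD834 and low surrogate 0xDD1E.
    '{0:X4} {1:X4}' -f $highSurrogate, $lowSurrogate     # D834 DD1E
    # Round-trip the two-char string back to the code point.
    '{0:X}' -f [char]::ConvertToUtf32($stringValue, 0)   # 1D11E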
Dour High Arch still gets credit for the answer, which helped me finally understand surrogate pairs.