This Stack Overflow question deals with 16-bit Unicode characters. I would like a similar solution that supports characters outside the Basic Multilingual Plane, i.e. code points above U+FFFF, which don't fit in 16 bits. See this link for a listing of the various Unicode charts. For example, the Musical Symbols (U+1D100–U+1D1FF) fall in that range.
The answer in the question linked above doesn't work because it casts the System.Int32 value as a System.Char, which is a 16-bit type.
Edit: Let me clarify that I don't particularly care about displaying the 32-bit Unicode character, I just want to store the character in a string variable.
Edit #2: I wrote a PowerShell snippet that uses the info in the marked answer and its comments. I would have preferred to post this as another comment, but comments can't be multi-line.
$inputValue = '1D11E'                               # code point as hex text (MUSICAL SYMBOL G CLEF)
$offset = [int]"0x$inputValue" - 0x10000            # 20-bit offset into the supplementary planes
$highSurrogate = 0xD800 + ($offset -shr 10)         # top 10 bits; -shr truncates, unlike [int]($x / 0x400), which rounds to nearest
$lowSurrogate = 0xDC00 + ($offset -band 0x3FF)      # bottom 10 bits
$stringValue = [string][char]$highSurrogate + [char]$lowSurrogate
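The same surrogate-pair arithmetic, sketched in Python as a cross-check (stdlib only; the function name and variable names are mine, not from the original snippet):

```python
# Mirror of the PowerShell snippet above: split a supplementary-plane
# code point into its UTF-16 high and low surrogates.
def to_surrogate_pair(code_point: int) -> tuple:
    offset = code_point - 0x10000        # 20-bit offset
    high = 0xD800 + (offset >> 10)       # top 10 bits
    low = 0xDC00 + (offset & 0x3FF)      # bottom 10 bits
    return high, low

high, low = to_surrogate_pair(0x1D11E)   # MUSICAL SYMBOL G CLEF
print(hex(high), hex(low))               # 0xd834 0xdd1e
```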
Dour High Arch still deserves credit for the answer, which helped me finally understand surrogate pairs.
IMHO, the most elegant way to use Unicode literals in PowerShell is the approach described in my blog post; see it for more details.
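The answer above doesn't quote its method here, so as a cross-check only: .NET exposes `[char]::ConvertFromUtf32(0x1D11E)`, which builds the surrogate pair for you (whether that is the blog post's approach is an assumption). Python's `chr` handles such code points natively, and encoding the result as UTF-16 reveals the pair:

```python
# chr() accepts code points above U+FFFF directly; the big-endian
# UTF-16 encoding of the result shows the two surrogate code units.
s = chr(0x1D11E)              # MUSICAL SYMBOL G CLEF
units = s.encode('utf-16-be')
print(units.hex())            # d834dd1e
```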
PowerShell strings are .NET strings, which use UTF-16, so code points above U+FFFF are stored as surrogate pairs. For example, U+10000 is represented as two 16-bit code units: hex D800 followed by DC00.
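This can be verified mechanically; a quick Python sketch (stdlib only) prints the two 16-bit code units of U+10000:

```python
import struct

# Encode U+10000 as big-endian UTF-16 and unpack the two 16-bit units.
units = struct.unpack('>2H', chr(0x10000).encode('utf-16-be'))
print([hex(u) for u in units])   # ['0xd800', '0xdc00']
```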
Good luck finding a font with glyphs for those characters, though.