I am using the following converter to split Unicode characters outside the Basic Multilingual Plane into UTF-16 surrogate pairs.
function toUTF16Pair(x) {
    // High surrogate: subtract 0x10000, divide by 0x400 (1024), add 0xD800
    var first = Math.floor((x - 0x10000) / 0x400) + 0xD800;
    // Low surrogate: remainder modulo 0x400, added to 0xDC00
    var second = ((x - 0x10000) % 0x400) + 0xDC00;
    return '\\u' + first.toString(16) + '\\u' + second.toString(16);
}
I am looking for a performance improvement (if one is possible).
As usual, I ended up doing it myself with some binary magic. Please try to beat this.
In case anyone is wondering how it works:
- >> 0x0a is a binary right shift by 10 positions (0x0a = 10), which is equivalent to division by 1024.
- | 0x0 truncates toward zero, which is equivalent to Math.floor for non-negative numbers.
- & 0x3FF works because modulo by a power of two can be rewritten as x % n == x & (n - 1); in my case n - 1 is 0x3FF, i.e. 1023 in decimal.
Hope this saves you some time.
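Putting the three tricks above together, the bitwise version might look like the sketch below (the name toUTF16PairBitwise is mine, not from the original post; behavior is assumed to match toUTF16Pair for code points in U+10000..U+10FFFF):

```javascript
function toUTF16PairBitwise(x) {
    var offset = x - 0x10000;
    // >> 10 replaces Math.floor(offset / 0x400): shift right by 10 bits
    var first = (offset >> 10) + 0xD800;
    // & 0x3FF replaces offset % 0x400: mask the low 10 bits (1023 decimal)
    var second = (offset & 0x3FF) + 0xDC00;
    return '\\u' + first.toString(16) + '\\u' + second.toString(16);
}

// U+1F600 (grinning face) splits into the surrogate pair D83D DE00
console.log(toUTF16PairBitwise(0x1F600)); // prints \ud83d\ude00
```

The shift and mask avoid floating-point division, which is where any speedup over the Math.floor/% version would come from.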