How can I improve performance of UTF-32 to UTF-16 surrogate pair conversion?

Posted 2019-07-27 17:36

I am using the following converter to split Unicode characters outside the 'normal' (Basic Multilingual) plane into UTF-16 surrogate pairs.

function toUTF16Pair(x) {
    // High surrogate: 0xD800 + ((x - 0x10000) / 0x400); low surrogate: 0xDC00 + remainder.
    var first = Math.floor((x - 0x10000) / 0x400) + 0xD800;
    var second = ((x - 0x10000) % 0x400) + 0xDC00;
    // Note: the doubled backslash means this returns the escape text itself
    // (e.g. "\ud83d\ude00" as 12 characters), not the two surrogate characters.
    return '\\u' + first.toString(16) + '\\u' + second.toString(16);
}

I am looking for a performance improvement (if one is possible).
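
For example (using U+1F600 purely as an illustration), a call produces the escape text rather than actual surrogate characters:

// Hypothetical usage example: U+1F600 (😀) lies outside the BMP.
var escaped = toUTF16Pair(0x1F600);
console.log(escaped);        // "\ud83d\ude00" - the escape text
console.log(escaped.length); // 12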

1 Answer
beautiful°
Answered · 2019-07-27 18:24

As usual, I did it myself with some bitwise magic. Please try to beat this.

function toUTF16Pair(x) {
    // High surrogate: shift right by 10 (divide by 0x400), then offset by 0xD800.
    // Low surrogate: keep the low 10 bits (modulo 0x400), then offset by 0xDC00.
    return '\\u' + ((((x - 0x10000) >> 0x0a) | 0x0) + 0xD800).toString(16)
         + '\\u' + (((x - 0x10000) & 0x3FF) + 0xDC00).toString(16);
}

In case anyone is wondering how it works:
>> 0x0a - a binary right shift by 10 positions, which is equivalent to integer division by 1024 (0x400).
| 0x0 - forces the value to a 32-bit integer, i.e. the equivalent of Math.floor (the right shift already produces an integer, so this mostly makes the intent explicit).
& 0x3FF - because modulo by a power of two can be expressed as x % n == x & (n - 1), which in my case is & 1023 in decimal (0x3FF).
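
To make the arithmetic concrete, here is the same computation spelled out for U+1F600 (the choice of code point is just an example):

var x = 0x1F600;                     // example supplementary code point (😀)
var offset = x - 0x10000;            // 0xF600
var hi = (offset >> 0x0a) + 0xD800;  // 0xF600 >> 10 = 0x3D, + 0xD800 = 0xD83D
var lo = (offset & 0x3FF) + 0xDC00;  // 0xF600 & 0x3FF = 0x200, + 0xDC00 = 0xDE00
console.log(hi.toString(16), lo.toString(16)); // "d83d de00"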

Hope this saves you some time.
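
If anyone wants to verify that the two versions agree and get a rough speed comparison, here is a minimal sketch; it assumes the two implementations above have been renamed toUTF16PairDiv and toUTF16PairShift (hypothetical names) so they can coexist, and the iteration counts are arbitrary:

// Sanity check: both versions must produce identical escape text for every
// supplementary code point (U+10000 .. U+10FFFF).
function checkAgreement() {
    for (var cp = 0x10000; cp <= 0x10FFFF; cp++) {
        if (toUTF16PairDiv(cp) !== toUTF16PairShift(cp)) {
            throw new Error('Mismatch at U+' + cp.toString(16));
        }
    }
    console.log('Both versions agree.');
}

// Very rough timing: run each version over the full supplementary range a few times.
function roughTime(fn, label) {
    var start = Date.now();
    for (var i = 0; i < 10; i++) {
        for (var cp = 0x10000; cp <= 0x10FFFF; cp++) {
            fn(cp);
        }
    }
    console.log(label + ': ' + (Date.now() - start) + ' ms');
}

checkAgreement();
roughTime(toUTF16PairDiv, 'Math.floor version');
roughTime(toUTF16PairShift, 'bit-shift version');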
