Re-encode URL from UTF-8 to ISO-8859-1 encoding

Published 2019-05-07 04:56

I have file:// links with non-English characters which are UrlEncoded in UTF-8. For these links to work in a browser, I have to re-encode them.

file://development/H%C3%A5ndplukket.doc

becomes

file://development/H%e5ndplukket.doc

I have the following code which works:

public string ReEncodeUrl(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    string[] parts = url.Split('/');
    for (int i = 1; i < parts.Length; i++)
    {
        parts[i] = HttpUtility.UrlDecode(parts[i]); // Decode to string
        parts[i] = HttpUtility.UrlEncode(parts[i], enc); // Re-encode to latin1
        parts[i] = parts[i].Replace('+', ' '); // Change + to [space]
    }
    return string.Join("/", parts);
}

Is there a cleaner way of doing this?

3 Answers
成全新的幸福
#2 · 2019-05-07 05:05

Admittedly ugly and not really an improvement, but you could re-encode the whole thing (avoiding the split/iterate/join) and then .Replace("%2f", "/").
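A minimal sketch of what that could look like, assuming the same HttpUtility calls as in the question (my assumption, not code from the answer; note that UrlEncode would also escape the colon in file://, so that needs restoring as well):

public string ReEncodeUrlWhole(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    string decoded = HttpUtility.UrlDecode(url);   // decode the whole URL at once
    return HttpUtility.UrlEncode(decoded, enc)     // re-encode everything to Latin-1
        .Replace("%3a", ":")                       // restore the scheme colon
        .Replace("%2f", "/")                       // restore the path separators
        .Replace("+", " ");                        // same + handling as the question
}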

I don't understand why the code wants to keep a literal space in the final result - it seems you don't end up with something that's actually encoded if it still has spaces in it?

兄弟一词,经得起流年.
#3 · 2019-05-07 05:07

While I don't see any real way of changing it that would make a difference, shouldn't the +-to-space replacement happen before you UrlEncode, so it turns into %20?
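A hedged sketch of one way to get %20 into the output (my assumption, not code from the answer): since HttpUtility.UrlEncode itself emits + for spaces, the %20 form would come from replacing on the encoded segment, e.g.

    parts[i] = HttpUtility.UrlDecode(parts[i]);                          // decode to string
    parts[i] = HttpUtility.UrlEncode(parts[i], enc).Replace("+", "%20"); // keep spaces encoded as %20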

仙女界的扛把子
#4 · 2019-05-07 05:17

I think that's pretty clean actually. It's readable and you said it functions correctly. As long as the implementation is hidden from the consumer, I wouldn't worry about squeezing out that last improvement.

If you are doing this operation excessively (hundreds of executions per event, say), I would think about pulling the work out of UrlEncode/UrlDecode and streaming them into each other to get a performance improvement by removing the need for the string Split/Join, but testing would have to prove that out, and it definitely wouldn't be "clean" :-)
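A rough sketch of what removing the Split/Join could look like, using a StringBuilder with the same HttpUtility calls (my assumption; it doesn't pull apart UrlEncode/UrlDecode themselves, and whether it's actually faster would need measuring, as noted above):

// requires: using System.Text; using System.Web;
public string ReEncodeUrlStreamed(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    int firstSlash = url.IndexOf('/');
    if (firstSlash < 0) return url;                      // no path segments to re-encode
    var sb = new StringBuilder(url.Length);
    sb.Append(url, 0, firstSlash);                       // leave the scheme part untouched, like the original loop
    int start = firstSlash;
    while (start < url.Length)
    {
        sb.Append('/');
        int next = url.IndexOf('/', start + 1);
        int end = next < 0 ? url.Length : next;
        string segment = url.Substring(start + 1, end - start - 1);
        string decoded = HttpUtility.UrlDecode(segment); // decode to string
        sb.Append(HttpUtility.UrlEncode(decoded, enc)    // re-encode to latin1
                  .Replace('+', ' '));                   // change + to [space], as in the question
        start = end;
    }
    return sb.ToString();
}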
