How do I convert the encoding of a large file (>1 GB)?

Posted 2019-03-18 08:58

Consider:

public static void ConvertFileToUnicode1252(string filePath, Encoding srcEncoding)
{
    try
    {
        // Read the file using its source encoding
        StreamReader fileStream = new StreamReader(filePath, srcEncoding);
        Encoding targetEncoding = Encoding.GetEncoding(1252);

        string fileContent = fileStream.ReadToEnd();
        fileStream.Close();

        // Convert the content to ANSI 1252
        byte[] srcBytes = srcEncoding.GetBytes(fileContent);
        byte[] ansiBytes = Encoding.Convert(srcEncoding, targetEncoding, srcBytes);
        string ansiContent = targetEncoding.GetString(ansiBytes);

        // Now write the contents back to the file, as code page 1252
        StreamWriter ansiWriter = new StreamWriter(filePath, false, targetEncoding);
        ansiWriter.Write(ansiContent);
        ansiWriter.Close();
        // TODO -- log success details
    }
    catch (Exception e)
    {
        // TODO -- log failure details
        throw; // rethrow without resetting the stack trace
    }
}

The above code throws an out-of-memory exception for large files; it only works for small ones.

3 Answers
混吃等死
#2 · 2019-03-18 09:41

Try this:

// Note: filePath, outputFilepath, and srcEncoding come from the enclosing method.
using (FileStream fileStream = new FileStream(filePath, FileMode.Open))
using (FileStream outputStream = new FileStream(outputFilepath, FileMode.Create))
{
    const int size = 4096;
    Encoding targetEncoding = Encoding.GetEncoding(1252);
    byte[] byteData = new byte[size];
    int byteCounter;

    while ((byteCounter = fileStream.Read(byteData, 0, size)) > 0)
    {
        // Convert only the bytes actually read; the converted chunk
        // may be a different length than the input chunk.
        byte[] converted = Encoding.Convert(srcEncoding, targetEncoding, byteData, 0, byteCounter);
        outputStream.Write(converted, 0, converted.Length);
    }

    // Caveat: a multi-byte character can straddle a 4 KB chunk boundary and
    // be corrupted; the StreamReader-based answer below avoids this.
}

This might have some syntax errors as I've done it from memory, but this is how I work with large files: read a chunk at a time, do some processing, and save the chunk back. Streaming is really the only way to do it without the massive I/O overhead of reading everything at once and the huge RAM consumption of holding it all, converting it all in memory, and then saving it all back.

You can always adjust the buffer size.

If you want your old method to work without throwing the OutOfMemoryException, you need to tell the runtime to allow very large objects (on 64-bit processes, this lifts the 2 GB limit on a single array).

In App.config, under <runtime>, add the following line (you shouldn't need it with my code, but it's worth knowing):

<gcAllowVeryLargeObjects enabled="true" />
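For context, the element goes inside the <runtime> section of the config file; a minimal App.config sketch:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- Allows arrays larger than 2 GB (64-bit processes only) -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>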
Anthone
#3 · 2019-03-18 09:50

Don't use ReadToEnd; read the file line by line or X characters at a time instead. If you read to the end, you put your whole file into memory at once.
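For instance, here's a minimal line-by-line sketch (the method name and parameters are illustrative, not from the original post; note that WriteLine rewrites line endings as Environment.NewLine, and this assumes the lines themselves are of manageable length):

public static void ConvertLineByLine(string srcFile, Encoding srcEncoding, string destFile)
{
    Encoding targetEncoding = Encoding.GetEncoding(1252);

    using (var reader = new StreamReader(srcFile, srcEncoding))
    using (var writer = new StreamWriter(destFile, false, targetEncoding))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Only one line is held in memory at a time;
            // the writer re-encodes it to code page 1252.
            writer.WriteLine(line);
        }
    }
}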

爷的心禁止访问
#4 · 2019-03-18 09:53

I think the most elegant solution is still to use a StreamReader and a StreamWriter, but reading blocks of characters instead of everything at once or line by line. It doesn't arbitrarily assume the file consists of lines of manageable length, and it also doesn't break with multi-byte character encodings.

public static void ConvertFileEncoding(string srcFile, Encoding srcEncoding, string destFile, Encoding destEncoding)
{
    using (var reader = new StreamReader(srcFile, srcEncoding))
    using (var writer = new StreamWriter(destFile, false, destEncoding))
    {
        char[] buf = new char[4096];
        while (true)
        {
            // Read returns the number of characters read; 0 means end of file.
            int count = reader.Read(buf, 0, buf.Length);
            if (count == 0)
                break;

            writer.Write(buf, 0, count);
        }
    }
}
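A usage sketch (the paths and source encoding here are just examples):

ConvertFileEncoding(@"C:\data\input.txt", Encoding.UTF8,
                    @"C:\data\output.txt", Encoding.GetEncoding(1252));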

(I wish StreamReader had a CopyTo method like Stream does; if it did, this would essentially be a one-liner!)
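Such a helper is easy to write yourself, though. A hypothetical extension-method sketch (CopyTo on TextReader is not part of the framework):

public static class TextReaderExtensions
{
    // Hypothetical helper -- not in the BCL.
    public static void CopyTo(this TextReader reader, TextWriter writer, int bufferSize = 4096)
    {
        char[] buf = new char[bufferSize];
        int count;
        while ((count = reader.Read(buf, 0, buf.Length)) > 0)
            writer.Write(buf, 0, count);
    }
}

With that, the loop above collapses to reader.CopyTo(writer);.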
