Performance comparison of IEnumerable and raising events

Posted 2020-03-06 07:24

I want to read a big binary file containing millions of records and produce some reports from the records. I use a BinaryReader (which I believe is the fastest of the readers) and convert the bytes that are read into a data model. Because of the record count, passing the model to the report layer is another issue: I prefer to expose the records as IEnumerable so that I can use LINQ when developing the reports.

Here is sample data class:

Public Class MyData
    Public A1 As UInt64 ' 8 bytes
    Public A2 As UInt64 ' 8 bytes
    Public A3 As Byte   ' 1 byte
    Public A4 As UInt16 ' 2 bytes
    Public A5 As UInt64 ' 8 bytes, for 27 bytes per record in total
End Class
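
For context, this is the kind of hypothetical report query that exposing the records as IEnumerable(Of MyData) is meant to enable (the filter and grouping here are made up, and the usual Imports System.Linq is assumed; GetAll is shown further below):

Dim reader As New FileReader
Dim report = reader.GetAll("MyData.dat").
    Where(Function(d) d.A3 > 10).
    GroupBy(Function(d) d.A1).
    Select(Function(g) New With {.A1 = g.Key, .Records = g.Count()})

For Each row In report
    Console.WriteLine("A1 = {0}: {1} records", row.A1, row.Records)
Next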

I used this sub to create the file:

Sub CreateSampleFile()
    ' Writes 1000 * 1000 * 30 = 30,000,000 records of 27 bytes each (about 810 MB).
    Using fileStream As New FileStream(fileName, FileMode.Create, FileAccess.Write, FileShare.Write)
        For i As Integer = 1 To 1000
            For j As Integer = 1 To 1000
                For k As Integer = 1 To 30
                    Dim item As New MyData With {.A1 = i, .A2 = j, .A3 = k, .A4 = j, .A5 = i * j}
                    Dim bytes() As Byte = BitConverter.GetBytes(item.A1).
                        Concat(BitConverter.GetBytes(item.A2)).
                        Concat({item.A3}).
                        Concat(BitConverter.GetBytes(item.A4)).
                        Concat(BitConverter.GetBytes(item.A5)).
                        ToArray()
                    fileStream.Write(bytes, 0, bytes.Length)
                Next
            Next
        Next
    End Using
End Sub

And here is my reader class:

Imports System.IO

Public Class FileReader

    Public Const BUFFER_LENGTH As Integer = 4096 * 256 * 27 ' a multiple of the record length
    Public Const MY_DATA_LENGTH As Integer = 27             ' 8 + 8 + 1 + 2 + 8 bytes per record
    Private _buffer(BUFFER_LENGTH - 1) As Byte
    Public Event OnByteRead(sender As FileReader, bytes() As Byte, index As Integer)

    Public Sub StartReadBinary(fileName As String)
        Dim currentBufferReadCount As Integer = 0
        Using fileStream As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.Read)
            Using binaryReader As New BinaryReader(fileStream)
                ' The buffer size is a multiple of MY_DATA_LENGTH, so reads from
                ' the local file stay record-aligned.
                currentBufferReadCount = binaryReader.Read(Me._buffer, 0, Me._buffer.Length)
                While currentBufferReadCount > 0
                    For i As Integer = 0 To currentBufferReadCount - 1 Step MY_DATA_LENGTH
                        RaiseEvent OnByteRead(Me, Me._buffer, i)
                    Next
                    currentBufferReadCount = binaryReader.Read(Me._buffer, 0, Me._buffer.Length)
                End While
            End Using
        End Using
    End Sub

    Public Iterator Function GetAll(fileName As String) As IEnumerable(Of MyData)
        Dim currentBufferReadCount As Integer = 0
        Using fileStream As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.Read)
            Using binaryReader As New BinaryReader(fileStream)
                currentBufferReadCount = binaryReader.Read(Me._buffer, 0, Me._buffer.Length)
                While currentBufferReadCount > 0
                    For i As Integer = 0 To currentBufferReadCount - 1 Step MY_DATA_LENGTH
                        Yield GetInstance(_buffer, i)
                    Next
                    currentBufferReadCount = binaryReader.Read(Me._buffer, 0, Me._buffer.Length)
                End While
            End Using
        End Using
    End Function

    Public Function GetInstance(bytes() As Byte, index As Integer) As MyData
        ' Field offsets within a 27-byte record: A1 at 0, A2 at 8, A3 at 16, A4 at 17, A5 at 19.
        Return New MyData With {
            .A1 = BitConverter.ToUInt64(bytes, index),
            .A2 = BitConverter.ToUInt64(bytes, index + 8),
            .A3 = bytes(index + 16),
            .A4 = BitConverter.ToUInt16(bytes, index + 17),
            .A5 = BitConverter.ToUInt64(bytes, index + 19)
        }
    End Function

End Class

I was wondering about the performance of IEnumerable, so I tried both approaches: consuming GetAll as an IEnumerable, and raising an event for each record read from the file. Here is the test module:

Imports System.IO

Module Module1

    Private fileName As String = "MyData.dat"
    Private readerJustTraverse As New FileReader
    Private WithEvents readerWithoutInstance As New FileReader
    Private WithEvents readerWithInstance As New FileReader
    Private readerIEnumerable As New FileReader

    Sub Main()

        Dim s As New Stopwatch

        s.Start()
        readerJustTraverse.StartReadBinary(fileName)
        s.Stop()
        Console.WriteLine("Read bytes: {0}", s.ElapsedMilliseconds)

        s.Restart()
        readerWithoutInstance.StartReadBinary(fileName)
        s.Stop()
        Console.WriteLine("Read bytes, raise event: {0}", s.ElapsedMilliseconds)

        s.Restart()
        readerWithInstance.StartReadBinary(fileName)
        s.Stop()
        Console.WriteLine("Read bytes, raise event, get instance: {0}", s.ElapsedMilliseconds)

        s.Restart()
        For Each item In readerIEnumerable.GetAll(fileName)
            ' consume the enumerable without doing any work per item
        Next
        s.Stop()
        Console.WriteLine("Read bytes, get instance, return yield: {0}", s.ElapsedMilliseconds)

        Console.ReadLine()

    End Sub

    Private Sub readerWithInstance_OnByteRead(sender As FileReader, bytes() As Byte, index As Integer) Handles readerWithInstance.OnByteRead
        Dim item As MyData = sender.GetInstance(bytes, index) ' materialize the record, then discard it
    End Sub

    Private Sub readerWithoutInstance_OnByteRead(sender As FileReader, bytes() As Byte, index As Integer) Handles readerWithoutInstance.OnByteRead
        'do nothing
    End Sub

End Module

What I'm wondering about is the elapsed time of each approach. Here are the test results (measured on an ASUS Zenbook ultrabook with a Core i7):

Read bytes: 384 (without touching the read bytes!)

Read bytes, raise event: 583

Read bytes, raise event, get instance: 3923

Read bytes, get instance, return yield: 4917

This shows that reading the file as raw bytes is incredibly fast, and that converting the bytes to the model is what is slow. It also shows that raising an event is about 20% faster than consuming the IEnumerable (3923 ms vs. 4917 ms), i.e. the iterator version is roughly 25% slower.
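
As an aside, to see how much of the "get instance" cost is per-record allocation rather than decoding, one could time a hypothetical FillInstance member of FileReader that reuses a single MyData (only safe when each record is consumed immediately and no references are kept):

    Private ReadOnly _scratch As New MyData

    Public Function FillInstance(bytes() As Byte, index As Integer) As MyData
        ' Same decoding as GetInstance, but without allocating a new object per record.
        With _scratch
            .A1 = BitConverter.ToUInt64(bytes, index)
            .A2 = BitConverter.ToUInt64(bytes, index + 8)
            .A3 = bytes(index + 16)
            .A4 = BitConverter.ToUInt16(bytes, index + 17)
            .A5 = BitConverter.ToUInt64(bytes, index + 19)
        End With
        Return _scratch
    End Function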

Does iterating over an IEnumerable really have this performance cost, or did I miss something?

1 Answer

Emotional °昔
Answered 2020-03-06 07:51

Yes, using Iterator Functions carries a performance penalty.

I compiled your code and got the same results you did. Looking at the generated IL, the state machine created from the GetAll method does contain a lot of code, but most of the instructions are nops or simple operations.

The results with and without the iterator function differ, as you say, by about 25%. That's not too much. When you use StartReadBinary, there is simply one big loop that calls the OnByteRead handlers (via the event) 30 million times (1000 × 1000 × 30 records). When you instead create the objects through a For Each loop, for every object you must call the generated enumerator's Current property getter and its MoveNext() method; the latter is not trivial (most of the code from GetAll was moved there) and uses quite a number of compiler-generated fields.
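
For reference, the For Each loop in the test module compiles to roughly the following enumerator calls (a simplified sketch; the real expansion also wraps the loop in a Try/Finally that disposes the enumerator):

    Dim enumerator = readerIEnumerable.GetAll(fileName).GetEnumerator()
    While enumerator.MoveNext()   ' runs the state machine up to the next Yield
        Dim item As MyData = enumerator.Current
    End While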

Using "Yield" generally slows down your program because the compiler has to create complicated IL code to represent the state machine.
