I am attempting to collect data from a USB port using D3XX.NET from FTDI. The data is collected and then sent to a fast Fourier transform for plotting a spectrum. This works fine; even if you miss some data, you can't tell. However, if you then want to send this data to an audio output component, you will notice the missing data. This is where my problem appears to be.
The data is collected and then sent to the audio device. All packets are making it within the time span needed, yet it appears the audio is dropping data. Here is a picture of what a sine wave looks like at the output of the audio:
You can see that some data is missing at the beginning, and it seems a whole cycle is missing near the end. This is just one example; it changes all the time. Sometimes it appears that the data is simply not there.
I have gone through the whole processing chain and I'm pretty sure the data packets for the sound are making it.
I have since run the JetBrains performance profiler. What I found is the following: the ReadPipe method takes 8.5 ms, which is exactly what you would expect the read to take. So far so good. Once the ReadPipe call is finished, you have 0.5 ms to issue another ReadPipe or you will lose some data. Looking at the profiler output I see this:
The ReadPipe takes 8.5 ms, and then there is this entry for garbage collection, which on average takes 1.6 ms. If this is indeed occurring even occasionally, then I have lost some data.
So here is the code; it runs in a BackgroundWorker:
private void CollectData(object sender, DoWorkEventArgs e)
{
    while (keepGoing)
    {
        // Read IQ data - will get 1024 pairs - 2 bytes per value
        ftStatus = d3xxDevice.ReadPipe(0x84, iqBuffer, 65536, ref bytesTransferred);
        _waitForData.Set();
    }
}
The wait handle signals to the other thread that data is available.
So is the GC the cause of the lost data? And if so, how can I avoid this?
Thanks!
If you can confirm that you aren't running out of memory, you could try setting GCSettings.LatencyMode to GCLatencyMode.SustainedLowLatency. This will prevent certain blocking garbage collections from occurring, unless you're low on memory. Check out the docs on latency modes for more details and restrictions.
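A minimal sketch of what that could look like, wrapped around the time-critical section. Restoring the old mode afterwards is a choice I'm assuming fits your app's structure, not something the API requires:

```csharp
using System;
using System.Runtime;

class LatencyModeDemo
{
    static void Main()
    {
        // Remember the current mode so it can be restored later.
        GCLatencyMode oldMode = GCSettings.LatencyMode;
        try
        {
            // Favor short GC pauses for the duration of the capture loop.
            GCSettings.LatencyMode = GCLatencyMode.SustainedLowLatency;

            // ... the time-sensitive ReadPipe loop would run here ...
        }
        finally
        {
            // Restore the original mode once capture is done.
            GCSettings.LatencyMode = oldMode;
        }
    }
}
```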
If garbage collection is still too disruptive for your use case and you're using .NET 4.6 or later, you may be able to try calling GC.TryStartNoGCRegion. This method will attempt to reserve enough memory to allocate up to the amount specified, and block GC until you've exhausted the reservation. If your memory usage is fairly consistent, you might be able to get away with passing in a large enough value to accommodate your application's usage, but there's no guarantee that the call will succeed.
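A sketch of how that call might be used; the 64 MB budget here is a made-up figure, and you'd size it to your own allocation rate:

```csharp
using System;
using System.Runtime;

class NoGCRegionDemo
{
    static void Main()
    {
        bool started = false;
        try
        {
            // Ask the runtime to reserve ~64 MB and suppress GC until
            // that budget is exhausted. The call can fail or throw,
            // so never assume the region actually started.
            started = GC.TryStartNoGCRegion(64 * 1024 * 1024);
        }
        catch (InvalidOperationException)
        {
            // Already inside a no-GC region.
        }

        try
        {
            // ... allocation-bounded, time-sensitive work here ...
        }
        finally
        {
            // Only end the region if we actually entered it.
            if (started && GCSettings.LatencyMode == GCLatencyMode.NoGCRegion)
                GC.EndNoGCRegion();
        }
    }
}
```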
If you're on an older version of .NET that doesn't support either of these, you're probably out of luck. If this is a GUI application (which it looks like, judging by the event handler), you don't have enough control over allocations.
Another thing to consider is that C# isn't really the right tool for applications that can't tolerate disruptions. If you're familiar with writing native code, you could perform your time-sensitive work on an unmanaged thread; as far as I'm aware, this is the only reliable solution, especially if your application is going to run on end-user machines.
You need to be friendlier to your garbage collector and not allocate so much.
In short, if the GC is stalling your threads, you have a garbage problem. The GC will pause all threads to do a cleanup, and there is nothing you can really do apart from better management of the garbage you create.
If you have arrays, don't keep creating them constantly; reuse them instead (and so on). Use lighter-weight structures, and use tools that let you reduce allocations, like Span<T> and Memory<T>. Consider using fewer awaits if your code is heavily async, and don't put them in loops. Pass by ref, use ref locals and the like, and stay away from large unmanaged data blocks if you can.
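As one concrete illustration of reusing arrays, you could rent buffers from ArrayPool<T> instead of allocating fresh ones each pass; the 65536-byte size below just mirrors the read size in the question:

```csharp
using System;
using System.Buffers;

class BufferReuseDemo
{
    static void Main()
    {
        ArrayPool<byte> pool = ArrayPool<byte>.Shared;

        for (int i = 0; i < 3; i++)
        {
            // Rent a buffer instead of allocating one each iteration;
            // the pool will often hand back the same array, so the GC
            // never sees any garbage from this loop.
            byte[] buffer = pool.Rent(65536);
            try
            {
                // ... fill the buffer (e.g. from ReadPipe) and process it ...
            }
            finally
            {
                pool.Return(buffer);
            }
        }
    }
}
```

Note that Rent may return an array larger than requested, so track the number of valid bytes separately rather than relying on buffer.Length.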
Also, it might be beneficial to call GC.Collect in any downtime when it won't matter, though a better design will likely be more beneficial.
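If you do go that route, the GC.Collect overload that takes a collection mode makes the intent explicit. A sketch, with the idle-time trigger left as a placeholder:

```csharp
using System;

class IdleCollectDemo
{
    // Call this only when nothing time-critical is running, so the
    // pause happens on your schedule instead of mid-read.
    static void IdleCleanup()
    {
        // Force a full, blocking collection now.
        GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, blocking: true);
        GC.WaitForPendingFinalizers();
    }

    static void Main()
    {
        // ... detect a quiet period in your pipeline, then:
        IdleCleanup();
    }
}
```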