I have an ASP.NET application with the following setup:
- A camera that captures raw RGB frames at a resolution of 656x492
- These frames are processed in my C# code (with some simple image processing)
- The raw image is stored in a byte array (as well as wrapped in a Bitmap container)
- MISSING MAGIC: Convert raw image buffer to WebM stream
- On the other end I have an `HttpResponseMessage` function that hooks up a WebM stream with a `PushStreamContent` (inspired by this blog post). This function pushes chunks of a video file to the website.
- A website that plays back the video.
I am struggling to figure out how to implement point 4. Right now I can only stream video files, but I would like to encode my raw buffer into a WebM container and stream that to my website. The central piece of code of point 5 looks as follows:
```
while (length > 0 && bytesRead > 0)
{
    bytesRead = video.Read(buffer, 0, Math.Min(length, buffer.Length));
    await outputStream.WriteAsync(buffer, 0, bytesRead);
    length -= bytesRead;
}
```
Basically, I would like to replace the `video.Read` call by somehow encoding my raw frames into the WebM format on the fly and storing them in `buffer`, so they can be pushed to the website as a live stream. Is there a straightforward way to do this? It's fine if some frames get dropped.
If there is an entirely different approach that is better, I am of course also open to suggestions.
Depending on what you can do on the server (outside of deploying a web app), you might consider writing your buffer into a pipe, then using `ffmpeg` running in the background to create your stream from it, with something like `-f rawvideo -pixel_format rgb24 -video_size 656x492` as input parameters.

The WebM Project offers DirectShow filters for playing and encoding WebM.
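The ffmpeg approach could be sketched roughly as below. This is untested against your setup and the names are illustrative: it assumes `ffmpeg` is on the server's PATH, a 25 fps frame rate, and a VP8 encode via `libvpx`. Raw frames go into the process's stdin and the encoded WebM stream comes back out of its stdout, where the existing copy loop can push it to `outputStream`:

```
using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;

// Sketch: pipe raw RGB24 frames into a background ffmpeg process and
// read the resulting WebM stream back. Assumes ffmpeg is on the PATH;
// width/height match the camera resolution described in the question.
class FfmpegWebMEncoder
{
    const int Width = 656, Height = 492;

    static Process StartFfmpeg()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg",
            // Read raw RGB24 frames from stdin ("-i -"),
            // write a WebM (VP8) stream to stdout ("-f webm -").
            Arguments = "-f rawvideo -pixel_format rgb24 " +
                        $"-video_size {Width}x{Height} -framerate 25 -i - " +
                        "-c:v libvpx -f webm -",
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        return Process.Start(psi);
    }

    // Called from the camera/processing side: write one frame
    // (Width * Height * 3 bytes) into the encoder.
    static Task WriteFrameAsync(Process ffmpeg, byte[] rgbFrame)
    {
        return ffmpeg.StandardInput.BaseStream
            .WriteAsync(rgbFrame, 0, rgbFrame.Length);
    }

    // Called from the PushStreamContent side: copy encoded WebM chunks
    // to the HTTP output stream, replacing the video.Read loop.
    static async Task CopyWebMAsync(Process ffmpeg, Stream outputStream)
    {
        var buffer = new byte[16 * 1024];
        int bytesRead;
        while ((bytesRead = await ffmpeg.StandardOutput.BaseStream
                   .ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            await outputStream.WriteAsync(buffer, 0, bytesRead);
        }
    }
}
```

Note that the writer and the reader must run concurrently (ffmpeg blocks if its stdout is not drained), so `CopyWebMAsync` should run on its own task while frames are being fed in.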
There is also FFmpegInterop, a Microsoft initiative which uses the FFmpeg multimedia framework.