I am working on a simple program that grabs images from a remote IP camera. After days of research, I was able to extract JPEG images from the MJPEG live stream using sample code I found.
I first built a prototype with Windows Forms. In the Windows Forms version, I receive approximately 80 images every 10 seconds from the IP camera.
I then ported the code to Unity3D, and now I get about 2 frames every 10 seconds.
So roughly 78 images are never received, and the result looks like a medieval PowerPoint slide show.
I am running the function on a new Thread, just as I did in the Windows Forms app. At first I thought the problem in Unity was that I was displaying the image, but it wasn't.
I removed the code that displays the image as a texture and used an integer to count the number of images received. I still get only about 2 to 4 images every 10 seconds in Unity, whereas the Windows Forms app receives about 80 to 100 images in the same time.
Receiving 2 images every 10 seconds in Unity is unacceptable for what I am doing, and the code itself doesn't seem to be the problem because it works fine in Windows Forms.
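For reference, the decode function is started on a background thread roughly like this (simplified sketch; the class and field names here are just placeholders, not the exact code in my project):

using System.Threading;
using UnityEngine;

// Illustrative sketch of how the decode loop is launched from a Unity script.
public class MjpegGrabber : MonoBehaviour
{
    private Thread decodeThread;

    private void Start()
    {
        decodeThread = new Thread(() => Decode_MJPEG_Images());
        decodeThread.IsBackground = true;   // so the thread dies when the player/editor stops
        decodeThread.Start();
    }

    private void OnDestroy()
    {
        keepRunning = false;                // lets the while loop in the decoder exit
    }

    // keepRunning and Decode_MJPEG_Images are shown further down.
}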
Things I've Tried:
I thought the problem was the Unity3D Editor runtime, so I made a standalone build for Windows 10 64-bit and ran that, but it didn't solve the problem.
Changed the Scripting Backend from Mono2x to IL2CPP, but the problem remains.
Changed the API Compatibility Level from .NET 2.0 to .NET 2.0 Subset, and nothing changed.
Below is the simple function that is having the problem. It runs too slowly in Unity even though I call it from another thread.
bool keepRunning = true;

private void Decode_MJPEG_Images(string streamTestURL = null)
{
    keepRunning = true;
    streamTestURL = "http://64.122.208.241:8000/axis-cgi/mjpg/video.cgi?resolution=320x240"; // For testing purposes only

    // Create HTTP request
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(streamTestURL);

    // Get response
    WebResponse resp = req.GetResponse();
    System.IO.Stream imagestream = resp.GetResponseStream();

    const int BufferSize = 5000000;
    byte[] imagebuffer = new byte[BufferSize];

    int a = 2;
    int framecounter = 0;
    int startreading = 0;

    byte[] start_checker = new byte[2];
    byte[] end_checker = new byte[2];

    while (keepRunning)
    {
        start_checker[1] = (byte)imagestream.ReadByte();
        end_checker[1] = start_checker[1];

        // This if statement searches for the JPEG header (0xFF 0xD8) and performs the relevant operations
        if (start_checker[0] == 0xff && start_checker[1] == 0xd8)
        {
            Array.Clear(imagebuffer, 0, imagebuffer.Length);

            // Rebuild the JPEG header at the start of imagebuffer
            imagebuffer[0] = 0xff;
            imagebuffer[1] = 0xd8;
            a = 2;
            framecounter++;
            startreading = 1;
        }

        // This if statement searches for the JPEG footer (0xFF 0xD9) and performs the relevant operations
        if (end_checker[0] == 0xff && end_checker[1] == 0xd9)
        {
            startreading = 0;

            // Write the final byte of the JPEG footer into imagebuffer
            imagebuffer[a] = start_checker[1];

            System.IO.MemoryStream jpegstream = new System.IO.MemoryStream(imagebuffer);
            Debug.Log("Received Full Image");
            Debug.Log(framecounter.ToString());
            // Display Image
        }

        // This if statement fills imagebuffer while the relevant flags are set
        if (startreading == 1 && a < BufferSize)
        {
            imagebuffer[a] = start_checker[1];
            a++;
        }

        // Catches the error condition where a == BufferSize - this should not happen in normal operation
        if (a == BufferSize)
        {
            a = 2;
            startreading = 0;
        }

        start_checker[0] = start_checker[1];
        end_checker[0] = end_checker[1];
    }

    resp.Close();
}
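For context, the display step I removed (the "// Display Image" part) handed the finished JPEG back to the main thread and loaded it into a texture, roughly like this (simplified sketch inside the same MonoBehaviour; the variable names are illustrative and the LoadImage call has to run on Unity's main thread):

// Sketch of the removed display step (illustrative names, not the exact code I removed).
private readonly object frameLock = new object();
private byte[] pendingJpeg;
public Texture2D cameraTexture;   // assigned in the Inspector

// Called from the decode thread when a full JPEG has been assembled.
private void OnFrameReady(byte[] jpegBytes)
{
    lock (frameLock) { pendingJpeg = jpegBytes; }   // keep only the newest frame
}

// Unity main thread: pick up the latest frame and decode it into the texture.
private void Update()
{
    byte[] jpeg = null;
    lock (frameLock) { jpeg = pendingJpeg; pendingJpeg = null; }
    if (jpeg != null)
    {
        cameraTexture.LoadImage(jpeg);   // Texture2D.LoadImage decodes the JPEG bytes
    }
}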
At this point I'm blaming HttpWebRequest for the problem. Maybe its implementation in Unity is just slow, but I'm not sure.
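To check whether the bottleneck really is HttpWebRequest (or the byte-by-byte reads on top of it), I'm planning to time the raw throughput of the response stream with something like this (sketch only, not code from the project; the 8 KB chunk size and the method name are arbitrary):

// Sketch: measure raw bytes/second from the response stream, independent of the JPEG parsing.
private void MeasureRawThroughput(string url)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    WebResponse resp = req.GetResponse();
    System.IO.Stream stream = resp.GetResponseStream();

    byte[] chunk = new byte[8192];                       // arbitrary chunk size
    long totalBytes = 0;
    System.Diagnostics.Stopwatch sw = System.Diagnostics.Stopwatch.StartNew();

    while (sw.ElapsedMilliseconds < 10000)               // sample for 10 seconds
    {
        int read = stream.Read(chunk, 0, chunk.Length);
        if (read <= 0) break;
        totalBytes += read;
    }

    Debug.Log("Raw stream: " + totalBytes + " bytes in " + sw.ElapsedMilliseconds + " ms");
    resp.Close();
}

If the raw byte count over 10 seconds matches what the Windows Forms app receives, the slowdown is somewhere in how I read and parse the stream in Unity rather than in the request itself.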
What's going on? Why is this happening? How can I fix it?