My Windows UWP project reads and displays each frame from a USB webcam, and then sends each frame over a TCP socket to my iPad, which is running my Xamarin.iOS app.
But the frames never display in the iPad's UIImage control. (If I programmatically set the UIImage to a .jpg from the bundle, that displays OK.)
Here is the code that reads each camera frame from a USB camera...
void FrameReader_FrameArrived(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
{
    // TryAcquireLatestFrame will return the latest frame that has not yet been acquired.
    // This can return null if there is no such frame, or if the reader is not in the
    // "Started" state. The latter can occur if a FrameArrived event was in flight
    // when the reader was stopped.
    using (var frame = sender.TryAcquireLatestFrame())
    {
        if (frame != null)
        {
            var renderer = _frameRenderers[frame.SourceKind];
            renderer.ProcessFrame(frame);
        }
    }
}

… and here is the ProcessFrame method, which converts each frame, shows it in the local UWP Image element, and copies its pixel bytes into buffer_to_IOS, which is then sent by socket to the Xamarin.iOS app running on the iPad:

public async void ProcessFrame(MediaFrameReference frame)
{
    // Convert the frame to a displayable (BGRA8, premultiplied alpha) SoftwareBitmap.
    SoftwareBitmap latest_frame_SoftwareBitmap =
        FrameRenderer.ConvertToDisplayableImage(frame?.VideoMediaFrame);
    if (latest_frame_SoftwareBitmap == null)
        return;

    // Show the frame in the local UWP Image element.
    var imageSource = (SoftwareBitmapSource)_imageElement.Source;
    await imageSource.SetBitmapAsync(latest_frame_SoftwareBitmap);

    // Copy the raw pixel bytes into the byte array that gets sent to the iPad.
    // (AsBuffer() needs "using System.Runtime.InteropServices.WindowsRuntime;")
    latest_frame_SoftwareBitmap.CopyToBuffer(Frames_To_iPad.buffer_to_IOS.AsBuffer());
}
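For context, the sending side of the socket looks roughly like the sketch below. The _socket field, the Frames_To_iPad class layout, and the 4-byte length prefix are simplified placeholders for illustration, not the exact project code; the point is that buffer_to_IOS holds the raw pixel bytes copied out by CopyToBuffer and is written to the socket as-is.

// Sketch of the sending side (assumed, simplified): a connected StreamSocket writes a
// 4-byte length prefix followed by the raw frame bytes from buffer_to_IOS.
using System.Threading.Tasks;
using Windows.Networking.Sockets;
using Windows.Storage.Streams;

public static class Frames_To_iPad
{
    // Raw BGRA8 pixel bytes filled by SoftwareBitmap.CopyToBuffer() in ProcessFrame().
    public static byte[] buffer_to_IOS = new byte[1280 * 720 * 4];

    static StreamSocket _socket;   // assumed to be connected to the iPad elsewhere

    public static async Task SendLatestFrameAsync()
    {
        var writer = new DataWriter(_socket.OutputStream)
        {
            // Match BitConverter's little-endian byte order on the iOS side.
            ByteOrder = ByteOrder.LittleEndian
        };
        writer.WriteInt32(buffer_to_IOS.Length);   // length prefix so the receiver can frame the stream
        writer.WriteBytes(buffer_to_IOS);
        await writer.StoreAsync();
        writer.DetachStream();                     // keep the socket open for the next frame
        writer.Dispose();
    }
}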
Here is the Xamarin.iOS code that is supposed to display each received frame in the UIKit.UIImageView control (ImageView_camera)...
ImageView_camera.Image = UIImage.LoadFromData(NSData.FromArray(Camera_Socket_Client.packet_frame_state.buffer)); // DISPLAYS NOTHING

// SIMPLE TEST...
ImageView_camera.Image = UIImage.FromBundle("good_photo_of_Doug.jpg"); // DISPLAYS OK
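For reference, the receiving side on the iPad fills packet_frame_state.buffer roughly like the sketch below; the NetworkStream plumbing and the length-prefix framing are assumptions for illustration that mirror the sending sketch above, not the exact project code.

// Sketch of the receiving side (assumed, simplified): read a 4-byte little-endian length
// prefix, then exactly that many bytes into packet_frame_state.buffer.
using System;
using System.IO;
using System.Net.Sockets;

public static class Camera_Socket_Client
{
    public static class packet_frame_state
    {
        public static byte[] buffer;   // holds one complete frame's bytes
    }

    public static void ReceiveOneFrame(NetworkStream stream)
    {
        byte[] lengthPrefix = ReadExactly(stream, 4);
        int frameLength = BitConverter.ToInt32(lengthPrefix, 0);
        packet_frame_state.buffer = ReadExactly(stream, frameLength);
    }

    // NetworkStream.Read can return fewer bytes than requested, so loop until the count is met.
    static byte[] ReadExactly(NetworkStream stream, int count)
    {
        var data = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(data, offset, count - offset);
            if (read <= 0)
                throw new EndOfStreamException("Socket closed mid-frame.");
            offset += read;
        }
        return data;
    }
}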