What determines the surface texture width and height?

Published 2019-09-07 01:51

Question:

Here's a snippet from my implementation of my custom camera page.

For some reason, I keep getting a lower SurfaceTexture/TextureView width and height than I expect. I want to keep the maximum native camera resolution (or picture size) that my phone's stock camera normally produces. However, with my current setup, the resolution of my textureView is lower than the native camera resolution, and I'm wondering why. While debugging, OnSurfaceTextureAvailable() appears to be called with a width and height that are too small. Where is it grabbing those values from?
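For reference, this is roughly how I'd check which resolutions the hardware actually reports with the old Android.Hardware.Camera API (just a sketch with a hypothetical LogSupportedSizes helper, not part of my page); the largest picture size in that list is the one I'm hoping to keep:

// Sketch only: list the sizes the camera itself reports, to compare against
// the width/height handed to OnSurfaceTextureAvailable.
private void LogSupportedSizes(Android.Hardware.Camera cam)
{
    var parameters = cam.GetParameters();

    foreach (var size in parameters.SupportedPictureSizes)
        Android.Util.Log.Debug("CameraPage", $"Supported picture size: {size.Width}x{size.Height}");

    foreach (var size in parameters.SupportedPreviewSizes)
        Android.Util.Log.Debug("CameraPage", $"Supported preview size: {size.Width}x{size.Height}");
}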

public class CameraPage : PageRenderer, TextureView.ISurfaceTextureListener, Android.Views.View.IOnTouchListener 
{
    global::Android.Hardware.Camera camera;

    Activity activity;
    CameraFacing cameraType;
    TextureView textureView;
    SurfaceTexture surfaceTexture;

    public void OnSurfaceTextureAvailable (SurfaceTexture surface, int width, int height)
    {
        GetCameraInstance();

        if (camera != null) {
            textureView.LayoutParameters = new FrameLayout.LayoutParams(width, height);
            surfaceTexture = surface;

            camera.SetPreviewTexture(surface);

            PrepareAndStartCamera();
        }
    }

    private void GetCameraInstance()
    {
        try {
            camera = global::Android.Hardware.Camera.Open((int)CameraFacing.Back);
        }
        catch (Exception e) {
            //ignore any exception
        }
    }

    public bool OnSurfaceTextureDestroyed (SurfaceTexture surface)
    {
        StopCameraPreviewAndRelease();
        return true;
    }

    private void StopCameraPreview()
    {
        try {
            if (camera != null)
                camera.StopPreview();
        }
        catch { }
    }

    private void StopCameraPreviewAndRelease()
    {
        try {
            if (camera != null) {
                StopCameraPreview();
                camera.SetPreviewCallback(null);
                camera.Release();
                camera = null;
            }
        }
        catch { }
    }

    public void OnSurfaceTextureSizeChanged (SurfaceTexture surface, int width, int height)
    {
        PrepareAndStartCamera ();
    }

    public void OnSurfaceTextureUpdated (SurfaceTexture surface)
    {

    }

    private void PrepareAndStartCamera ()
    {
        var flashMode = GetFlashMode();
        SetCameraParameters(flashMode);

        StopCameraPreview();

        var display = activity.WindowManager.DefaultDisplay;
        if (display.Rotation == SurfaceOrientation.Rotation0) {
            camera.SetDisplayOrientation (90);
        }

        if (display.Rotation == SurfaceOrientation.Rotation270) {
            camera.SetDisplayOrientation (180);
        }

        if (flashOn)
            toggleFlashButton.SetBackgroundResource(Resource.Drawable.flash_on);
        else
            toggleFlashButton.SetBackgroundResource(Resource.Drawable.flash_off);

        camera.StartPreview ();
    }
}

This is how I'm setting my textureView:

textureView = view.FindViewById<TextureView> (Resource.Id.textureView);
textureView.SurfaceTextureListener = this;

And here's the textureView in my cameraLayout:

<TextureView
        android:id="@+id/textureView"
        android:layout_marginTop="-110dp"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:backgroundTint="#99b4d1ff"
        android:layout_marginLeft="0dp" />

That is, I'm expecting a 2576x1932 photo when the picture is taken and saved, but I'm getting 1451x720 instead. It seems that 1451x720 was determined to be the textureView size (whereas I want the native camera resolution).
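To narrow down where the 1451x720 comes from, I could log what the camera reports versus what the view was laid out at, e.g. with something like this (sketch only, dropped in at the end of PrepareAndStartCamera):

// Sketch only: compare what the camera reports against the TextureView's layout size.
var parameters = camera.GetParameters();
Android.Util.Log.Debug("CameraPage",
    $"Preview size: {parameters.PreviewSize.Width}x{parameters.PreviewSize.Height}, " +
    $"picture size: {parameters.PictureSize.Width}x{parameters.PictureSize.Height}, " +
    $"textureView: {textureView.Width}x{textureView.Height}");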

EDIT: Here's how the photo is being taken:

private async void TakePhotoButtonTapped (object sender, EventArgs e)
{
    try{
        try
        {
            StopCameraPreview();
        }
        catch (Exception ex) {
            camera.Reconnect();
            PrepareAndStartCamera();
            StopCameraPreview();
        }

        var image = textureView.Bitmap;

        var imageQuality = AppState.ApplicationInfo.AndroidImageCompressionFactor;
        using (var imageStream = new MemoryStream ()) {
            await image.CompressAsync(Bitmap.CompressFormat.Jpeg, imageQuality, imageStream);
            image.Recycle();
            imageBytes = imageStream.ToArray ();
        }
        count +=1;

        textView.Text = Convert.ToString(count);
        _images.Add(imageBytes);
        camera.StartPreview ();
    }
    catch(Exception ex)
    {
    }
}
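
If it helps, a quick way to double-check the resolution that actually gets saved is to decode imageBytes back into a bitmap (sketch only):

// Sketch only: decode the saved JPEG bytes to verify the resolution that was actually written.
var saved = Android.Graphics.BitmapFactory.DecodeByteArray(imageBytes, 0, imageBytes.Length);
Android.Util.Log.Debug("CameraPage", $"Saved image size: {saved.Width}x{saved.Height}");
saved.Recycle();

Since textureView.Bitmap is what gets compressed, my suspicion is that the saved image can never be larger than the view itself, which would line up with the 1451x720 I'm seeing.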