React Native: Real-time camera data without saving an image

Posted 2019-06-02 22:35

Question:

I started working on my first non-demo React Native app. I hope it will eventually be an iOS/Android app, but for now I'm focused on iOS only.

I have one problem at the moment: how can I get data (base64, an array of pixels, ...) from the camera in real time, without saving to the camera roll?

There is this module: https://github.com/lwansbrough/react-native-camera, but its base64 output is deprecated and useless for me, because I want to render a processed image to the user (e.g. with changed colors), not the raw camera picture that react-native-camera gives you.

(I know how to communicate with Swift code, but I don't know what the options are on the native side; I come from a web-development background.)

Thanks a lot.

Answer 1:

This may not be optimal, but it is what I have been using. If anyone can offer a better solution, I would appreciate the help, too!

My basic idea is simply to loop taking still pictures in YUV/RGB format at max resolution and process them (though not with a simple for-loop; see below). This is reasonably fast (~x0ms with normal exposure duration). Basically, you set up an AVCaptureStillImageOutput linked to your camera (following tutorials available everywhere), then set its output format to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (if you want YUV) or kCVPixelFormatType_32BGRA (if you prefer RGBA), like this:

BOOL usingYUVFormat = YES;
// Ask for raw pixel buffers in either bi-planar YUV or 32-bit BGRA.
OSType pixelFormat = usingYUVFormat ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
                                    : kCVPixelFormatType_32BGRA;
NSDictionary *outputFormat = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(pixelFormat) };
[yourAVCaptureStillImageOutput setOutputSettings:outputFormat];
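
For reference, here is a minimal sketch of the session wiring glossed over above as "following tutorials everywhere"; the variable names are placeholders, not from the original answer:

#import <AVFoundation/AVFoundation.h>

// Hypothetical setup sketch -- adapt the names to your own class.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto; // max-resolution stills

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

// Note: AVCaptureStillImageOutput has been deprecated since iOS 10
// (AVCapturePhotoOutput replaces it), but it is what this answer uses.
AVCaptureStillImageOutput *yourAVCaptureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:yourAVCaptureStillImageOutput]) {
    [session addOutput:yourAVCaptureStillImageOutput];
}

[session startRunning];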

When you are ready, you can start the capture loop by calling:

AVCaptureConnection *captureConnection = [yourAVCaptureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[yourAVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // Do your magic with the pixel buffer here.
        // Use CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0/1/2) to get each plane,
        // and CVPixelBufferGetWidth/CVPixelBufferGetHeight to get the dimensions.
        // (A concrete example of reading the planes follows below.)
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0); // always pair the lock with an unlock
    }
}];
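
To make the "do your magic" part concrete, here is a hypothetical sketch of walking the luma (Y) plane of a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange buffer; it belongs between the lock and unlock calls above:

size_t width  = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t yStride  = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);

for (size_t row = 0; row < height; row++) {
    // Rows are padded to the stride, so step by bytes-per-row, not by width.
    uint8_t *line = yPlane + row * yStride;
    for (size_t col = 0; col < width; col++) {
        uint8_t luma = line[col]; // brightness of this pixel, 0-255
        // ... process the pixel ...
    }
}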

Additionally, use NSNotificationCenter to register your photo-taking method as an observer and post a notification after you have processed each frame (perhaps with a small delay, to cap your throughput and reduce power consumption), so the loop keeps going. A sketch follows below.
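
A minimal sketch of that notification-driven loop; the notification name, the -captureFrame selector, and the 50 ms delay are hypothetical choices, not from the original answer:

static NSString * const kCaptureNextFrame = @"kCaptureNextFrame"; // hypothetical name

// In your setup code, register the method that triggers one capture:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureFrame)
                                             name:kCaptureNextFrame
                                           object:nil];

// At the end of the completion handler, once the frame is processed,
// schedule the next capture (the delay throttles the loop):
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.05 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [[NSNotificationCenter defaultCenter] postNotificationName:kCaptureNextFrame
                                                        object:nil];
});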

A quick word of caution: the Android counterpart is a much worse headache. Few hardware manufacturers implement an API for max-resolution uncompressed photos; most only offer 1080p for preview/video, as I have raised in my own question. I am still looking for solutions but have given up most hope. JPEG images are just too slow.