I've been playing around with the camera plugin. I know it's possible to capture a video via the CameraController with the startVideoRecording/stopVideoRecording functions, which take a file path as input.
I'd like to be able to stream this video to a server while it's being recorded. Is that possible somehow with the current capabilities of the camera plugin?
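For reference, this is roughly what recording to a file looks like on my side right now. Just a sketch: the path handling with path_provider and the helper name are only for illustration; the detail that startVideoRecording takes the output file path matches the plugin version I'm on.

    import 'dart:io';

    import 'package:camera/camera.dart';
    import 'package:path_provider/path_provider.dart';

    Future<String> recordShortClip(CameraController controller) async {
      // Build a throwaway target path for the recording (placeholder naming).
      final Directory dir = await getTemporaryDirectory();
      final String filePath = '${dir.path}/${DateTime.now().millisecondsSinceEpoch}.mp4';

      // In this plugin version, startVideoRecording takes the output file path.
      await controller.startVideoRecording(filePath);
      await Future.delayed(const Duration(seconds: 5));
      await controller.stopVideoRecording();

      return filePath;
    }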
Looks like it is possible, as it was done at Flutter Live in one of the development demos. Check out the YouTube video here, at 24:17. There is a method on the CameraController called startByteStream; the gist of the method is shown below.
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';

void cameraBytesToDetector({@required CameraController camera}) {
  camera.startByteStream((image) {
    // do something with the image stream here
  });
}
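To get from that callback to a server you still have to handle the transport yourself. Below is only a sketch of one option: it assumes the callback delivers a CameraImage, the way the published camera plugin's startImageStream does (that is the name the API ended up with in current releases), and the WebSocket endpoint is a placeholder. Keep in mind the frames arrive as raw per-frame planes (YUV/BGRA), not as an encoded video stream, so the server would have to encode or interpret them itself.

    import 'dart:io';

    import 'package:camera/camera.dart';

    Future<void> streamFramesToServer(CameraController camera) async {
      // Placeholder endpoint; replace with your own ingest server.
      final WebSocket socket = await WebSocket.connect('ws://example.com/ingest');

      camera.startImageStream((CameraImage image) {
        // Concatenate the image planes into one buffer per frame and send it.
        final BytesBuilder builder = BytesBuilder(copy: false);
        for (final Plane plane in image.planes) {
          builder.add(plane.bytes);
        }
        socket.add(builder.takeBytes());
      });
    }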
But I can find no reference to it anywhere else, as I am also looking for a way to read the video stream.
You can call Java/Kotlin or Objective-C/Swift libraries from Dart.
I can suggest these libraries for Android:
https://github.com/begeekmyfriend/yasea
https://github.com/ant-media/LiveVideoBroadcaster
The only thing you have to do is develop a Dart interface to this kind of library, as sketched below.
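A minimal sketch of what that Dart interface could look like using a MethodChannel. The channel name, method names, and the RtmpPublisher class are all made up here; they have to match whatever you register on the native side with one of those libraries.

    import 'package:flutter/services.dart';

    /// Thin Dart wrapper around a hypothetical native RTMP publisher
    /// (for example backed by yasea on the Android side).
    class RtmpPublisher {
      // The channel name is arbitrary; it just has to match the native registration.
      static const MethodChannel _channel = MethodChannel('example/rtmp_publisher');

      /// Ask the native side to start pushing the camera feed to [url].
      Future<void> start(String url) async {
        await _channel.invokeMethod('start', <String, String>{'url': url});
      }

      /// Ask the native side to stop the stream.
      Future<void> stop() async {
        await _channel.invokeMethod('stop');
      }
    }

On the Android side you would register a MethodCallHandler on the same channel and forward the calls to the library's own start/stop APIs.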