I am developing a WebRTC-based video chat app. The video call is working, but now I want to record the remote video stream using a VideoFileRenderer. There are many implementations of the interface; the one I am using is https://chromium.googlesource.com/external/webrtc/+/master/sdk/android/api/org/webrtc/VideoFileRenderer.java. It saves the video to a file with no problem, but because the file is .y4m, not .mp4, I can only play it on desktop after using a codec. When I try to play it in a VideoView it says it can't play the video, and even the video player that comes with Android can't play it. I can only play it using MX Player, VLC, or other applications that have the necessary codecs.
To simplify the question:
How can I play video.y4m on native android VideoView?
I will simplify it even more: assume I don't understand the format of the recorded file. Here is the code I am using to record it.
When recording starts:
remoteVideoFileRenderer = new VideoFileRenderer(
        fileToRecordTo.getAbsolutePath(),
        640,
        480,
        rootEglBase.getEglBaseContext());
remoteVideoTrack.addSink(remoteVideoFileRenderer);
When recording finishes:
remoteVideoFileRenderer.release();
Now the question again: I have a "fileToRecordTo", and this video file can be played with GOM (Windows), VLC (Windows, Mac, and Android), and MX Player (Android), but I can't play it with the player that comes embedded in Android (if that had worked, I would have used it in my app), nor in Android's native VideoView.
Any help is appreciated.
Video only recording
I had a similar case in my project. At first I tried WebRTC's default VideoFileRenderer, but the resulting file was far too big because no compression is applied. Then I found this repository, which really helped in my case: https://github.com/cloudwebrtc/flutter-webrtc
Here is a step by step guide. I've also made some adjustments.
Add this class to your project. It has lots of options to configure the final video format.
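The class itself isn't reproduced here, so below is only a rough sketch of the idea behind the recorder in that repository: implement WebRTC's VideoSink, push each incoming frame through a MediaCodec H.264 encoder, and mux the encoded samples into an .mp4 with MediaMuxer. The class name RecordingSink, the NV12 color format, and the bitrate/framerate values are my assumptions, not the repository's exact code; the real class is more complete (it renders frames to the encoder's input Surface via OpenGL instead of copying byte buffers).

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;
import java.nio.ByteBuffer;
import org.webrtc.VideoFrame;
import org.webrtc.VideoSink;
import org.webrtc.YuvHelper;

public class RecordingSink implements VideoSink {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private final int width;
    private final int height;
    private int trackIndex = -1;
    private boolean muxerStarted = false;
    private long firstFrameNs = -1;

    public RecordingSink(String outputPath, int width, int height) throws IOException {
        this.width = width;
        this.height = height;
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        // NV12 byte-buffer input; most hardware encoders accept this format.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);
        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Called by WebRTC for every decoded frame. This sketch assumes frames
    // arrive at the configured size and always on the same thread; real code
    // must handle rotation/scaling and should use a dedicated encoder thread.
    @Override
    public void onFrame(VideoFrame frame) {
        VideoFrame.I420Buffer i420 = frame.getBuffer().toI420();
        if (firstFrameNs < 0) firstFrameNs = frame.getTimestampNs();
        long ptsUs = (frame.getTimestampNs() - firstFrameNs) / 1000;
        int index = encoder.dequeueInputBuffer(10_000);
        if (index >= 0) {
            ByteBuffer in = encoder.getInputBuffer(index);
            // Repack WebRTC's three I420 planes into the NV12 layout above.
            YuvHelper.I420ToNV12(i420.getDataY(), i420.getStrideY(),
                    i420.getDataU(), i420.getStrideU(),
                    i420.getDataV(), i420.getStrideV(),
                    in, width, height);
            encoder.queueInputBuffer(index, 0, width * height * 3 / 2, ptsUs, 0);
        }
        i420.release();
        drainEncoder(false);
    }

    private void drainEncoder(boolean endOfStream) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream) return; // no output yet; drain again on the next frame
            } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer may only start once the encoder reports its real output format.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (index >= 0) {
                ByteBuffer out = encoder.getOutputBuffer(index);
                if (muxerStarted && info.size > 0
                        && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                    muxer.writeSampleData(trackIndex, out, info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) return;
            }
        }
    }

    public void release() {
        // Byte-buffer input signals end-of-stream with an empty EOS input buffer.
        int index = encoder.dequeueInputBuffer(10_000);
        if (index >= 0) {
            encoder.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        }
        drainEncoder(true);
        encoder.stop();
        encoder.release();
        if (muxerStarted) muxer.stop();
        muxer.release();
    }
}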
Now, in your Activity/Fragment class, declare a variable of the above class.
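Assuming the sketch above (RecordingSink is my hypothetical name, not necessarily the class name used in the repository):

private RecordingSink recordingSink;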
When you receive the stream you want to record (remote or local), you can initialize the recording.
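For example, reusing the names from the question (remoteVideoTrack) and the sketch above; note the constructor in the sketch throws IOException:

try {
    File fileToRecordTo = new File(getExternalFilesDir(null), "call.mp4");
    recordingSink = new RecordingSink(fileToRecordTo.getAbsolutePath(), 640, 480);
    remoteVideoTrack.addSink(recordingSink);
} catch (IOException e) {
    Log.e("Recording", "Could not start recording", e);
}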
When the call session ends, you need to stop and release the recording.
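Something like this, again using the hypothetical names from above:

if (recordingSink != null) {
    remoteVideoTrack.removeSink(recordingSink); // stop delivering frames first
    recordingSink.release();                    // flush the encoder and finalize the .mp4
    recordingSink = null;
}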
This is enough to record the video but without audio.
Video & Audio recording
To record the local peer's audio, you need to consume this class: https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc/RecordedAudioToFileController.java. But first you need to set up an AudioDeviceModule object.
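A sketch of that wiring (the variable names are mine and `context` is assumed to be your Activity/Application context; the controller writes the microphone's raw 16-bit PCM to a file in external storage):

ExecutorService executor = Executors.newSingleThreadExecutor();
RecordedAudioToFileController audioRecorder = new RecordedAudioToFileController(executor);

// Tap the microphone samples WebRTC records and hand them to the controller.
AudioDeviceModule adm = JavaAudioDeviceModule.builder(context)
        .setSamplesReadyCallback(audioRecorder)
        .createAudioDeviceModule();

PeerConnectionFactory factory = PeerConnectionFactory.builder()
        .setAudioDeviceModule(adm)
        .createPeerConnectionFactory();

audioRecorder.start(); // begin writing PCM; call audioRecorder.stop() when the call ends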
Merge audio and video
Add this dependency
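The exact coordinates aren't preserved in this answer, so as an assumption: one library that can do this merge step is mobile-ffmpeg, e.g. implementation 'com.arthenica:mobile-ffmpeg-full:4.4' in your module's build.gradle.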
Then add this piece of code for when your call finishes. Make sure that the video and audio recordings are stopped and released properly first.
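A sketch of the merge with mobile-ffmpeg (an assumption; the paths and PCM parameters must match what RecordedAudioToFileController actually wrote, typically 16-bit mono at the device's sample rate):

import com.arthenica.mobileffmpeg.FFmpeg;

void mergeAudioAndVideo(String videoMp4Path, String audioPcmPath, String mergedMp4Path) {
    // The .pcm file is headerless, so tell ffmpeg its layout explicitly;
    // copy the video track untouched, encode audio to AAC, stop at the shorter stream.
    int rc = FFmpeg.execute("-f s16le -ar 48000 -ac 1 -i " + audioPcmPath
            + " -i " + videoMp4Path
            + " -c:v copy -c:a aac -shortest " + mergedMp4Path);
    if (rc != 0) {
        Log.e("Merge", "ffmpeg failed with return code " + rc);
    }
}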
I know this isn't the best solution for recording audio and video in an Android WebRTC video call. If someone knows how to extract the audio using WebRTC itself, please add a comment.