I have a simple Android media player that can play multiple videos simultaneously on a single screen. Basically, a single media player screen is divided into 4 parts, with 4 MediaPlayer instances glued together, each part playing a given video.
It works almost OK when my video files are stored locally on the device: there are synchronization problems, but minor ones. When I pass in a URL for HTTP streaming, though, the synchronization problems become significant. What is the problem? And generally, how can I remove the synchronization problems?
The only thing I could do was first instantiate the MediaPlayers and prepare() them, then call start() one after the other, so that at least the start times would be close to each other. It doesn't have much effect, though.
Here is a method that returns each of the MediaPlayer instances:
MediaPlayer mediaPreparation(String filename, boolean setMute) {
    String url = "myURL"; // your URL here

    // create the MediaPlayer instance
    MediaPlayer mediaPlayer = new MediaPlayer();
    if (setMute) {
        mediaPlayer.setVolume(0, 0);
    }
    try {
        mediaPlayer.setDataSource(url);
        mediaPlayer.prepare(); // blocks until buffering is done
    } catch (IOException e) {
        // don't swallow this silently: a failed prepare leaves the player unusable
        e.printStackTrace();
        return null;
    }
    mediaPlayer.setLooping(true);
    // mediaPlayer.start();
    return mediaPlayer;
}
And then I start them one by one:
mp[0].start();
mp[1].start();
mp[2].start();
mp[3].start();
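A sketch of a tighter variant, using the standard prepareAsync() / OnPreparedListener API so that start() only fires once all four players report prepared (the AtomicInteger counter and the loop setup here are illustrative, not the actual code):
import android.media.MediaPlayer;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicInteger;

final MediaPlayer[] mp = new MediaPlayer[4];
final AtomicInteger prepared = new AtomicInteger(0);

MediaPlayer.OnPreparedListener startWhenAllReady = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer player) {
        // only start once every player has finished buffering
        if (prepared.incrementAndGet() == mp.length) {
            for (MediaPlayer p : mp) {
                p.start();
            }
        }
    }
};

for (int i = 0; i < mp.length; i++) {
    mp[i] = new MediaPlayer();
    mp[i].setOnPreparedListener(startWhenAllReady);
    try {
        mp[i].setDataSource("myURL"); // your URL here
        mp[i].prepareAsync(); // non-blocking, unlike prepare()
    } catch (IOException e) {
        e.printStackTrace();
    }
}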
I'm not sure if any Android media player offers this functionality.
I suspect there may be device dependencies too, as different devices have different hardware capabilities for decoding and playing multiple videos; if some of your videos have to fall back to software decoding, they will be much slower.
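As a quick check, you can list which decoders the device actually offers for your codec. A minimal sketch, assuming H.264 ("video/avc") content; note that isHardwareAccelerated() only exists on API 29+, so older devices report "unknown" here:
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;
import android.util.Log;

void logAvcDecoders() {
    MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : codecList.getCodecInfos()) {
        if (info.isEncoder()) {
            continue; // only decoders are relevant here
        }
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                // isHardwareAccelerated() was added in API 29
                String hw = Build.VERSION.SDK_INT >= 29
                        ? String.valueOf(info.isHardwareAccelerated())
                        : "unknown";
                Log.d("Decoders", info.getName() + " hardwareAccelerated=" + hw);
            }
        }
    }
}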
It may not meet your needs, but a common way to provide a grid of videos like this on an end device is to merge the videos together on the server side and deliver them to the device as a single video stream.
Update
One other thing to be aware of if using MediaCodec and leveraging the HW codecs: if the videos have different video profiles, this can cause different decoding latencies as well.
This is to do with how the videos are encoded. In simple terms, if a particular frame refers to information from a frame that comes after it (a common compression approach), then the decoder needs to buffer that frame until it has the referred-to frame as well. Simpler compression approaches, for example the Baseline profile, do not use this technique, so they don't have to buffer and hence may have lower latency. This also appears to differ between HW vendors; see the note from Intel on this topic, in particular the low-latency section at the end.
I suspect the best approach to this particular aspect is to aim for the lowest common denominator: either use only the Baseline profile, or delay all video display by some factor longer than the maximum latency you can expect from any individual video.
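If you go the Baseline-only route, you can inspect each stream's profile up front. A sketch using MediaExtractor; note that KEY_PROFILE is only populated on newer API levels (23+) and not by every container, so a missing key is treated as unknown here:
import android.media.MediaCodecInfo;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

// Returns true only if the first video track reports the AVC Baseline profile.
boolean isBaselineAvc(String url) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    try {
        extractor.setDataSource(url);
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) {
                if (format.containsKey(MediaFormat.KEY_PROFILE)) {
                    return format.getInteger(MediaFormat.KEY_PROFILE)
                            == MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline;
                }
                return false; // profile not reported; assume the worst case
            }
        }
        return false; // no video track found
    } finally {
        extractor.release();
    }
}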
In streaming cases there is always a risk that data is not continuously available, so players buffer quite a few frames before starting playback. With multiple streams, each one may take a different amount of time to buffer a sufficient quantity. One approach you can try is MediaCodec; see https://developer.android.com/reference/android/media/MediaCodec.html.
In particular, go through releaseOutputBuffer() and its variants. They give you more control over rendering (you can alter the timestamp if required, though I wouldn't advise it, as playback won't be smooth). You can keep track of whether all 4 instances have decoded a frame for a particular timestamp, and then render them all at once.
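A minimal sketch of that render gate, assuming one MediaCodec per quadrant, each decoding on its own thread and deferring releaseOutputBuffer() until all four streams have a frame queued (the class and field names are illustrative):
import android.media.MediaCodec;
import java.util.Arrays;

class RenderGate {
    private final MediaCodec[] codecs;   // one decoder per quadrant
    private final int[] pending;         // output-buffer index awaiting render, -1 if none
    private int pendingCount = 0;

    RenderGate(MediaCodec[] codecs) {
        this.codecs = codecs;
        this.pending = new int[codecs.length];
        Arrays.fill(pending, -1);
    }

    // Called from each decoder's thread after dequeueOutputBuffer() returns a frame.
    // NOTE: MediaCodec is not thread-safe; in real code, marshal each
    // releaseOutputBuffer() call back to the thread that owns that codec.
    synchronized void frameReady(int stream, int bufferIndex) {
        if (pending[stream] == -1) {
            pendingCount++;
        } else {
            // already holding an older frame for this stream; drop it unrendered
            codecs[stream].releaseOutputBuffer(pending[stream], false);
        }
        pending[stream] = bufferIndex;
        if (pendingCount == codecs.length) {
            // every stream has a decoded frame: render all four back-to-back
            for (int i = 0; i < codecs.length; i++) {
                codecs[i].releaseOutputBuffer(pending[i], true);
                pending[i] = -1;
            }
            pendingCount = 0;
        }
    }
}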