We have a video streaming platform where users can broadcast a live video stream and synchronise it with a set of presentation slides. To display the broadcast on iOS we are using HTTP Live Streaming. In order to show each slide at the correct time in the stream on iOS, we were listening for the qt_timedmetadataupdated event provided by Apple's QuickTime JavaScript API. This method is described here:
However, in iOS 8 this method no longer works, so we are trying to find an alternative solution.
Does anyone have an idea as to how we could do this?
The only bit of progress I've managed to make is checking for an "in-band metadata text track" as described here:
https://github.com/videojs/videojs-contrib-hls#in-band-metadata
I've set up an example test page below using flowplayer and the flashls plugin:
http://jsbin.com/vohicoxegi/1/edit?html,js,output
In the code I've created an interval that checks every 500ms whether a text track exists whose kind property is metadata. I've noticed that when a piece of timed metadata is injected into the stream, one of these text tracks is created. But the problem is that there is no way for me to access the data inside the timed metadata, which I need in order to synchronise the slides mentioned above.
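The polling check described above can be sketched as follows. This is a minimal reconstruction, not the actual jsbin code: the element id "video" is an assumption, and since the test page uses flowplayer the real element lookup may differ.

```javascript
// Find a text track whose kind property is "metadata" among the
// video's tracks. Written as a plain function over an array-like
// list so the check is separable from the DOM wiring below.
function findMetadataTrack(textTracks) {
  for (var i = 0; i < textTracks.length; i++) {
    if (textTracks[i].kind === 'metadata') {
      return textTracks[i];
    }
  }
  return null;
}

// Browser-only wiring: poll every 500ms until an in-band metadata
// track appears. The "video" element id is an assumption.
if (typeof document !== 'undefined') {
  var video = document.getElementById('video');
  var poll = setInterval(function () {
    var track = findMetadataTrack(video.textTracks);
    if (track) {
      clearInterval(poll);
      console.log('Found in-band metadata track', track);
    }
  }, 500);
}
```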
Please note that I'm only concerned with live streaming. Playing an existing media file is not a problem.
I think the text tracks are the way to go. I used qt_timedmetadataupdated before as well, and I got this working nicely on iOS 8 like this:
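The original example is not reproduced above; a sketch of the text-track approach it describes might look like the following. The "video" element id is an assumption, and as the next answer points out, the metadata track's mode may also need to be set to "hidden" for cuechange to fire.

```javascript
// Sketch: watch for in-band metadata text tracks being added to the
// video, then react to cuechange events on each metadata track.
function attachCueChangeListener(track, onCue) {
  track.addEventListener('cuechange', function () {
    var cues = track.activeCues;
    for (var i = 0; i < cues.length; i++) {
      onCue(cues[i]);
    }
  });
}

// Browser-only wiring; the "video" element id is an assumption.
if (typeof document !== 'undefined') {
  var video = document.getElementById('video');
  video.textTracks.addEventListener('addtrack', function (e) {
    if (e.track.kind === 'metadata') {
      attachCueChangeListener(e.track, function (cue) {
        console.log('timed metadata cue', cue);
      });
    }
  });
}
```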
Iron Mike's solution was nearly correct. When a track event comes through you have to set its mode property to hidden, otherwise the cuechange events won't fire. Here's a full example:
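The answer's original example is not reproduced above; a hedged reconstruction of the described approach follows. The "video" element id is an assumption, as is the cue payload shape: on Safari/iOS, ID3 timed metadata is typically surfaced as DataCue objects with a value property, while other players may expose plain text cues, so both shapes are handled.

```javascript
// Extract a usable payload from a timed-metadata cue. The exact cue
// shape is an assumption: Safari tends to expose ID3 frames as
// DataCue objects with a "value" property; fall back to cue.text.
function cuePayload(cue) {
  if (cue.value && cue.value.data !== undefined) {
    return cue.value.data;   // assumed Safari DataCue shape
  }
  return cue.text;           // fallback: plain text cue
}

// Browser-only wiring; the "video" element id is an assumption.
if (typeof document !== 'undefined') {
  var video = document.getElementById('video');

  video.textTracks.addEventListener('addtrack', function (e) {
    var track = e.track;
    if (track.kind !== 'metadata') return;

    // The key step from this answer: without mode = "hidden",
    // cuechange never fires for the metadata track.
    track.mode = 'hidden';

    track.addEventListener('cuechange', function () {
      for (var i = 0; i < track.activeCues.length; i++) {
        // Use the payload here to switch to the matching slide.
        console.log('metadata payload:', cuePayload(track.activeCues[i]));
      }
    });
  });
}
```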