MPMoviePlayerController MovieAccessLogEvent - Inflated observedBitrate

Posted 2020-04-08 06:41

I am currently working with MPMoviePlayerController and analysing metrics for video playback, specifically adaptive bitrate behaviour.

As part of testing I load a particular rendition of the video at a fixed bitrate (995 kbps); however, when reading the observedBitrate property of my MPMovieAccessLogEvent, the value is far higher, to the tune of around 15 Mbps.

Is there any known reason why the bitrate being returned is considerably higher than that of the playback? I have double-checked all the values and the playback itself, and it is definitely the observedBitrate that is inflated.

According to the documentation, this value is:

The empirical throughput across all media downloaded for the movie player, in bits per second.
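For context, the values can be read along these lines (a minimal Swift sketch; the logBitrates helper is purely illustrative, and player is assumed to be an MPMoviePlayerController that has already begun playback):

```swift
import MediaPlayer

// Illustrative helper: prints the bitrate fields of every access log event.
// Assumes `player` is an MPMoviePlayerController that has begun playback,
// at which point its access log starts accumulating events.
func logBitrates(for player: MPMoviePlayerController) {
    guard let events = player.accessLog?.events else { return }
    for event in events {
        // Both values are reported in bits per second.
        print("indicated: \(event.indicatedBitrate) bps, "
            + "observed: \(event.observedBitrate) bps")
    }
}
```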

Update

I posted this question on the developer forums and have received an answer. It is still just conjecture, but I thought it might aid the question anyway and perhaps provoke a better answer.

https://devforums.apple.com/thread/216659?tstart=0

It would be worth checking your HLS video with mediastreamvalidator, which will download and measure your segment bitrates.

1 Answer

冷血范 · 2020-04-08 07:21

There is a simple answer to this: the indicatedBitrate of an MPMovieAccessLogEvent (or AVPlayerItemAccessLogEvent for AVPlayer) is the bitrate declared in the current playlist, so it is the average bitrate required to play the stream.

However, the observedBitrate is NOT averaged - it is the instantaneous bitrate (or download speed) which the player achieved while downloading a particular chunk of video.

Example: Playing a playlist with a 1000 Kb/s stream, in chunks of 10 seconds each, so each chunk carries roughly 10,000 Kb of data. The device can achieve over 10 Mb/s download over WiFi, so it takes less than 1 second to download each chunk. Therefore, the player is downloading at over 10,000 Kb/s during each chunk. I'd expect the player to return (approximately) these values:

indicatedBitrate: 1000 Kb/s

observedBitrate: 10,000 Kb/s
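To make the arithmetic explicit, here is a minimal Swift sketch of the figures above (all numbers are illustrative):

```swift
// Illustrative figures only, matching the example above.
let indicatedBitrate = 1_000_000.0   // playlist bitrate: 1000 Kb/s, in bits per second
let chunkDuration    = 10.0          // seconds of media per segment
let linkSpeed        = 10_000_000.0  // achievable download speed: 10 Mb/s

let chunkBits    = indicatedBitrate * chunkDuration  // 10,000,000 bits per segment
let downloadTime = chunkBits / linkSpeed             // = 1 second with these figures
let observed     = chunkBits / downloadTime          // = 10,000,000 b/s, 10x indicated

print("indicated: \(indicatedBitrate) b/s, observed: \(observed) b/s")
```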

I'd been mystified by these large values myself, but I think this explains it.

This is just for illustration; these values are not very meaningful on their own, since we don't really know how long it takes to download a chunk, or indeed how big each chunk is. All the observedBitrate really tells you is how well the player is keeping up with the bitrate needed to play the stream. If the observedBitrate is 10x the indicatedBitrate, the player is only using 10% of the available time to download each chunk. This ratio can be used as a quality-of-service indicator: if the observedBitrate drops below the indicatedBitrate, it is very likely that the player will stall due to buffering, but as long as it stays above, all is well and the stream should play smoothly.
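As a rough sketch of that quality-of-service check, using the AVPlayer access log mentioned above (the stallRisk helper is hypothetical, and item is assumed to be an AVPlayerItem that is currently playing):

```swift
import AVFoundation

// Hypothetical helper implementing the heuristic described above:
// if download throughput falls below the playlist bitrate, stalls are likely.
// Returns nil when the access log has no usable data yet.
func stallRisk(for item: AVPlayerItem) -> Bool? {
    guard let event = item.accessLog()?.events.last,
          event.indicatedBitrate > 0,
          event.observedBitrate > 0 else {
        return nil // bitrates are reported as negative until they are known
    }
    // true: downloading slower than real time, so buffering stalls are likely.
    // false: downloading faster than the stream plays, so all is well.
    return event.observedBitrate < event.indicatedBitrate
}
```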
