Long delay before seeing video when AVPlayer created

Published 2019-02-15 18:38

Question:

When playing a video exported from an AVAssetExportSession, you hear audio long before seeing video. Audio plays right away, but video only appears after the recording loops several times (i.e., starts and finishes). In other words, you hear the video's audio multiple times before seeing any images.

We are using AutoLayout on iOS 8.

Using the following test, we isolated the problem to exportAsynchronouslyWithCompletionHandler. In both code blocks, we play an existing video -- not one related to the export -- so the export process has been eliminated as a variable.

Code 1 plays both video & audio at the start whereas Code 2 only plays audio at the start and shows video after a delay of 10-60 seconds (after the video loops several times).

The only difference between the two code blocks is one uses exportAsynchronouslyWithCompletionHandler to play the video while the other one does not.

Help? Is it possible the audio gets exported first and is ready to play before the video? Something to do with the export happening on a different thread?

func initPlayer(videoURL: NSURL) {
    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

    // Get notified when video done for looping purposes
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    // Log status
    println("Initialized video player: \(CMTimeGetSeconds(asset.duration)) seconds & \(asset.tracks.count) tracks for \(videoURL)")
}

func playExistingVideo() {
    let filename = "ChopsticksVideo.mp4"
    let allPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsPath = allPaths[0] as! NSString
    // Build the path with stringByAppendingPathComponent rather than
    // stringByAppendingFormat, which would misbehave if the name contained "%"
    let exportPath = docsPath.stringByAppendingPathComponent(filename)
    let exportURL = NSURL.fileURLWithPath(exportPath as String)!

    initPlayer(exportURL)
}

Code 1:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    playExistingVideo()

Code 2:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        self.playExistingVideo()
    })

Answer 1:

I'm going to suggest that the problem is here:

    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

You see, when you create an AVPlayer from a video URL, it comes into the world not yet ready to play. It can usually start playing audio quite quickly, but video takes longer to prepare. This could explain the delay in seeing anything.

Well, instead of waiting for the video to be ready, you are going ahead and saying play() immediately. My suggestion is the approach I explain in my book (that's a link to the actual code): create the player and the layer, but then set up KVO so that you are notified when the layer is ready for display, and only then add the layer and start playing.
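A minimal sketch of that approach, written in the same era of Swift as the question's code (names such as player, playerLayer, videoView, and view are assumed from the question; the exact KVO options are an illustrative choice):

```swift
// Create player and layer, but hold off on showing and playing anything
// until the layer reports it can actually render a video frame.
func initPlayer(videoURL: NSURL) {
    player = AVPlayer(URL: videoURL)
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    // Observe the layer, not the player: readyForDisplay becomes true
    // when the first video frame is ready to be drawn.
    playerLayer.addObserver(self, forKeyPath: "readyForDisplay",
        options: .New, context: nil)
}

override func observeValueForKeyPath(keyPath: String, ofObject object: AnyObject,
    change: [NSObject : AnyObject], context: UnsafeMutablePointer<Void>) {
    if keyPath == "readyForDisplay" && playerLayer.readyForDisplay {
        playerLayer.removeObserver(self, forKeyPath: "readyForDisplay")
        // Only now attach the layer and start playback, on the main thread.
        dispatch_async(dispatch_get_main_queue()) {
            self.view.layer.addSublayer(self.playerLayer)
            self.player.actionAtItemEnd = .None
            self.player.play()
        }
    }
}
```

Because the layer is added only once readyForDisplay is true, video and audio appear together instead of audio leading by several loops.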

Also, I have one more suggestion. There is a danger that you are running that code, setting up your interface (with the layer) and saying play(), on a background thread. That alone can cause delays of various kinds. You seem to be assuming that the completion handler from exportAsynchronouslyWithCompletionHandler: is called on the main thread, and you are going straight ahead and calling the next method and setting up your interface from it. That is a very risky assumption: never assume that any AVFoundation completion handler is on the main thread. You should be stepping out to the main thread with dispatch_async in your completion handler and proceeding only from there. If you look at the code I linked you to, you'll see I'm careful to do that.
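Applied to the question's Code 2, that advice would look roughly like this (a hypothetical rewrite, not the asker's actual code; exporter and playExistingVideo come from the question):

```swift
// Export, then hop to the main thread before touching any UI.
exporter.exportAsynchronouslyWithCompletionHandler({
    dispatch_async(dispatch_get_main_queue()) {
        if exporter.status == .Completed {
            // UI and playback setup now safely run on the main thread.
            self.playExistingVideo()
        } else {
            println("Export did not complete: \(exporter.error)")
        }
    }
})
```

The only change from Code 2 is the dispatch_async wrapper (plus a status check), which removes the main-thread assumption the answer warns about.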



Answer 2:

For those who stumble upon this question later, the answer was in the comments of the accepted answer. The key dispatch_async part is below:

[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Video Merge Successful");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@", exporter.error.description);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@", exporter.error);
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting!");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Waiting");
                break;
            default:
                break;
        }
    });
}];