How can I capture an image when AVPlayer is playing an m3u8 stream?

Posted 2019-02-18 09:28

Question:

I use AVPlayer to play an m3u8 file, and I want to capture an image with this code:

AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:self.player.currentItem.asset];
gen.appliesPreferredTrackTransform = YES;
NSError *error = nil;
CMTime actualTime;
CMTime now = self.player.currentTime;
[gen setRequestedTimeToleranceAfter:kCMTimeZero];
[gen setRequestedTimeToleranceBefore:kCMTimeZero];
CGImageRef image = [gen copyCGImageAtTime:now actualTime:&actualTime error:&error];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
NSLog(@"%f , %f",CMTimeGetSeconds(now),CMTimeGetSeconds(actualTime));

NSLog(@"%@",error);
if (image) {
    CFRelease(image);
}

but it does not work. The error is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7fadf25f59f0 {NSUnderlyingError=0x7fadf25f1670 "The operation couldn’t be completed. (OSStatus error -12782.)", NSLocalizedFailureReason=An unknown error occurred (-12782), NSLocalizedDescription=The operation could not be completed}

How can I solve it?
Thanks a lot.

Answer 1:

AVAssetImageGenerator only works with local, file-based assets, so it cannot grab frames from an HLS (m3u8) stream. You'll have more luck attaching an AVPlayerItemVideoOutput to your AVPlayerItem, seeking to the desired spot, and calling copyPixelBufferForItemTime:itemTimeForDisplay: on the video output.
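
A minimal Objective-C sketch of that approach, assuming a player property holding the AVPlayer and a videoOutput property for the output (both names are illustrative):

self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:
    @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)}];
[self.player.currentItem addOutput:self.videoOutput];

// Later, when a frame is needed:
CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:self.player.currentTime
                                                         itemTimeForDisplay:NULL];
if (pixelBuffer) {
    UIImage *frame = [UIImage imageWithCIImage:[CIImage imageWithCVPixelBuffer:pixelBuffer]];
    CVBufferRelease(pixelBuffer); // the copied buffer is not managed by ARC
    // use frame
}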



Answer 2:

I solved the same problem using the code below.

Properties

@property (strong, nonatomic) AVPlayer *player;
@property (strong, nonatomic) AVPlayerItem *playerItem;
@property (strong, nonatomic) AVPlayerItemVideoOutput *videoOutput;

Initialization

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
self.player = [AVPlayer playerWithPlayerItem:_playerItem];
// Create the output and attach it to the item so pixel buffers can be vended.
self.videoOutput = [[AVPlayerItemVideoOutput alloc] init];
[self.playerItem addOutput:self.videoOutput];

Getting image

CMTime currentTime = _player.currentItem.currentTime;
CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
if (buffer) {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    UIImage *image = [UIImage imageWithCIImage:ciImage];
    CVBufferRelease(buffer); // release the copied buffer; it is not managed by ARC
    //Use image^^
}
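
Note that copyPixelBufferForItemTime: returns NULL when the output has no frame ready for that time, for example right after the output was attached or while the stream is still buffering. If you get nil images, a guard along these lines (illustrative) helps:

if ([_videoOutput hasNewPixelBufferForItemTime:currentTime]) {
    CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
    // ... convert to UIImage as above and release the buffer ...
}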


Answer 3:

To capture an image from an AVPlayer playing an HLS video:

private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])

private let jpegCompressionQuality: CGFloat = 0.7

private func imageFromCurrentPlayerContext() {
    guard let player = player else { return }
    let currentTime: CMTime = player.currentTime()

    // The output must have been attached to the player item beforehand,
    // e.g. playerItem.add(videoOutput)
    guard let buffer: CVPixelBuffer = videoOutput.copyPixelBuffer(forItemTime: currentTime, itemTimeForDisplay: nil) else { return }
    let ciImage: CIImage = CIImage(cvPixelBuffer: buffer)
    let context: CIContext = CIContext(options: nil)

    guard let cgImage: CGImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
    let image: UIImage = UIImage(cgImage: cgImage)

    guard let jpegData: Data = image.jpegData(compressionQuality: jpegCompressionQuality) else { return }
    // be happy
}