I am using AVPlayer to play a .m3u8 file, and AVAssetImageGenerator to extract an image from it with the following code:
// Build an asset from the item's URL and an image generator for it.
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:mp.contentURL options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;

NSError *err = NULL;
// Request the frame at 0.5 s (value 1, timescale 2).
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
img = [[UIImage alloc] initWithCGImage:oneRef];
CGImageRelease(oneRef); // copyCGImageAtTime: returns an owned reference
It always gives me this error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7fb4e30cbfa0 {NSUnderlyingError=0x7fb4e0e28530 "The operation couldn’t be completed. (OSStatus error -12782.)", NSLocalizedFailureReason=An unknown error occurred (-12782), NSLocalizedDescription=The operation could not be completed}
It works for .mp4, .mov, and other major video format URLs, but not for .m3u8. Any idea?
Your problem is only to be expected: .m3u8 files are not actual asset files; they are more akin to a playlist. They are used for HTTP Live Streaming and point to the media segments (or variant playlists) appropriate for the available bandwidth.
Here is an example of a .m3u8 file (Apple's sample master playlist):
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=200000
gear1/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=311111
gear2/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=484444
gear3/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1, BANDWIDTH=737777
gear4/prog_index.m3u8
Unfortunately, you cannot use an AVAsset or AVURLAsset to pull media out of an HTTP Live Stream the way you can with a file-based asset; the asset does not expose the stream's media tracks, so AVAssetImageGenerator has nothing to work with. Reference: Apple's sample code on asset loading/playing.
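You can see this for yourself with a quick check (just a sketch; hlsURL is a placeholder for your .m3u8 URL, and the behavior described in the comment is what you will typically observe for HLS):

AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:hlsURL options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    // A file-based asset usually reports AVKeyValueStatusLoaded with one or more
    // video tracks; for an HLS asset the tracks array typically stays empty,
    // which is why AVAssetImageGenerator fails on it.
    NSLog(@"status: %ld, track count: %lu, error: %@",
          (long)status, (unsigned long)asset.tracks.count, error);
}];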
You won't be able to get still images from a live stream using AVAssetImageGenerator. Instead, you can use AVPlayerItemVideoOutput.

With AVPlayerItemVideoOutput, you can get an image that is appropriate to display at a specified time in a given .m3u8 stream using the following method:

- (CVPixelBufferRef)copyPixelBufferForItemTime:(CMTime)itemTime itemTimeForDisplay:(CMTime *)outItemTimeForDisplay
Then, you can convert the returned CVPixelBufferRef into an image (or another format) for display.
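Here is a rough sketch of that route (assuming AVFoundation, CoreImage, and QuartzCore are imported; the pixel-format attributes and the Core Image conversion are just one reasonable way to do it, and mp.contentURL is reused from the question):

// Attach a video output to the player item that plays the .m3u8 URL.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];

AVPlayerItem *item = [AVPlayerItem playerItemWithURL:mp.contentURL];
[item addOutput:videoOutput];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

// Later (e.g. from a CADisplayLink callback), grab the frame for the current time.
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:itemTime
                                                         itemTimeForDisplay:NULL];
    if (pixelBuffer) {
        // Convert the pixel buffer into a UIImage via Core Image.
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGRect rect = CGRectMake(0, 0,
                                 CVPixelBufferGetWidth(pixelBuffer),
                                 CVPixelBufferGetHeight(pixelBuffer));
        CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        CVBufferRelease(pixelBuffer);
        // Use `image` here (update your UI on the main queue).
    }
}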
Our finding is that if you play an HLS stream that has an I-frame-only playlist, for example https://tungsten.aaplimg.com/VOD/bipbop_adv_example_v2/master.m3u8 (which has an I-frame-only playlist), AVAssetImageGenerator can generate the requested images one by one.
Note, however, that this only works on iOS 8.x and iOS 9.x; it fails on iOS 10.x.
I have filed a bug report with Apple Bug Reporter.
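For reference, a minimal sketch of requesting thumbnails one by one from that stream (the URL is the one mentioned above; the requested times are arbitrary examples, and this reflects the iOS 8/9 behavior described here):

NSURL *url = [NSURL URLWithString:@"https://tungsten.aaplimg.com/VOD/bipbop_adv_example_v2/master.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;

// Request frames at 5 s and 10 s; each result arrives in the completion handler.
NSArray *times = @[ [NSValue valueWithCMTime:CMTimeMake(5, 1)],
                    [NSValue valueWithCMTime:CMTimeMake(10, 1)] ];
[generator generateCGImagesAsynchronouslyForTimes:times
                                 completionHandler:^(CMTime requestedTime,
                                                     CGImageRef image,
                                                     CMTime actualTime,
                                                     AVAssetImageGeneratorResult result,
                                                     NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded && image) {
        UIImage *thumb = [UIImage imageWithCGImage:image];
        // Use `thumb` (dispatch to the main queue before touching the UI).
    } else {
        NSLog(@"Image generation failed: %@", error);
    }
}];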