I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.
I want a simple thing: I am playing a video using `AVPlayer`, and I want to pause it and get the current frame as a `UIImage`. That's it.
My video is an m3u8 file located on the internet; it plays normally in the `AVPlayerLayer` without any problems.
What have I tried:

- `AVAssetImageGenerator`. It is not working: the method `copyCGImageAtTime:actualTime:error:` returns a null image ref. According to the answer here, `AVAssetImageGenerator` doesn't work for streaming videos.
- Taking a snapshot of the player view. I first tried `renderInContext:` on the `AVPlayerLayer`, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, `drawViewHierarchyInRect:afterScreenUpdates:`, which should be able to render the special layers as well, but no luck: I still get a UI snapshot with a blank black area where the video is shown (see the sketch after this list).
- `AVPlayerItemVideoOutput`. I have added a video output to my `AVPlayerItem`; however, whenever I call `hasNewPixelBufferForItemTime:` it returns `NO`. I guess the problem is the streaming video again, and I am not alone with this problem.
- `AVAssetReader`. I was thinking of trying it, but decided not to lose time after finding a related question here.
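For reference, a minimal sketch of the view-snapshot attempt from the second item, assuming a hypothetical `snapshot(of:)` helper and a `playerView` that hosts the `AVPlayerLayer`; this is the call that comes back with a black area where the video is:

```swift
import UIKit

// Renders the view hierarchy containing the AVPlayerLayer into an image.
// drawHierarchy(in:afterScreenUpdates:) captures normal UIKit content,
// but the AVPlayerLayer region still renders black for the streamed video.
func snapshot(of playerView: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(playerView.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    playerView.drawHierarchy(in: playerView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```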
So isn't there any way to get a snapshot of something I am already seeing on the screen right now? I can't believe this.
`AVAssetImageGenerator` is the best way to snapshot a video; this method returns a `UIImage` asynchronously (it's Swift 4.2):
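A minimal sketch of such a helper, assuming a hypothetical `imageFor(video:at:completion:)` function built on `generateCGImagesAsynchronously(forTimes:completionHandler:)`:

```swift
import AVFoundation
import UIKit

// Asynchronously extracts a frame from the given player item at the given time
// and hands it back as a UIImage via the completion handler.
func imageFor(video item: AVPlayerItem,
              at seconds: TimeInterval,
              completion: @escaping (UIImage?) -> Void) {
    let generator = AVAssetImageGenerator(asset: item.asset)
    generator.appliesPreferredTrackTransform = true
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: time)]) { _, cgImage, _, _, _ in
        DispatchQueue.main.async {
            completion(cgImage.map { UIImage(cgImage: $0) })
        }
    }
}
```

After pausing, you would call it with the player's current item and `player.currentTime().seconds`.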
`AVPlayerItemVideoOutput` works fine for me from an m3u8. Maybe it's because I don't consult `hasNewPixelBufferForItemTime` and simply call `copyPixelBufferForItemTime`? This code produces a `CVPixelBuffer` instead of a `UIImage`, but there are answers that describe how to do that. This answer is mostly cribbed from here.
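A minimal sketch of that approach, assuming a hypothetical `FrameGrabber` wrapper; it attaches an `AVPlayerItemVideoOutput` to the item and calls `copyPixelBuffer(forItemTime:itemTimeForDisplay:)` directly:

```swift
import AVFoundation
import CoreVideo

// Attaches a video output to the player item and copies the pixel buffer
// for the item's current time, without checking hasNewPixelBuffer(forItemTime:).
final class FrameGrabber {
    private let output: AVPlayerItemVideoOutput
    private let item: AVPlayerItem

    init(item: AVPlayerItem) {
        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        self.output = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
        self.item = item
        item.add(output)
    }

    // Returns the pixel buffer for the current item time, or nil if none is available.
    func currentPixelBuffer() -> CVPixelBuffer? {
        return output.copyPixelBuffer(forItemTime: item.currentTime(),
                                      itemTimeForDisplay: nil)
    }
}
```

One common way to turn the resulting `CVPixelBuffer` into a `UIImage` is a `CIImage(cvPixelBuffer:)` plus `CIContext` round trip.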