So I've been using UIImagePickerController to access the camera for photo and video capture, and I wanted to apply filters to both sources. I succeeded in filtering the photos I had taken, but I'm having trouble finding a solution for the rest. All I need is to access the raw image data of the live feed the camera is showing, apply the filter, and then display the filtered frames instead. Any help or advice would be appreciated.
Answer 1:
UIImagePickerController doesn't give you low-level access to the camera buffer.
You should set up an AVCaptureSession and use its sample buffer delegate to process each CMSampleBufferRef.
Take a look at the AVCam and SquareCam demos from Apple; they give a good introduction to video capture.
http://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html
http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html
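If it helps, here's a rough sketch of that setup in Swift (the class and property names like `FilteredCameraViewController` and `previewImageView` are just placeholders, `CISepiaTone` is an arbitrary example filter, and you'd also need a camera usage description in Info.plist):

```swift
import UIKit
import AVFoundation
import CoreImage

class FilteredCameraViewController: UIViewController,
                                    AVCaptureVideoDataOutputSampleBufferDelegate {

    private let session = AVCaptureSession()
    private let ciContext = CIContext()
    private let previewImageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        previewImageView.frame = view.bounds
        view.addSubview(previewImageView)

        // Attach the back camera as the session input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Deliver raw frames (CMSampleBuffer) to the delegate on a background queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        session.startRunning()
    }

    // Called for every frame the camera produces.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the raw pixel data, apply a Core Image filter, and render the result.
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
        let filter = CIFilter(name: "CISepiaTone")
        filter?.setValue(cameraImage, forKey: kCIInputImageKey)
        guard let filtered = filter?.outputImage,
              let cgImage = ciContext.createCGImage(filtered, from: filtered.extent) else { return }

        // UIKit work has to happen on the main thread.
        DispatchQueue.main.async {
            self.previewImageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

Rendering through a UIImageView like this is the simplest way to see the filtered frames; for better performance you'd draw into a Metal or OpenGL-backed view instead.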
An easier solution is to use https://github.com/BradLarson/GPUImage
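With GPUImage the whole camera-filter-display chain is only a few lines. This is just a sketch assuming the Objective-C GPUImage framework is integrated so it can be imported as a module; the view controller name and the choice of sepia filter are arbitrary examples:

```swift
import UIKit
import AVFoundation
import GPUImage

class GPUImageCameraViewController: UIViewController {

    // Keep strong references so the capture pipeline stays alive.
    private var videoCamera: GPUImageVideoCamera!
    private let sepiaFilter = GPUImageSepiaFilter()

    override func viewDidLoad() {
        super.viewDidLoad()

        // GPUImageView renders the filtered frames on screen.
        let filteredView = GPUImageView(frame: view.bounds)
        view.addSubview(filteredView)

        // Camera -> filter -> on-screen view; no manual buffer handling needed.
        videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                          cameraPosition: .back)
        videoCamera.outputImageOrientation = .portrait
        videoCamera.addTarget(sepiaFilter)
        sepiaFilter.addTarget(filteredView)
        videoCamera.startCameraCapture()
    }
}
```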
Thanks, Adam