I'm using GPUImage in my application and trying to filter video. Live video filtering is working well. The trouble comes up when I try to read a video into memory from the filesystem and apply filters using the code posted on the sunsetlakessoftware tutorial page and in the SimpleVideoFileFilter demo.
EDIT: I realized that my original post may not have been posing a specific enough question. What I am asking is: How exactly can I read a video from disk into memory, apply a GPUImageFilter, and then overwrite the original with the filtered version?
The application is crashing with the following error:

-[AVAssetWriter startWriting] Cannot call method when status is 2

(status 2 is AVAssetWriterStatusCompleted). I've seen the same failure occur with all three other AVAssetWriterStatus values as well.
I have posted the relevant code below.
GPUImageFilter *selectedFilter = [self.allFilters objectAtIndex:indexPath.item];
// get the file url I stored when the video was initially captured
NSURL *url = [self.videoURLsByIndexPath objectForKey:self.indexPathForDisplayedImage];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:url];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
[movieFile addTarget:selectedFilter]; // apply the user-selected filter to the file
unlink([url.path UTF8String]); // delete the file at that file URL so it's writable (note: unlink needs the filesystem path, not the full file:// URL string)
// A different movie writer than the one I was using for live video capture.
GPUImageMovieWriter *editingMovieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:url size:CGSizeMake(640.0, 640.0)];
[selectedFilter addTarget:editingMovieWriter];
editingMovieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = editingMovieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:editingMovieWriter];
[editingMovieWriter startRecording];
[movieFile startProcessing]; // Commenting out this line prevents crash
// weak variables to prevent retain cycle
__weak GPUImageMovieWriter *weakWriter = editingMovieWriter;
__weak id weakSelf = self;
[editingMovieWriter setCompletionBlock:^{
[selectedFilter removeTarget:weakWriter];
[weakWriter finishRecording];
[weakSelf savePhotosToLibrary]; // use ALAssetsLibrary to write to camera roll
}];
Perhaps my issue is with the scope of the editingMovieWriter, or perhaps it's the fact that I am initializing a GPUImageMovie instance with the same URL that I am attempting to write to. I have read several posts on the issues page of the GPUImage GitHub repo, several related posts on SO, the readme, and the tutorial linked above.
Any insight into this issue would be greatly appreciated. Thanks.
There's at least one thing here that might be behind this. In the code above, you're not holding a strong reference to your movieFile source object.

If this is an ARC-enabled project, that object will be deallocated the instant your setup method completes (if it's not, you'll be leaking that object). That will stop movie playback, deallocate the movie itself, and lead to black frames being sent down the filter pipeline (among other potential instabilities).

You need to make movieFile a strongly referenced instance variable so that it hangs on past this setup method, since all movie processing is asynchronous.
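In practice that means promoting the local variable to a property. A minimal sketch in the question's Objective-C (the class name is illustrative; holding the writer strongly as well is a defensive choice, not something the crash strictly requires):

    // Keep the source (and, for safety, the writer) alive for the duration
    // of the asynchronous processing by making them strong properties.
    @interface VideoFilterViewController ()
    @property (nonatomic, strong) GPUImageMovie *movieFile;
    @property (nonatomic, strong) GPUImageMovieWriter *editingMovieWriter;
    @end

    // ...then in the setup method, assign through self instead of a local:
    self.movieFile = [[GPUImageMovie alloc] initWithURL:url];
    [self.movieFile addTarget:selectedFilter];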
Here is a solution. First, declare the instance variables:
var movieFile: GPUImageMovie!
var movieWriter: GPUImageMovieWriter!
var filter: GPUImageInput!
var paths: NSURL!
//Filter the movie file and write the result to disk
func StartWriting()
{
    // Show a progress HUD while the movie is being processed
    let loadingNotification = MBProgressHUD.showHUDAddedTo(self.view, animated: true)
    loadingNotification.mode = MBProgressHUDMode.Indeterminate
    loadingNotification.labelText = "Loading"

    // Step 1: create the GPUImageMovie source from the input file
    let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
    let pathToMovie = documentsURL.URLByAppendingPathComponent("temp.mov")
    self.movieFile = GPUImageMovie(URL: pathToMovie)
    self.movieFile.runBenchmark = true
    self.movieFile.playAtActualSpeed = false

    // Step 2: attach the filter to the source
    self.filter = GPUImageGrayscaleFilter()
    self.movieFile.addTarget(self.filter)

    // Step 3: write to a *different* URL than the input, removing any stale file first
    self.paths = documentsURL.URLByAppendingPathComponent("temp1.mov")
    NSFileManager.defaultManager().removeItemAtURL(self.paths, error: nil)

    // Step 4: size the writer from the video track's natural size
    let anAsset = AVAsset.assetWithURL(pathToMovie) as! AVAsset
    let videoAssetTrack = anAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
    let naturalSize = videoAssetTrack.naturalSize
    self.movieWriter = GPUImageMovieWriter(movieURL: self.paths, size: naturalSize)
    (self.filter as! GPUImageOutput).addTarget(self.movieWriter)

    // Step 5: only wire up audio if the asset actually has an audio track
    self.movieWriter.shouldPassthroughAudio = true
    if anAsset.tracksWithMediaType(AVMediaTypeAudio).count > 0 {
        self.movieFile.audioEncodingTarget = self.movieWriter
    } else {
        self.movieFile.audioEncodingTarget = nil
    }

    // Step 6: start synchronized encoding
    self.movieFile.enableSynchronizedEncodingUsingMovieWriter(self.movieWriter)
    self.movieWriter.startRecording()
    self.movieFile.startProcessing()

    // Use a weak capture to avoid a retain cycle through the completion block
    self.movieWriter.completionBlock = { [weak self] () -> Void in
        if let strongSelf = self {
            strongSelf.movieWriter.finishRecording()
            strongSelf.obj.performWithAsset(strongSelf.paths)
        }
    }

    // Hide the HUD after 15 seconds
    let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(15 * Double(NSEC_PER_SEC)))
    dispatch_after(delayTime, dispatch_get_main_queue()) {
        MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
    }
    hasoutput = true
}
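One point from the original question that neither answer spells out: writing the GPUImageMovieWriter output to the same URL the GPUImageMovie is reading from is fragile, even after an unlink. A safer pattern is to render to a temporary URL and swap the files afterward. A hedged sketch in the question's Objective-C (originalURL and tempOutputURL are hypothetical variables assumed to be set up elsewhere; weakWriter is the weak reference from the question):

    // Render to tempOutputURL, and only after the writer has fully
    // finished, atomically replace the original file with the result.
    [editingMovieWriter setCompletionBlock:^{
        [selectedFilter removeTarget:weakWriter];
        [weakWriter finishRecordingWithCompletionHandler:^{
            NSError *error = nil;
            [[NSFileManager defaultManager] replaceItemAtURL:originalURL
                                               withItemAtURL:tempOutputURL
                                              backupItemName:nil
                                                     options:0
                                            resultingItemURL:NULL
                                                       error:&error];
        }];
    }];

Using finishRecordingWithCompletionHandler: (rather than the plain finishRecording) matters here, because the writer finishes asynchronously and the file may not be complete when the method returns.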