I have some problems using NSURLSession to upload photos from the Asset Library to a server.
First of all, NSURLSession doesn't support streamed uploads in a background session. I got an exception when trying to use that:
@property (nonatomic, strong) NSURLSession *uploadSession;
...
_uploadSession = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration backgroundSessionConfiguration:kUploadBackgroundURLSessionIdentifier]
                                               delegate:self
                                          delegateQueue:nil];
...
NSURLSessionUploadTask *task = [self.uploadSession uploadTaskWithStreamedRequest:URLRequest];
This is the exception:
Terminating app due to uncaught exception 'NSGenericException', reason: 'Upload tasks in background sessions must be from a file'
That's really strange, because Apple's documentation doesn't say anywhere that background sessions are limited to uploadTaskWithRequest:fromFile:.
What if I want to upload a really huge video file from the Asset Library? Should I first save it to my tmp directory?
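For illustration, here is roughly what I assume that workaround would look like (the file name is made up, and loading the whole asset into videoData first is exactly what worries me for huge files):

// save the asset's bytes to a temporary file, then upload the file
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"];
[videoData writeToFile:tmpPath atomically:YES];
NSURLSessionUploadTask *task =
    [self.uploadSession uploadTaskWithRequest:URLRequest
                                     fromFile:[NSURL fileURLWithPath:tmpPath]];
[task resume];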
It looks like the only option is to use uploadTaskWithRequest:fromFile: anyway, right? But then how does the server know which part of the file is being uploaded right now, if the upload was interrupted and the next part is uploaded later in the background? Do I have to manage that myself? Previously I set a Content-Range header on the URL request when I wanted to resume an upload of a file that had been started earlier. Now I can't do that: I have to create the URL request before creating the upload task, so it seems NSURLSession has to do something like that automatically for me?
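For context, this is roughly how I set that header before (the byte ranges are illustrative):

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"PUT";
// resume the upload at byte 1048576 of a 5242880-byte file
[request setValue:@"bytes 1048576-5242879/5242880" forHTTPHeaderField:@"Content-Range"];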
Has anyone done something like this already? Thanks.
Convert the asset to NSData, then copy and write it into the app folder:
ALAsset *asset = [cameraRollUploadImages objectAtIndex:startCount];
ALAssetRepresentation *representation = [asset defaultRepresentation];

// create a buffer to hold the data for the asset's image
uint8_t *buffer = (uint8_t *)malloc(representation.size);

// copy the data from the asset into the buffer
NSUInteger length = [representation getBytes:buffer
                                  fromOffset:0
                                      length:representation.size
                                       error:nil];

// wrap the buffer in an NSData object that frees it when done;
// use the number of bytes actually read, not the declared size
NSData *image = [[NSData alloc] initWithBytesNoCopy:buffer
                                             length:length
                                       freeWhenDone:YES];
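To complete the "write in app folder" step, a minimal sketch (the destination file name is arbitrary):

// write the image data somewhere the background session can read it from
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *filePath = [documentsDir stringByAppendingPathComponent:@"upload.jpg"];
[image writeToFile:filePath atomically:YES];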
Right now, there is no way other than saving the picture to the local file system or a temp directory.
The following code makes sure your EXIF data won't be lost (ALAsset => NSData):
ALAssetRepresentation *assetRepresentation = [(ALAsset *)assetBeingUploaded defaultRepresentation];
uint8_t *buffer = (uint8_t *)malloc(sizeof(uint8_t) * [assetRepresentation size]);
NSUInteger buffered = 0;
if (buffer != NULL) {
    // fromOffset takes a long long byte offset, so pass 0 rather than 0.0
    buffered = [assetRepresentation getBytes:buffer
                                  fromOffset:0
                                      length:assetRepresentation.size
                                       error:nil];
}
self.imageBeingUploaded = [NSData dataWithBytesNoCopy:buffer
                                               length:buffered
                                         freeWhenDone:YES];
An upload task in a background session doesn't support a completion handler. We have to go with:
- (NSURLSessionUploadTask *)uploadTaskWithRequest:(NSURLRequest *)request fromFile:(NSURL *)fileURL;
But how do we get the response headers or body when using a background session and an upload task created from a file?
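As far as I can tell, with a background session the response arrives through the session delegate instead; a sketch of the relevant callbacks:

// NSURLSessionUploadTask is a subclass of NSURLSessionDataTask,
// so the response body is delivered to the data delegate
- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask
    didReceiveData:(NSData *)data {
    NSString *body = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
    NSLog(@"response body chunk: %@", body);
}

- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task
didCompleteWithError:(NSError *)error {
    // the status code and headers are available on task.response at completion
    NSHTTPURLResponse *response = (NSHTTPURLResponse *)task.response;
    NSLog(@"status: %ld, error: %@", (long)response.statusCode, error);
}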
A clean workaround is to create an NSOperation that copies the file from the asset library to your temporary folder using NSStream, so you won't get a crash with a huge file. When the operation completes, schedule an upload of that temporary file; when the upload finishes, delete it.
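A hedged sketch of that copy step, reading the asset in chunks so the whole file never has to fit in memory (chunk size and file name are arbitrary):

// stream an ALAssetRepresentation to a temp file in 256 KB chunks
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.tmp"];
NSOutputStream *output = [NSOutputStream outputStreamToFileAtPath:tmpPath append:NO];
[output open];

NSUInteger chunkSize = 256 * 1024;
uint8_t *chunk = malloc(chunkSize);
long long offset = 0;
while (offset < rep.size) {
    NSUInteger read = [rep getBytes:chunk fromOffset:offset length:chunkSize error:nil];
    if (read == 0) break; // read error or end of data
    [output write:chunk maxLength:read];
    offset += read;
}
[output close];
free(chunk);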
In my case, I need to send the file in multipart format, so creating the temporary file is necessary, but I run into a problem uploading large files of more than 2 GB, for example movies over 20 minutes long.
You can't upload NSData in the background; you need to upload from a file. You can create one at a directory path:
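A minimal sketch of that, assuming imageData already holds the bytes (the path is a placeholder):

// write the data to a file, then upload the file in the background session
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"photo.jpg"];
[imageData writeToFile:path atomically:YES];
NSURLSessionUploadTask *task = [session uploadTaskWithRequest:request
                                                     fromFile:[NSURL fileURLWithPath:path]];
[task resume];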