Interpret XMP-Metadata in ALAssetRepresentation

Published 2019-07-03 16:57

When a user makes changes (cropping, red-eye removal, …) to a photo in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.

However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation. Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary via the key @"AdjustmentXMP".

I would like to apply these changes to the fullResolutionImage myself to maintain consistency. I found that on iOS 6+, +[CIFilter filterArrayFromSerializedXMP:inputImageExtent:error:] can convert this XMP metadata into an array of CIFilters:

ALAssetRepresentation *rep; 
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData 
                                             inputImageExtent:image.extent 
                                                        error:&error];
if (error) {
     NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}

CIContext *context = [CIContext contextWithOptions:nil];

for (CIFilter *filter in filterArray) {
     [filter setValue:image forKey:kCIInputImageKey];
     image = [filter outputImage];
}
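
The snippet above builds the filter chain but never renders it. To actually obtain the corrected full-resolution image, the final CIImage still has to be drawn through the CIContext. A minimal sketch of that last step, assuming `image` holds the output of the filter loop above and `rep` is the same ALAssetRepresentation (variable names are illustrative):

```objc
// Render the filtered CIImage back into a CGImage at full resolution.
CGImageRef resultRef = [context createCGImage:image fromRect:image.extent];

// Wrap it in a UIImage, preserving the asset's scale and orientation.
// (ALAssetOrientation values are defined to match UIImageOrientation.)
UIImage *result = [UIImage imageWithCGImage:resultRef
                                      scale:rep.scale
                                orientation:(UIImageOrientation)rep.orientation];
CGImageRelease(resultRef);
```

Rendering through createCGImage:fromRect: forces the lazy Core Image pipeline to execute, which is where the real cost of the full-resolution processing is paid.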

However, this only works for some of the filters (cropping, auto-enhance), but not for others such as red-eye removal. In those cases the CIFilters have no visible effect. Hence my questions:

  • Does anyone know a way to create a red-eye removal CIFilter? (In a way that is consistent with Photos.app. A filter obtained with the key kCIImageAutoAdjustRedEye is not enough; for example, it takes no parameters for the positions of the eyes.)
  • Is there any possibility of generating and applying these filters under iOS 5?

Answer 1:

ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];

// Create a buffer to hold the data for the asset's image.
uint8_t *buffer = (uint8_t *)malloc(representation.size);

// Copy the data from the asset into the buffer.
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];

if (length == 0) {
    free(buffer);
    return nil;
}

// Wrap the buffer in an NSData object; freeWhenDone:YES frees the
// buffer when the NSData is deallocated.
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or possibly a
// RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];

// Create a CGImageSource from the NSData. An image source can contain
// any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);

[adata release];

// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);

CFNumberRef imageWidth = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

int w = 0;
int h = 0;

CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

// Clean up memory. (sourceRef must also be released with CFRelease
// once you are finished creating images from it.)
CFRelease(imagePropertiesDictionary);
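
The answer stops after reading the pixel dimensions; to connect it back to the question, the same image source can be used to create the full-resolution CGImage and feed it into the XMP-derived filter chain. A rough sketch of that continuation, assuming sourceRef has not yet been released (the filter application itself is the loop from the question):

```objc
// Create the full-size image (index 0) from the image source.
CGImageRef fullImage = CGImageSourceCreateImageAtIndex(sourceRef, 0, NULL);
if (fullImage) {
    // Hand the full-resolution image to Core Image, where the
    // CIFilter array derived from the @"AdjustmentXMP" metadata
    // can be applied as shown in the question.
    CIImage *ciImage = [CIImage imageWithCGImage:fullImage];
    // ... apply the CIFilter array here ...
    CGImageRelease(fullImage);
}
CFRelease(sourceRef);
```

Going through CGImageSourceCreateImageAtIndex avoids decoding the asset twice: the bytes copied from the ALAssetRepresentation serve both the property lookup above and the full-resolution decode.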


Source: Interpret XMP-Metadata in ALAssetRepresentation