EDIT: I found this code, which helped with front camera images: http://blog.logichigh.com/2008/06/05/uiimage-fix/

I hope someone else has had a similar problem and can help me. I haven't found a solution yet. (It may look a bit long, but it's mostly just a bunch of helper code.)
I am using the iOS face detector on images acquired from the camera (front and back) as well as from the gallery (I am using UIImagePicker for both camera image acquisition and image selection from the gallery, not the AVFoundation way of taking pictures as in the SquareCam demo).

I am getting really messed up coordinates for the detection (if any), so I wrote a short debug method to get the bounds of the faces, plus a utility that draws a square over them, and I wanted to check for which orientation the detection works:
#define RECTBOX(R) [NSValue valueWithCGRect:R]

- (NSArray *)detectFaces:(UIImage *)inputimage
{
    _detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil
                                   options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy]];
    NSNumber *orientation = [NSNumber numberWithInt:[inputimage imageOrientation]]; // i also saw code where they add +1 to the orientation
    NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
    CIImage *ciimage = [CIImage imageWithCGImage:inputimage.CGImage options:imageOptions];

    // try like this first
    // NSArray *features = [self.detector featuresInImage:ciimage options:imageOptions];
    // if that does not work, go on to this (trying all orientations)
    NSMutableArray *returnArray = [NSMutableArray array];
    NSArray *features = nil;
    int exif;
    // ios face detector. trying all of the orientations
    for (exif = 1; exif <= 8; exif++)
    {
        NSNumber *orientation = [NSNumber numberWithInt:exif];
        NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
        NSTimeInterval start = [NSDate timeIntervalSinceReferenceDate];
        features = [self.detector featuresInImage:ciimage options:imageOptions];
        NSTimeInterval duration = [NSDate timeIntervalSinceReferenceDate] - start;
        NSLog(@"faceDetection: face detection total runtime is %f s", duration);
        if (features.count > 0)
        {
            NSString *str = [NSString stringWithFormat:@"found faces using exif %d", exif];
            [faceDetection log:str];
            break;
        }
    }
    if (features.count > 0)
    {
        [faceDetection log:@"-I- Found faces with ios face detector"];
        for (CIFaceFeature *feature in features)
        {
            // CIImage coordinates have a bottom-left origin, so flip y to get UIKit coordinates
            CGRect rect = feature.bounds;
            CGRect r = CGRectMake(rect.origin.x,
                                  inputimage.size.height - rect.origin.y - rect.size.height,
                                  rect.size.width,
                                  rect.size.height);
            [returnArray addObject:RECTBOX(r)];
        }
        return returnArray;
    }
    else
    {
        // no faces from the iOS face detector. try the OpenCV detector
    }
    return returnArray;
}
![face detection result][1]
After trying a ton of different pictures, I noticed that the face detector's orientation is not consistent with the camera image's orientation property. I took a bunch of photos with the front camera where the UIImage orientation was 3 (querying imageOrientation), but the face detector found no faces for that setting. When running through all eight EXIF possibilities, the face detector did finally pick up faces, but for a different orientation altogether.

[1]: http://i.stack.imgur.com/D7bkZ.jpg

How can I solve this? Is there something wrong with my code?
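For reference, passing imageOrientation straight through (or adding 1 to it) does not line up with the EXIF numbering in general: UIImageOrientationUp is 0 while EXIF orientations run from 1 to 8, and the two enumerations are ordered differently. The mapping I've seen used around the SquareCam sample looks like the sketch below (treat it as an assumption to verify, it is not code from my app):

    // Map UIImageOrientation to the EXIF value that CIDetectorImageOrientation expects.
    // Commonly cited mapping; verify against your own test images.
    static int exifOrientationFromUIImageOrientation(UIImageOrientation orientation)
    {
        switch (orientation) {
            case UIImageOrientationUp:            return 1;
            case UIImageOrientationDown:          return 3;
            case UIImageOrientationLeft:          return 8;
            case UIImageOrientationRight:         return 6;
            case UIImageOrientationUpMirrored:    return 2;
            case UIImageOrientationDownMirrored:  return 4;
            case UIImageOrientationLeftMirrored:  return 5;
            case UIImageOrientationRightMirrored: return 7;
        }
        return 1; // fall back to "Up"
    }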
Another problem I have (though it is closely connected to the face detector): when the face detector does pick up faces, but for the "wrong" orientation (this happens mostly with the front camera), the UIImage initially displays correctly in a UIImageView, but when I draw the square overlay (I am using OpenCV in my app, so I decided to convert the UIImage to cv::Mat and draw the overlay with OpenCV), the whole image is rotated 90 degrees (only the cv::Mat image, not the UIImage I displayed initially).

The only reasoning I can come up with is that the face detector is messing with some buffer (context?) that the UIImage-to-cv::Mat conversion uses. How can I separate these buffers?
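One thing worth noting: CGContextDrawImage (used in the conversion below) draws the raw CGImage pixels and does not apply the UIImage's imageOrientation, which could explain the rotated cv::Mat on its own. The uiimage-fix code linked at the top works around this by redrawing the image; a minimal sketch of that idea (the helper name is mine, not from the category):

    // Sketch: redraw the image so the pixel buffer matches the displayed orientation.
    // Hypothetical helper; assumes UIKit is available.
    static UIImage *normalizedImage(UIImage *image)
    {
        if (image.imageOrientation == UIImageOrientationUp)
            return image; // pixels already match the displayed orientation
        UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
        [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
        UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return result;
    }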
The code that converts a UIImage to cv::Mat is (from the "famous" UIImage category someone made):
-(cv::Mat)CVMat
{
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(self.CGImage);
    CGFloat cols = self.size.width;
    CGFloat rows = self.size.height;
    cv::Mat cvMat((int)rows, (int)cols, CV_8UC4); // 8 bits per component, 4 channels
    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,     // Pointer to backing data
                                                    cols,           // Width of bitmap
                                                    rows,           // Height of bitmap
                                                    8,              // Bits per component
                                                    cvMat.step[0],  // Bytes per row
                                                    colorSpace,     // Colorspace
                                                    kCGImageAlphaNoneSkipLast |
                                                    kCGBitmapByteOrderDefault); // Bitmap info flags
    // Note: this draws the raw CGImage pixels; the UIImage's imageOrientation is not applied
    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), self.CGImage);
    CGContextRelease(contextRef);
    return cvMat;
}
- (id)initWithCVMat:(const cv::Mat&)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];
    CGColorSpaceRef colorSpace;
    if (cvMat.elemSize() == 1)
    {
        colorSpace = CGColorSpaceCreateDeviceGray();
    }
    else
    {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
    CGImageRef imageRef = CGImageCreate(cvMat.cols,           // Width
                                        cvMat.rows,           // Height
                                        8,                    // Bits per component
                                        8 * cvMat.elemSize(), // Bits per pixel
                                        cvMat.step[0],        // Bytes per row
                                        colorSpace,           // Colorspace
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault, // Bitmap info flags
                                        provider,             // CGDataProviderRef
                                        NULL,                 // Decode
                                        false,                // Should interpolate
                                        kCGRenderingIntentDefault); // Intent
    self = [self initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return self;
}

-(cv::Mat)CVRgbMat
{
    cv::Mat tmpimage = self.CVMat;
    cv::Mat image;
    cvtColor(tmpimage, image, cv::COLOR_BGRA2BGR);
    return image;
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo
{
    self.prevImage = img;
    // self.previewView.image = img;
    NSArray *arr = [[faceDetection sharedFaceDetector] detectFaces:img];
    UIImage *overlaid = img;
    for (id r in arr)
    {
        CGRect rect = RECTUNBOX(r);
        // draw each square on top of the previous result so every face stays visible
        overlaid = [utils drawSquareOnImage:overlaid square:rect];
    }
    self.previewView.image = overlaid;
    [self.imgPicker dismissModalViewControllerAnimated:YES];
}
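For completeness, drawSquareOnImage:square: is my own helper; roughly, it round-trips through the category above and draws a rectangle with OpenCV, something along these lines (simplified sketch, not the exact code):

    // Rough sketch of the drawing helper (uses the UIImage/cv::Mat category above)
    + (UIImage *)drawSquareOnImage:(UIImage *)image square:(CGRect)square
    {
        cv::Mat mat = [image CVMat]; // this is where the 90 degree rotation shows up
        cv::rectangle(mat,
                      cv::Point((int)square.origin.x, (int)square.origin.y),
                      cv::Point((int)(square.origin.x + square.size.width),
                                (int)(square.origin.y + square.size.height)),
                      cv::Scalar(0, 255, 0, 255), // green, full alpha
                      4);                         // line thickness
        return [[UIImage alloc] initWithCVMat:mat];
    }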