On Apple's iOS 6.0 feature page, it used to say:

"Take advantage of the built-in camera’s advanced features. New APIs let you control focus, exposure, and region of interest. You can also access and display faces with face detection APIs, and leverage hardware-enabled video stabilization."
This text has since been removed, and I can't find any new methods in the API for controlling exposure. In the class AVCaptureDevice, under "Exposure Settings", there is no new property or method for iOS 6.0. Do you know where I can find the new exposure features in the API?
As a follow-up to Michael Grinich's excellent information, I found that there is an order dependency on some of the calls in the private API. To use "manual" exposure controls, you have to enable them before you set the mode.
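A minimal sketch of that ordering, assuming the class-dump-derived names discussed in the other answer (every key and the numeric mode constant here are assumptions about private, undocumented API, and will get an app rejected from the App Store):

```objc
#import <AVFoundation/AVFoundation.h>

// All property names here come from a class-dump and are assumptions;
// KVC is used so no private selector signature has to be guessed.
static void enableManualExposure(AVCaptureDevice *device) {
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) return;

    // Order matters: enable the manual controls *before* switching the mode.
    [device setValue:@YES forKey:@"manualExposureSupportEnabled"]; // assumed key
    [device setValue:@(1.0f) forKey:@"exposureGain"];              // assumed key

    // Only then select the (private) manual exposure mode; 3 is a guess at
    // the undocumented constant beyond off/auto/continuous.
    [device setValue:@3 forKey:@"exposureMode"];

    [device unlockForConfiguration];
}
```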
All of this is demonstrated in iOS-ManualCamera.
Starting with iOS 8.0, this is now finally possible.
See setExposureModeCustomWithDuration etc. in the Apple documentation.
Here is an article discussing how to use the APIs.
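A minimal sketch of the public iOS 8 API, setExposureModeCustomWithDuration:ISO:completionHandler: (the duration and ISO values below are placeholders; they must be clamped to the range the active format reports, as shown):

```objc
#import <AVFoundation/AVFoundation.h>

static void setCustomExposure(AVCaptureDevice *device) {
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) return;

    if ([device isExposureModeSupported:AVCaptureExposureModeCustom]) {
        // Placeholder values; clamp to the active format's supported range.
        CMTime duration = CMTimeMakeWithSeconds(1.0 / 60.0, 1000000000);
        duration = CMTimeMaximum(duration, device.activeFormat.minExposureDuration);
        duration = CMTimeMinimum(duration, device.activeFormat.maxExposureDuration);
        float iso = MAX(device.activeFormat.minISO,
                        MIN(100.0f, device.activeFormat.maxISO));

        [device setExposureModeCustomWithDuration:duration
                                              ISO:iso
                                completionHandler:^(CMTime syncTime) {
            // The settings take effect on the frame timestamped syncTime.
        }];
    }
    [device unlockForConfiguration];
}
```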
It's true that there is an -exposureMode property on AVCaptureDevice, but that's only for setting the mode (off/auto/continuous) and not the actual f-stop, shutter speed, or ISO. Camera apps that provide "exposure" control all seem to do it through post-processing.

However, it seems there are undocumented APIs in the framework to do this. Check out the full headers for AVCaptureDevice.h (via a class-dump) and note the private exposure-related methods there. My guess is that gain is the equivalent of f-stop (the aperture is fixed) and duration is shutter speed. I wonder if these are used for the iPhone 5's low-light boost mode.

You can also use otool to poke around and try to piece together the symbols. There's likely a new constant in exposureMode for enabling manual control, and exposureDuration seems like it has flags too. When calling these, make sure to use the new -isExposureModeSupported: and also call -respondsToSelector: to check compatibility.

As always, using private APIs is frowned upon by Apple and is cause for rejection from the App Store. There might be ways around this, such as hiding the calls using -performSelector: or objc_msgSend with ROT13 strings or something, since I'm pretty sure they only do static analysis on the app binary.

It looks like they've updated the linked text; it no longer mentions new APIs for exposure:
There is an opt-in low-light boost mode for iPhone 5, detailed here by Jim Rhoades (and in this developer forum post, log-in required).
I've managed to 'trick' the camera into running a shorter exposure time, but I suspect it will only be of use to those doing similar (macro) image acquisition.

1. Set up the AVCaptureDevice to use AVCaptureExposureModeContinuousAutoExposure and turn the flash on in torch mode.
2. Call unlockForConfiguration and set up a key-value observer to watch for adjustingExposure to finish.
3. Re-lock the device, switch to AVCaptureExposureModeLocked, and turn off the torch.

This has the effect of brute-forcing a shorter shutter speed than the camera would select for the un-illuminated scene. By playing with the torch level I can get any relative shutter speed I want (it would of course be best to leave the torch on, but in my application it produces glare on the subject). Again, this only really works when your object distance is very close (less than, say, 6 inches), but it has let me eliminate hand-shake blurring in my close-up images. The downside is that the images are darker, since I don't have a way of spoofing the camera gain, but that's not a problem in my particular application.
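The steps above can be sketched as follows (a hedged outline, assuming a device that has a torch and is already attached to a running session; error handling is minimal for brevity):

```objc
#import <AVFoundation/AVFoundation.h>

@interface ShortExposureTricker : NSObject
@property (nonatomic, strong) AVCaptureDevice *device;
@end

@implementation ShortExposureTricker

- (void)beginShortExposure {
    NSError *error = nil;
    if (![self.device lockForConfiguration:&error]) return;

    // 1. Light the scene with the torch so auto-exposure meters a bright
    //    scene and settles on a short exposure duration.
    if (self.device.hasTorch) {
        self.device.torchMode = AVCaptureTorchModeOn;
    }
    self.device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    [self.device unlockForConfiguration];

    // 2. Watch adjustingExposure; it flips to NO once metering settles.
    [self.device addObserver:self
                  forKeyPath:@"adjustingExposure"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingExposure"] &&
        !self.device.isAdjustingExposure) {
        [self.device removeObserver:self forKeyPath:@"adjustingExposure"];

        // 3. Lock in the short exposure, then kill the torch; the camera
        //    keeps the shutter speed it chose for the illuminated scene.
        NSError *error = nil;
        if ([self.device lockForConfiguration:&error]) {
            self.device.exposureMode = AVCaptureExposureModeLocked;
            if (self.device.hasTorch) {
                self.device.torchMode = AVCaptureTorchModeOff;
            }
            [self.device unlockForConfiguration];
        }
    }
}

@end
```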