Android L Camera App UI, Features and New API for Developers

July 15, 2014

Google has announced its upcoming, heavily revamped mobile operating system, Android L. The new operating system comes with many improvements, some of which relate to the camera app's user interface, its features, and an API that lets developers control much of the camera's functionality from code.

Android L Camera App UI

Android L has gone through a massive redesign, freshening up its user interface with what Google calls "Material Design". The changes touch every UI element, including typography, iconography, and colors, and they apply to the camera app as well.

On the lock screen, you can already see the camera icon at the bottom-right corner; just hold and slide it to the left to launch the camera app. This gives you quick access to the camera when you need to grab an important shot in a hurry. On the main camera screen, slide your finger from the left edge towards the middle to reveal the default camera modes: Photo Sphere, Panorama, Lens Blur (more on that later), Camera (the still camera itself) and Video (to capture video), plus a Settings icon at the bottom-right.

Tapping the Settings icon brings up the settings window, where you can choose from:

  • Resolution & Quality (resolution for the rear and front camera if available, video resolution, panorama resolution, lens blur quality)
  • Save Location (on/off switch; saves the geo-location data with the photo)
  • Advanced (Manual exposure on/off switch)
  • About (build version, Open source licenses, send feedback)

So what does each of the camera modes do?

  • Photo Sphere – hold the device in a fixed position and tilt or turn it to capture a 360-degree photo sphere image
  • Panorama – as the name suggests, allows you to shoot panoramic images by moving the camera in a sweep movement from one side to another to capture an image with a much larger field of view
  • Lens Blur – creates an image whose subject stands out against a much more prominently blurred background, similar to what you get with a large-sensor camera and a fast lens. To get the best results, make sure the background is as far behind the subject as possible.

Once the image has been taken, you can edit and share it in playback mode. This includes deleting the image, applying special effects, and increasing or decreasing the lens blur effect using a slider (yes, even after you've taken the image).

Google will probably continue to work on the app and improve it in various ways, so it might look a bit different from what you see in this video posted by YouTube user Tech1Tv.

Android L Camera2 API (for developers)

No doubt simplicity is what all mobile operating systems are striving for, and Android L's UI is no different. Google wants developers to make amazing apps that utilize the phone's camera, and although the default camera app is simplistic by design, it offers advanced manual controls with full control over the exposure. The new Camera2 API lets developers adjust the shutter speed (exposure time), set the tonemap curve, tweak the auto-exposure and auto white balance, capture raw sensor data, change the ISO sensitivity, increase the burst speed, and so on.

Here’s a full list of the available manual controls for the Camera2 API for Android L:

  • Exposure time
  • ISO sensitivity
  • Frame duration
  • Lens focus distance
  • Flash trigger
  • Color correction matrix
  • JPEG metadata
  • Tonemap curve
  • Crop region
  • HDR+ (less blurry image, less noise and improved exposure)

And the auto-mode controls:

  • AE/AF/AWB mode
  • AE/AWB lock
  • AF trigger
  • Precapture AE trigger
  • Metering regions
  • Exposure compensation
  • Target FPS range
  • Capture intent
  • Video stabilization
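To give a feel for how these controls fit together, here is a minimal sketch of building a manual still-capture request with the Camera2 API (API level 21+). It assumes a `CameraDevice` and target `Surface` have already been opened and configured elsewhere; the exposure, ISO and focus values are illustrative examples, not recommendations.

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.view.Surface;

public class ManualCaptureSketch {
    // Builds a still-capture request with manual exposure, ISO and focus.
    // `camera` and `surface` are assumed to be set up elsewhere in the app.
    static CaptureRequest buildManualRequest(CameraDevice camera, Surface surface)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        builder.addTarget(surface);

        // Turn the auto modes off so the manual values below take effect.
        builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
        builder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF);

        builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10000000L); // 10 ms, in nanoseconds
        builder.set(CaptureRequest.SENSOR_SENSITIVITY, 400);         // ISO 400
        builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 2.5f);       // in diopters

        return builder.build();
    }
}
```

The request would then be handed to a capture session; this fragment only shows how the manual keys from the list above map onto `CaptureRequest` settings.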

The new API also supports much faster burst shooting. On the Nexus 5, for example, the Camera2 API can shoot 30 fps at 8MP resolution, compared to the older API's 1–3 fps at 8MP, so everything is far better optimized for high performance. In the previous API the manual control settings lived in a separate layer; with the new Camera2 API the settings are part of each captured image, allowing much faster processing. The camera can therefore fully process images at about the same speed the phone captures them.

Developers also have access to a wide range of camera characteristics, including lens focal length, sensor color space and dynamic range, sensor dimensions, device capabilities, flash state, lens state, neutral color point, detected faces, etc. So with the new API, developers can build more advanced and engaging camera apps that improve upon the default camera app that comes with each Android device. I am personally really excited about the ability to capture the sensor's raw data (aka digital negative).
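As a sketch of how that characteristic information is queried, the Camera2 API exposes it through `CameraCharacteristics` keys. This fragment assumes an Android `Context` is available and uses camera id "0", which is usually the rear-facing camera; it only prints a few of the many available keys.

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.util.Range;
import java.util.Arrays;

public class CharacteristicsSketch {
    // Reads a few static characteristics of camera "0" (usually the rear camera).
    static void dumpCharacteristics(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics chars = manager.getCameraCharacteristics("0");

        float[] focalLengths =
                chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
        Range<Integer> isoRange =
                chars.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
        Range<Long> exposureRange =
                chars.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);

        System.out.println("Focal lengths (mm): " + Arrays.toString(focalLengths));
        System.out.println("ISO range: " + isoRange);
        System.out.println("Exposure time range (ns): " + exposureRange);
    }
}
```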

Android L, with its new Camera2 API and UI changes, brings an enhanced multimedia experience that will certainly take the camera experience of any Android L-based device to a new level. A DSLR level, perhaps?

I also wanted to talk more about the Lens Blur feature. It is one of my favorites, as it gives photographers the ability to capture subjects with a very shallow depth of field effect. This is done with a Google technology that uses computer vision algorithms to create a 3D depth map. The camera then combines the depth map (black means a close subject, white a far one, with intermediate tones indicating distance) with the original photo to produce an image that simulates the shallow depth of field effect. More information about the new Lens Blur feature in the Google Camera app can be found here.
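The core idea of depth-map-driven blur can be illustrated with a toy example. This is not Google's actual algorithm, just a minimal sketch: given a grayscale image and a per-pixel depth map (0.0 = near, 1.0 = far), each pixel is box-blurred with a radius proportional to its distance from a chosen focus depth, so in-focus pixels stay sharp while distant ones get averaged with their neighbors.

```java
// Toy illustration of depth-driven blur (not Google's real Lens Blur algorithm).
public class LensBlurSketch {
    // image: grayscale values; depth: 0.0 = near, 1.0 = far;
    // focusDepth: the depth that should stay perfectly sharp.
    public static double[][] applyLensBlur(double[][] image, double[][] depth,
                                           double focusDepth, int maxRadius) {
        int h = image.length, w = image[0].length;
        double[][] out = new double[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // Pixels far from the focal plane get a larger blur radius.
                int r = (int) Math.round(maxRadius * Math.abs(depth[y][x] - focusDepth));
                double sum = 0;
                int count = 0;
                for (int dy = -r; dy <= r; dy++) {
                    for (int dx = -r; dx <= r; dx++) {
                        int ny = y + dy, nx = x + dx;
                        if (ny >= 0 && ny < h && nx >= 0 && nx < w) {
                            sum += image[ny][nx];
                            count++;
                        }
                    }
                }
                out[y][x] = sum / count; // box average over the depth-scaled window
            }
        }
        return out;
    }
}
```

A pixel whose depth equals the focus depth gets radius 0 and is copied through unchanged, which is exactly the "sharp subject, blurry background" behavior the feature produces.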

It's worth mentioning that the problem starts with the fact that most mobile phone cameras have a small sensor, which leads to a deep depth of field: objects far from the camera still appear very sharp. Sometimes that's exactly what you want, but for certain types of photos, like portraits/selfies or subjects with a distracting background, it isn't. To let mobile photographers enjoy this effect the way DSLR photographers do, Google achieved it digitally instead of optically, and the results are no less convincing than what you get with a DSLR. In fact, you can push the lens blur effect to a very high degree, something you could otherwise only achieve with a large-sensor camera and a very fast lens.

I am already eager to see how the new Android L OS will run on current and future devices, and I am very happy to see how good the improvements are for the camera app in particular and for the Android OS in general.
