The best camera is the one you have with you, they say. It's no surprise then that smartphone manufacturers are pushing the envelope with camera technology. In the bid for a better shot, the new focus is on... focus. Let's find out how technology is changing autofocus in smartphone cameras.

Why Focus On Focus?

If you're wondering why manufacturers are concentrating on focus, it's mainly because megapixels don't matter as much as advertisers would like you to believe. In the end, the user only cares about the final image, and better focus technology draws the eye to the subject, which makes the photo seem sharper. And since smartphones are still mostly used by amateur photographers, it's all about improving autofocus: fixed-focus lenses produce blurry images, and manual focus on a phone screen is fiddly at best.

Different phone makers have different ideas about how to improve autofocus, and knowing what these technologies do could make a big difference when deciding on your next smartphone. So let's look at how autofocus technology works right now, and how it is improving.

Contrast Detection

  • Used in: Most smartphone cameras today
  • Good for: Situations where your subject is stationary, like someone posing or a landscape scene.
  • Bad for: Moving subjects; low light environments

Most smartphone cameras use contrast detection, either exclusively or in conjunction with another focus technology. In that sense, contrast detection is the base autofocus technology.

Contrast detection relies on light conditions: the better lit the scene is, the better it works. In this type of autofocus, the camera sweeps its focus from the nearest point to the farthest and analyzes the pixels at each step. The microprocessor compares neighbouring pixels to find the focus position with the maximum difference in contrast, i.e. the difference between the "whiteness" and "blackness". The position where that difference peaks is where the scene is sharpest, so that's where the camera settles.

In the process of going from nearest to farthest, the camera actually passes the point of optimum focus and then has to come back to it. When taking a photo, have you seen your screen go from blurry to focussed to blurry, and back again to perfect focus? That's contrast detection at work.
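To make that hunting behaviour concrete, here is a minimal Python sketch of a contrast-detection sweep. The `camera.set_focus()` and `camera.capture_frame()` calls are hypothetical stand-ins for whatever the phone's camera driver actually exposes, and the contrast metric is just one simple choice among many a real camera might use.

```python
import numpy as np

def contrast_score(frame):
    """Simple contrast metric: how strongly neighbouring pixels differ.

    Sharper images have larger pixel-to-pixel differences, so a higher
    score means the scene is closer to being in focus.
    """
    gray = frame.astype(float)
    dx = np.diff(gray, axis=1)
    dy = np.diff(gray, axis=0)
    return float((dx ** 2).sum() + (dy ** 2).sum())

def contrast_detect_autofocus(camera, steps=30):
    """Sweep the lens from nearest to farthest and keep the sharpest position.

    `camera` is a hypothetical object exposing set_focus() and capture_frame().
    """
    best_position, best_score = 0.0, -1.0
    for position in np.linspace(0.0, 1.0, steps):  # 0.0 = near, 1.0 = far
        camera.set_focus(position)
        score = contrast_score(camera.capture_frame())
        if score > best_score:
            best_position, best_score = position, score
    # By the end of the sweep the lens has gone past the sharpest point,
    # so it steps back to it -- the blurry-sharp-blurry dance you see on screen.
    camera.set_focus(best_position)
    return best_position
```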

The speed of contrast detection is dependent on the light conditions and the microprocessor of your camera. If the light is good, the point of maximum contrast is easier to "see"; if the microprocessor is fast, the camera finds that point sooner.

However, given the nature of the process, it's the slowest of all current autofocus technologies. It's also why this is difficult to use when you or your subject is moving.

Laser Autofocus

  • Used in: LG G3, LG G4, OnePlus 2, ASUS Zenfone 2 Laser
  • Good for: Taking in-focus photos quickly, or taking photos in low light
  • Bad for: Landscape photos, or when the subject is far away from the camera

OnePlus and LG have been touting the "laser focus" cameras of the OnePlus 2 and the LG G4 (read our review) in their advertising. What does this mean, and why is it better than a standard camera?

The camera unit of these phones is equipped with a laser emitter and receiver. The phone emits an infrared laser pulse, which bounces off the subject and returns to the receiver. The receiver notes the time the pulse took to return. Since the pulse travels at the speed of light, and speed is distance divided by time, a quick bit of math gives the camera the distance to the subject: half the round-trip time multiplied by the speed of light. Just like that, the subject is brought into focus and the image is taken. It's similar to how LIDAR works in self-driving cars.
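For the curious, the math really is a one-liner. This is an illustration rather than anything a phone actually runs, but it shows why the round-trip time alone is enough to find the distance, assuming the infrared pulse travels at roughly the speed of light:

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_time_of_flight(round_trip_seconds):
    """Distance to the subject from the laser pulse's round-trip time.

    The pulse travels out and back, so only half the round trip
    counts towards the subject's distance.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A subject about 1.5 m away returns the pulse in roughly 10 nanoseconds:
print(distance_from_time_of_flight(10e-9))  # ~1.5 metres
```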

Lasers are extremely fast, and the whole process happens almost instantly. In fact, of all the autofocus technologies, this is the quickest, but it needs manual input from you: you tap the screen where you want to focus, and that's roughly where the laser is aimed. The upside is that tapping the screen also takes the photo. Combine the two and you have instant, in-focus photos of moving subjects, as this LG demo shows:

Also, since the camera is generating its own light (the infrared laser), the light conditions of your environment don't matter. The laser will bounce and come back, regardless of how dark it is. Laser autofocus is the best of the current technologies for low-light photos.

The downside? The lasers on these cameras are pretty weak (so they won't harm anyone's eyes if the beam accidentally hits them), which means they generally work only when the subject is close enough for the pulse to bounce back. So when you're after a lovely photo of a beautiful lake with mountains in the background, laser focus is useless. Thankfully, all laser autofocus smartphones currently combine laser focus with contrast detection; when the laser can't be used, the camera reverts to contrast detection.

Phase Detection

  • Used in: iPhone 6 and 6 Plus, Samsung Galaxy S5 and above, upcoming Sony phones
  • Good for: Continuously refocussing a scene when a subject is moving
  • Bad for: Burst shots focussing on different subjects, low-light environments

Samsung, Sony and Apple are all betting on Phase Detection as the best autofocus technology to complement standard contrast detection. Again, like with laser autofocus, both phase detection and contrast detection are used in tandem.

Phase detection is far too complex to explain succinctly, so we will skip over some finer details for a broad overview of how it works. The lens of a camera is curved, and light from the scene passes through both its left and right edges. Dedicated pixels on the sensor compare the image formed by light from the rightmost part of the lens with the image formed by light from the leftmost part. Both images are a little blurred, since they are not perfectly in focus, and they are slightly offset from each other. The sensor measures that offset (the "phase difference") and calculates how far, and in which direction, the lens must move for the two images to line up, which is the point of best focus.
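As a rough illustration of the idea (not the actual sensor firmware, and simplified to a single row of pixels), the sketch below slides one view over the other to find the offset between them. The `phase_offset()` helper and its inputs are hypothetical; the point is that the size and sign of the offset tell the lens how far to move and which way.

```python
import numpy as np

def phase_offset(left_pixels, right_pixels, max_shift=16):
    """Find the shift (in pixels) that best aligns the two views.

    Toy 1-D version: slide one strip over the other and keep the shift
    with the smallest sum of squared differences.
    """
    left = np.asarray(left_pixels, dtype=float)
    right = np.asarray(right_pixels, dtype=float)
    best_shift, best_error = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        error = float(((left - np.roll(right, shift)) ** 2).sum())
        if error < best_error:
            best_shift, best_error = shift, error
    return best_shift  # 0 means the two views already agree: the subject is in focus
```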

The technology works much faster than standard contrast detection. Apple refers to its version of phase detection as "Focus Pixels", and we loved it in our iPhone 6 Plus review, but it's the same technology. YouTube user Blunty shows how Apple's phase detection autofocus outperforms the standard contrast detection of the iPhone 5s:

The problems with phase detection are similar to the problems with contrast detection. Photos taken in low-light environments, or of things where the difference in contrast isn't high, will appear blurry.

Also, phase detection is fine when the background is more or less constant and the subject moves - like the video above. But it's not quick to focus when the background and subject are both changing - like trying to take a photo of a kid running across the park, with the background changing from monkey bars to trees to people and so on.

Dual Cameras


Some smartphones have two cameras on the back, like the HTC One M8. One camera captures the photo, with standard contrast detection autofocus locking onto the subject. The second camera's job is to capture depth information, recording which parts of the scene sit closer to or farther from the camera than the subject.

The purpose of this dual-camera setup is to let you shoot now and focus later. The idea is the same as the famed Lytro light-field camera: both cameras snap at the same time for every photo you take. Whenever you want, you can go back to that photo and tap on different areas of the scene to focus on a different object.
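As a very rough sketch of what "focus later" looks like in software, assume the phone saved the photo alongside a per-pixel depth map from the second camera. Both inputs and the `refocus()` helper are assumptions for illustration, not how any manufacturer actually exposes this; tapping a point simply keeps pixels at a similar depth sharp and blurs the rest.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(photo, depth_map, tap_x, tap_y, sharp_range=0.1):
    """Fake 'focus later': keep pixels near the tapped depth sharp, blur the rest.

    `photo` is an (H, W, 3) image and `depth_map` an (H, W) array of
    relative depths from the second camera -- both hypothetical inputs.
    """
    target_depth = depth_map[tap_y, tap_x]
    blurred = gaussian_filter(photo.astype(float), sigma=(4, 4, 0))
    # 1.0 where the scene is close to the tapped depth, 0.0 everywhere else
    weight = (np.abs(depth_map - target_depth) < sharp_range).astype(float)
    weight = weight[..., np.newaxis]  # broadcast over colour channels
    return (weight * photo + (1 - weight) * blurred).astype(photo.dtype)
```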

It's not exactly autofocus, though. What you get is more like a bokeh effect, a way of artistically blurring everything outside your chosen subject, which you can even emulate with some camera hacks and filters. The effect does look far better with the dual-camera setup, though.

While the ability to focus later is great, the base technology used in your regular shot is still contrast detection -- and that means it's not the most accurate focus you will get, nor can you use it to continuously track a subject and take its photo.

Which Is the Best Autofocus?

Much like how an octa-core processor isn't always better than a quad-core, phase detection isn't always better than laser autofocus, a dual camera isn't always better than phase detection, and so on.

The best thing you can do before you purchase a new smartphone is to try it out, and ask users what they think of their camera's autofocus.

So, if you use a phone with phase detection, laser focus, or dual cameras, tell us how it has made a difference for you, and what you like or dislike about it.

Image credit: FirmBee / Pixabay