The best camera is the one you have with you, they say. It’s no surprise then that smartphone manufacturers are pushing the envelope with camera technology. In the bid for a better shot, the new focus is on… focus. Let’s find out how technology is changing autofocus in smartphone cameras.

Why Focus On Focus?

If you’re wondering why they are concentrating on focus, it’s mainly because megapixels don’t matter, as much as advertisers would like you to believe otherwise. In the end, the user is only concerned with the final image, and improving the focus technology draws the eye to the subject better, which makes the photo seem sharper. Of course, smartphones are still mainly used by amateur photographers, so it’s all about improving autofocus technology: fixed-focus lenses result in blurry images, and manual focus is rarely available on smartphones.

Different phone makers have different ideas about how to improve autofocus, and knowing what these technologies do could make a big difference when choosing your next smartphone. So let’s look at how autofocus technology works right now, and how it is improving.

Contrast Detection

  • Used in: Most smartphone cameras today
  • Good for: Situations where your subject is stationary, like someone posing or a landscape scene.
  • Bad for: Moving subjects; low light environments

Most smartphone cameras use contrast detection, either exclusively or in conjunction with another focus technology. In that sense, contrast detection is the base autofocus technology.

Contrast detection is reliant on light conditions. The better lit the scene is, the better it will work. In this type of autofocus, the camera sweeps its lens from the nearest to the farthest focus point and analyzes the pixels at each step. The microprocessor compares pixels to find the point with the maximum difference in contrast, i.e. the difference between the “whiteness” and “blackness” of neighbouring pixels. By doing this across the sweep, it figures out what to focus on.

In the process of going from nearest to farthest, the camera actually passes the point of optimum focus and then has to come back to it. When taking a photo, have you seen your screen go from blurry to focused to blurry, and back again to perfect focus? That’s contrast detection at work.

The speed of contrast detection is dependent on the light conditions and the microprocessor of your camera. If the light is good, the point of contrast is easier to “see”; if the microprocessor is fast, the camera finds that point faster.

However, given the nature of the process, it’s the slowest of all current autofocus technologies. It’s also why this is difficult to use when you or your subject is moving.
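
If you like to think in code, here’s a minimal sketch of that sweep-and-score idea, assuming a hypothetical capture_frame() function that returns a frame for a given lens position. It illustrates the principle only, not any phone’s actual implementation.

```python
# A minimal sketch of a contrast-detection sweep, assuming a hypothetical
# capture_frame(position) that returns a grayscale frame (a 2D numpy array)
# for a given lens position. Not any phone's actual implementation.
import numpy as np

def sharpness(frame):
    """Score contrast as the variance of pixel-to-pixel differences:
    a sharply focused frame has bigger local differences than a blurry one."""
    frame = frame.astype(float)
    return float(np.var(np.diff(frame, axis=0)) + np.var(np.diff(frame, axis=1)))

def contrast_detect_af(capture_frame, positions):
    """Sweep the lens through every position, score each frame,
    and return the position with the highest contrast score."""
    scores = {pos: sharpness(capture_frame(pos)) for pos in positions}
    return max(scores, key=scores.get)

# Example: sweep 50 lens positions from nearest (0.0) to farthest (1.0) focus.
# best_position = contrast_detect_af(capture_frame, np.linspace(0.0, 1.0, 50))
```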

Laser Autofocus

  • Used in: LG G3, LG G4, OnePlus 2, ASUS Zenfone 2 Laser
  • Good for: Taking in-focus photos quickly, or taking photos in low light
  • Bad for: Landscape photos, or when the subject is far away from the camera

The OnePlus 2 and the LG G4 (read our LG G4 review) have been touting their “Laser Focus” cameras in advertising. What does this mean, and why is it better than standard cameras?

The camera unit of these phones is equipped with a laser transmitter and receiver. The phone emits an infrared laser pulse, which bounces off the subject and returns to the receiver, and the receiver notes how long the round trip took. The speed of light is constant, and speed is distance divided by time, so with quick, simple math the camera knows the distance to the subject. Just like that, the subject is brought into focus, and the image is taken. It’s similar to how LIDAR is used in self-driving cars.
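
For the curious, the arithmetic really is that simple. Here’s a toy version of the calculation; real time-of-flight hardware measures these intervals in nanoseconds and is far more involved than this.

```python
# The round-trip arithmetic from the paragraph above, as a toy calculation.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds):
    """The pulse travels to the subject and back, so the one-way distance
    is half of speed multiplied by time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 10 nanoseconds means the subject is about 1.5 m away.
print(distance_from_round_trip(10e-9))  # ~1.499 metres
```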

Lasers are extremely fast, so the whole process happens almost instantly. In fact, of all the autofocus technologies, this is the quickest, but it needs manual input from you: you tap the screen where you want to focus, and that’s roughly where the laser will travel. The upside is that tapping the screen also takes the photo. Combine the two and you have instant, in-focus photos of moving subjects, like this LG demo shows:

Also, since the camera is generating its own light (the infrared laser), the light conditions of your environment don’t matter. The laser will bounce and come back, regardless of how dark it is. Laser autofocus is the best of the current technologies for low-light photos.

The downside? The lasers on these cameras are pretty weak (so as not to harm the subject if the beam accidentally hits an eye). So generally, it works only when the subject is close enough for the laser to bounce back. That means when you’re after a lovely photo of a beautiful lake with mountains in the background, laser focus is useless. Thankfully, all laser autofocus smartphones currently use both laser focus and contrast detection, so when laser focus can’t be used, the camera reverts to contrast detection.

Phase Detection

  • Used in: iPhone 6 and 6 Plus, Samsung Galaxy S5 and above, upcoming Sony phones
  • Good for: Continuously refocusing a scene when a subject is moving
  • Bad for: Burst shots focusing on different subjects, low-light environments

Samsung, Sony, and Apple are all betting on phase detection as the best autofocus technology to complement standard contrast detection. As with laser autofocus, phase detection and contrast detection are used in tandem.

Phase detection is far too complex to explain succinctly, so we will skip over some finer details for a broad overview of how it works. The lens of a camera is curved. When the camera sees a scene, the image coming through the rightmost part of this curve is compared with the image coming through the leftmost part. Both of these images are a little blurred, since they are not perfectly in focus. The sensor calculates the offset between the two images and figures out where they will converge, and that tells it the point of best focus.
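
To make that comparison step a little more concrete, here’s a rough sketch, assuming we already have the one-dimensional intensity profiles seen through the two sides of the lens (both just arrays here). Again, this shows the principle, not how a phone’s sensor is actually built.

```python
# A rough illustration of the phase comparison: slide one profile across the
# other and find the offset where they line up best.
import numpy as np

def phase_shift(left, right):
    """Report the offset at which the two profiles match best. Zero means the
    scene is in focus; the sign and size of the offset tell the lens which way,
    and roughly how far, to move."""
    corr = np.correlate(left - left.mean(), right - right.mean(), mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

# The same bright feature, seen 3 pixels apart through the two sides of the lens:
x = np.arange(100)
left_view = np.exp(-((x - 43) ** 2) / 50.0)
right_view = np.exp(-((x - 40) ** 2) / 50.0)
print(phase_shift(left_view, right_view))  # 3
```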

The technology works much faster than standard contrast detection. Apple refers to its version of phase detection as “Focus Pixels”, and we loved it in our iPhone 6 Plus review, but it’s the same technology. YouTube user Blunty shows how Apple’s phase detection autofocus is superior compared to the iPhone 5s with standard contrast detection:

The problems with phase detection are similar to the problems with contrast detection. Photos taken in low-light environments, or of things where the difference in contrast isn’t high, will appear blurry.

Also, phase detection is fine when the background is more or less constant and the subject moves – like the video above. But it’s not quick to focus when the background and subject are both changing – like trying to take a photo of a kid running across the park, with the background changing from monkey bars to trees to people and so on.

Dual Cameras

Some smartphones have two cameras on the back, like the HTC One M8. One camera captures the photo, with standard contrast detection autofocus on the subject. The second camera’s job is to capture light from all directions, recording objects at depths other than the one the subject is on.

The purpose of this dual-camera setup is to let you shoot now and focus later. The idea is the same as the famed Lytro light-field camera. Every photo your phone takes has both cameras snapping at the same time. Whenever you want, you can go back to that photo and tap on different areas of the scene to focus on a different object.

It’s not exactly autofocus, though. What you get is more like a bokeh effect, a way of artistically blurring out-of-focus light, which you can even emulate with some camera hacks and filters. The effect does look far better with the dual-camera setup, though.

While the ability to focus later is great, the base technology used in your regular shot is still contrast detection — and that means it’s not the most accurate focus you will get, nor can you use it to continuously track a subject and take its photo.
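
If you’re curious what “focus later” looks like in code, here’s a toy sketch. It assumes we already have the photo and a per-pixel depth map from the second camera (both hypothetical arrays); tapping a point keeps pixels at that depth sharp and blurs the rest. Real dual-camera processing is far more sophisticated than this.

```python
# A toy "shoot now, focus later" sketch: given an image and a matching
# per-pixel depth map, re-render the photo so only the tapped depth is sharp.
import numpy as np

def refocus(image, depth, tap_y, tap_x, tolerance=0.1):
    """Keep pixels whose depth matches the tapped point sharp, and swap
    everything else for a blurred copy, faking a change of focus."""
    img = image.astype(float)
    target = depth[tap_y, tap_x]
    # Cheap blur: average each pixel with its four shifted neighbours.
    blurred = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
               np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 4
    out_of_focus = np.abs(depth - target) > tolerance
    result = img.copy()
    result[out_of_focus] = blurred[out_of_focus]
    return result

# Tapping pixel (120, 80) re-renders the shot focused on whatever sits at that depth:
# refocused = refocus(photo, depth_map, 120, 80)
```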

Which Is the Best Autofocus?

Much like how an octa-core isn’t always better than a quad-core, phase detection isn’t always better than laser autofocus, nor is a dual camera always better than phase detection, and so on.

The best thing you can do before you purchase a new smartphone is to try it out, and ask users what they think of their camera’s autofocus.

So, if you use a phone with phase detection, laser focus, or dual cameras, tell us how it has made a difference for you, and what you like or dislike about it.

Image credit: FirmBee / Pixabay

  1. Jason Roof
    June 27, 2016 at 6:40 pm

    Hey, just an FYI... The laser focus on the LG phone cameras does not operate the way you are saying; it doesn't send out the laser and receive it back using the constant speed of light to back-calculate the distance.

    It scatters a single laser beam into numerous tiny points of light over the scene of the picture; the camera can then use any of these light points as a spot to focus on, depending on where you tap the camera screen... Focusing on a small point on the subject is just as useful as focusing on the actual subject.

  2. mpepe3
    May 9, 2016 at 8:01 am

    Hi everyone,

    Thank you for the post, it is really interesting to know the different technologies behind the cameras. However, one detail is missing: manual focus is possible on some smartphones. For example, the Nokia Lumia 1020 and other models in the Lumia series can be focused by means of a manual mode.

    I think from the artistic point of view, it is very important to take this feature into account.

    I have the Lumia 1020 and this feature is really useful; one of the big limitations of this camera is the closest distance at which it can get an object in focus. Nevertheless, I hope this comment will help you.

  3. Israr Khan
    September 5, 2015 at 3:20 pm

    MakeUseOf is a very useful site. It provides a lot of useful information.

    • Mihir Patkar
      September 5, 2015 at 4:09 pm

      Thanks for the kind words, Israr! :)

  4. Courtney Stacey
    September 4, 2015 at 3:08 pm

    I have the LG G4 and am constantly amazed at this camera. On a recent trip, my expensive DSLR quit working. I barely noticed. The photos I shot with the G4 were amazing. I took mostly mountain landscapes, kids' shots, and group snaps; but that is a pretty good variety... I used the stock camera app in every mode from simple to manual... and I did try several third-party apps, like Google Camera, Camera FX Pro, HDR Camera, and Vignette. The results were surprisingly good all around... There is still a small but noticeable shutter lag, plus low light and long exposure are iffy... Then there is that pesky fact of a fixed lens... So, I won't be giving up my DSLR (once it is fixed or replaced), especially in professional situations (I occasionally shoot weddings and events). When I really want to take that best shot I will be certain to bring my trusty Sony Alpha... but that said, in all other day-to-day situations I do think it will only be a rare time when I am disappointed that the "Camera I Have with Me" is the G4.

    • Mihir Patkar
      September 4, 2015 at 3:29 pm

      That's great to hear, Courtney! Can you shed some light on the laser focus in particular? Do you notice it, has it helped?
