Thinking about buying the Google Pixel smartphone? Beware that the Pixel conceals a secret: artificial intelligence (AI). While Google advertised a lot of ground-breaking features, it also stuffed in a few unadvertised secrets. So what lurks inside of the Pixel?
In short, you get pocket-AI, cloud storage, and more. But let’s start with what Google advertised in its first promotional video:
According to Google’s marketing department, the Pixel comes with the world’s best smartphone camera, a 3.5 mm audio jack, and the mysterious Google Assistant.
Artificial Intelligence Comes to Smartphones
AI drives the Pixel’s uniqueness. Before 2016, AI and smartphones remained separate from one another. But thanks to Google’s acquisition of three AI firms and Motorola’s ATAP division, the road seems paved toward pocket AI. With the Pixel, civilization hurtles toward a future in which AI fits in your pocket. And that learning machine listens to everything you do.
Modern AI, such as that found in the Pixel, relies on a technology known as machine learning. Google’s machine learning techniques relate to its acquisition of the AI research firm DeepMind. DeepMind developed AlphaGo — the AI that beat Go champion Lee Sedol. And that means a technology capable of outsmarting a human may now inhabit your pocket. But how has Google adapted AI to fit inside the Pixel?
The Pixel’s upgraded personal assistant, Google Assistant, is smart. Like AlphaGo, it learns from observing human behavior. Based on how you use the Pixel, the assistant recommends actions. For example, using telemetry data, it calculates when you leave for work, forecasting traffic and suggesting alternate routes. Other times, Assistant peeks at your photo roll and suggests stitching together bursts of pictures, creating a stabilized image. Sometimes it even automatically applies stylish filters to photos.
Here’s an example of an automatically applied filter:
Check out this automatically-created animated GIF:
Assistant’s self-learning abilities include (at least) five categories. If enabled, Assistant displays related notifications (Google refers to these as Cards) whenever it senses that the user needs them. These kinds of cards include:
- Weather — Detects your region and forecasts weather for the day.
- Work Commute — Detects driving conditions, suggests alternative routes, and estimates your time of arrival.
- Next Meeting — Assistant can also hook into your calendar and sense upcoming meetings.
- Reminders — If you ever set a reminder using Assistant, Google creates a virtual card, which displays itself at a specified time or place.
- News — Based on your particular news interests, such as sports or technology, Assistant pulls up relevant cards whenever it detects you need them.
Ignore the marketing. Assistant will come to all Google products. The reasons range from the practical to the frightening. For the most part, Google wants Assistant to offer conversational capabilities. But there’s a less comfortable reason, too: training a neural net requires tremendous amounts of raw data, and every Assistant-enabled device feeds Google more user telemetry.
It Knows What You’re Talking About
Another Google Assistant feature worth mentioning: contextual processing. Older personal assistants work great for asking a single question. But they’re terrible at continuing a dialog. Assistant closes the gap between a conversant AI and one-shot assistants by carrying context from one query to the next.
For example, if I ask Assistant for directions to the supermarket and then correct myself by saying “I meant to the nearest bank,” Assistant places that question within the context of the conversation. It then intuits that the user wants to issue a navigation correction. The process feels seamless, without the stutter of the previous generation of personal assistants.
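To illustrate the idea, here is a toy sketch of contextual query handling in Python. It is a deliberate oversimplification (nothing here resembles Google’s actual implementation), but it shows how remembering the last intent lets a follow-up correction supply only the changed detail:

```python
# Toy sketch of contextual query handling. The agent remembers the last
# intent, so a correction only has to supply the changed "slot" rather
# than restate the whole request.

class ToyAssistant:
    def __init__(self):
        self.last_intent = None  # e.g. {"action": "navigate", "destination": ...}

    def handle(self, utterance):
        if utterance.startswith("directions to "):
            # A fresh request: record both the action and its slot.
            self.last_intent = {"action": "navigate",
                                "destination": utterance[len("directions to "):]}
        elif utterance.startswith("I meant ") and self.last_intent:
            # A correction: reuse the previous action, swap only the slot.
            self.last_intent["destination"] = utterance[len("I meant "):]
        return self.last_intent

assistant = ToyAssistant()
assistant.handle("directions to the supermarket")
result = assistant.handle("I meant the nearest bank")
# result keeps the "navigate" action but updates the destination.
```

A production assistant would use a learned language model to classify intents and extract slots; the string matching here merely stands in for that step.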
To date, only a few Google products include Assistant: the Pixel, Allo, and Google Home. A few other apps, such as the Photos app, include Assistant as a feature. However, the extent of that integration remains unclear.
Free Cloud Storage and Smart Storage
Google furnishes the Pixel with free and unlimited cloud storage for photo backups. Combined with Smart Storage, you can take an unlimited number of photographs without running out of space. It works like this: based on how you’ve configured your Pixel, your phone might delete images every 30, 60, or 90 days — but only if your photos have been backed up. Plugging your phone into a power source triggers the backup process.
Configuring Smart Storage
To configure Smart Storage, navigate to Settings and then choose Storage. In the Storage menu, you can toggle both the backup frequency and whether or not a backup occurs. Unfortunately, Google does not integrate (nor does it permit integration) with third-party cloud storage solutions like OneDrive.
Because Smart Storage links into Google Photos, you get unlimited photo backups. Unlike other smartphones, the Pixel backs up photos at full resolution. The unlimited storage that comes with other services usually means the photos are compressed and of diminished quality.
One of the creepiest features of the Google Pixel is its photo app. Once you snap a picture of friends, Google uploads your photos to the cloud, and its AI applies a facial recognition algorithm. Each and every person in your album can receive a tag (or name). You can then search your library for all pictures of a specified individual.
But that also means Google keeps a catalog of not just your identity, but the identities of your friends (and random strangers who blundered into a shot). The potential consequences remain unknown — but the risk of abuse chills even the staunchest of technology advocates.
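Under the hood, face tagging of this sort typically reduces to comparing numeric “embeddings” produced by a neural network: faces of the same person land close together in that space. The following Python sketch fakes the final matching step with hand-picked vectors and a made-up threshold; real systems compute the embeddings with deep networks:

```python
import math

def nearest_tag(embedding, labeled, threshold=0.5):
    """Toy face-tagging sketch: match a face embedding to the closest
    known person by Euclidean distance, or return None if no one is
    close enough. The vectors and threshold here are purely illustrative."""
    best_name, best_dist = None, float("inf")
    for name, known in labeled.items():
        dist = math.dist(embedding, known)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

labeled = {"Alice": [0.1, 0.9, 0.3], "Bob": [0.8, 0.2, 0.5]}
print(nearest_tag([0.12, 0.88, 0.31], labeled))  # Alice
print(nearest_tag([0.5, 0.5, 0.9], labeled))     # None (no close match)
```

The threshold is the privacy-relevant knob: set it loosely and strangers get misfiled under your friends’ names; set it tightly and the catalog grows a new identity for every ambiguous face.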
If getting access to a machine assistant wasn’t enough, Google also borrowed a feature from the Amazon Fire Phone: direct access to a human assistant. If your Pixel ever malfunctions — or you just need some advice — a human operator can help with just the touch of a button.
Hidden Pixel Features
In addition to AI, the Pixel throws in a few hidden features, which the user needs to enable or discover. These include gesture support, a red screen-tinting feature, and a hidden red-green-blue (RGB) notification LED.
Night Mode (AKA Night Light)
The Lighting Research Center’s study on light exposure found that red light reduces insomnia. Fortunately, Android 7.0 added the ability to automatically redden display color at night. Unfortunately, Google chose to remove this feature in Android 7.1.1 — except for Pixel users. It’s silly that Google pulls useful features from Android, but it helps differentiate its product from its (now) competitors.
Keep in mind that Night Mode requires activation. Turning it on is a simple process.
First, go to Settings and select Display. Then go to Night Light and turn it on. You can also configure the hours in which the display tints amber. However, the default settings should match your body’s natural circadian rhythms.
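The tinting itself is conceptually simple: attenuate each pixel’s blue (and, to a lesser degree, green) output so the display shifts toward amber. This Python sketch is a rough illustration of the idea, not Android’s actual implementation:

```python
def night_tint(rgb, strength=0.5):
    """Warm an (r, g, b) color by attenuating blue (strongly) and
    green (mildly). `strength` runs from 0.0 (no change) to 1.0
    (maximum warming). The attenuation factors are illustrative."""
    r, g, b = rgb
    return (r,
            round(g * (1 - 0.3 * strength)),
            round(b * (1 - 0.8 * strength)))

print(night_tint((255, 255, 255), strength=1.0))  # white shifts toward amber
```

At full strength, pure white keeps its red channel intact while blue drops the most, which is exactly the amber cast the Night Light screen shows.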
The Hidden RGB LED Light
Among its unadvertised features, the Pixel throws in a secret notification LED, which saves on both battery life and screen lifespan. To enable it, navigate to Settings > Notifications and tap the gear icon in the upper-right corner of the screen. Then enable the Pulse notification light. From then on, whenever you receive a notification, the LED blinks.
I haven’t figured out exactly why the light flashes different colors, but redder light seems to indicate more important notifications.
Gesture or Motion Activated Features
Google co-opted motion and gesture controls from Motorola — it calls the feature Move. The gesture controls allow the user to launch certain apps or features by physically moving the phone. Move includes, as of 2016, five different actions: fingerprint scanner gestures, jump-to-camera, twist-for-selfie, lift-to-check-phone, and double-tap to wake.
Fingerprint Scanner Notifications Control
Another feature most users won’t know about hides inside of the fingerprint scanner: the scanner doubles as a notifications toggle. Sliding one’s finger down over the scanner displays the notifications tray. A single swipe up dismisses the tray.
Here’s a video of how to enable the feature:
LG’s flagship smartphones can launch their cameras with a double-click of the power button. The Pixel borrows this feature, helping you get to your camera app quickly so you never miss a shot.
Like the Moto X, the Pixel can switch between cameras with ease. It works like this: after opening the Camera app, make a double-twist motion while holding the Pixel to launch the selfie camera. The Pixel then switches from the rear-facing camera to the front-facing camera.
Lift-to-Check-Phone and Double-Tap to Wake
Ambient Display turns the screen on temporarily to show notifications. On an AMOLED panel, this saves battery life and helps prevent screen burn-in.
Newer Android devices usually include Ambient Display by default. However, the Pixel adds two additional features. Lift-to-check-phone allows the Pixel to switch its screen on whenever it senses that someone has picked the phone up. Double-tap to wake spares the physical power button by waking the device by double-tapping on the screen.
Here’s a video demonstrating the new features:
Google Daydream View
Another interesting feature is its virtual reality (VR) platform: Daydream View. It’s a mostly-cloth headset with a remote that allows you to get lost in a virtual world using the Daydream View app. Here’s a breakdown and explanation of the Daydream View in action:
“Improved” Aesthetic Design
To distinguish between conventional Android devices and a Pixel, Google added some distinct visual flourishes.
The Pixel’s AI-driven software can choose wallpaper designs for its user, along with custom system sounds. Here’s a video demonstrating the wallpaper picker in action:
Early leaks indicated that the Pixel was finally adding a dark theme. Unfortunately, these rumors turned out to be bunk. What we got was ridiculously underwhelming. Google turned the System User Interface blue, instead of cyan. Here’s a comparison between what you get in stock Android and what you get in a Pixel:
Unfortunately, blue AMOLED subpixels degrade faster than the other colors, which makes Google’s choice of blue icons for the settings menu puzzling. Even so, the differences are minimal at best.
Why Did Google Put Special Features in the Pixel?
No one knows for sure. It seems that Google wants to move to where the real money lies: flagship smartphones. And no company makes more money per smartphone than Apple. Unfortunately, Apple’s model revolves around proprietary standards. Unlike Apple, Google doesn’t sell 19 dongles.
However, it has moved its Night Light technology out of Android’s open source ecosystem. Given this baby-step away from open standards, should you trust Google by putting its learning machine in your pocket?
Have you discovered any hidden features in the Pixel? If so, which exclusive features hooked you? Please let us know in the comments!
Image Credit: Oleg1969 via Shutterstock.com | Cryteria via Wikimedia Commons