Here’s what goes on behind-the-scenes with Motion Photos on the Pixel 2

14th March 2018

More than meets the eye.

The Pixel 2's camera continues to be in a league of its own, and not a day goes by that it doesn't impress me. I still haven't messed around much with its Motion Photos feature, but after reading through Google's behind-the-scenes look at the technology used to pull it off, that may begin to change.

When Motion Photos was announced, I personally just saw it as Google playing catch-up with Apple's "Live Photos" on iOS. Capturing a couple of extra seconds of footage along with a still image is a neat idea, but Google's actually doing a lot more than simply recording a scene prior to hitting the shutter button.

With Motion Photos enabled on the Pixel 2, taking a picture also records motion metadata derived from the phone's gyroscope and the optical image stabilization (OIS) system in its camera. Software combines the readings from these two components, and by pairing that hardware data with software-based stabilization, Google can greatly reduce the camera shake found in these short clips.
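Roughly speaking, that metadata lets the phone compute, for each frame, how far the camera moved and re-draw background pixels as if it hadn't. Here's a minimal sketch of the idea, assuming a simple in-plane rotation model; the function name, the single-axis rotation, and the OIS values are illustrative assumptions, not Google's actual pipeline:

```python
import math

def stabilization_offset(gyro_angle_rad, ois_shift_px, point):
    """Approximate where a background point should be re-drawn so this
    frame aligns with a reference frame.

    gyro_angle_rad: in-plane camera rotation since the reference frame
                    (integrated from gyroscope samples).
    ois_shift_px:   (dx, dy) lens shift already applied by OIS hardware.
    point:          (x, y) pixel coordinate relative to the image center.
    """
    x, y = point
    c, s = math.cos(gyro_angle_rad), math.sin(gyro_angle_rad)
    # Undo the in-plane rotation measured by the gyroscope...
    rx, ry = c * x + s * y, -s * x + c * y
    # ...then subtract the shift the OIS hardware already compensated,
    # so the software correction doesn't double-count it.
    ox, oy = ois_shift_px
    return rx - ox, ry - oy

# A point at (100, 0) with one degree of shake and a small OIS shift:
print(stabilization_offset(math.radians(1.0), (2.0, -1.0), (100.0, 0.0)))
```

A real implementation would apply a full 3-D rotation per gyroscope sample and warp the whole frame on the GPU, but the principle is the same: known hardware motion in, per-pixel correction out.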

Before (left) and after (right) Motion Photos' stabilization

Per Google's Research Blog:

For motion photos on Pixel 2 we improved this classification by using the motion metadata derived from the gyroscope and the OIS. This accurately captures the camera motion with respect to the scene at infinity, which one can think of as the background in the distance. However, for pictures taken at closer range, parallax is introduced for scene elements at different depth layers, which is not accounted for by the gyroscope and OIS.

Once this system determines how much background movement there is in a Motion Photo:

We determine an optimally stable camera path to align the background using linear programming techniques outlined in our earlier posts. Further, we automatically trim the video to remove any accidental motion caused by putting the phone away. All of this processing happens on your phone and produces a small amount of metadata per frame that is used to render the stabilized video in real-time using a GPU shader when you tap the Motion button in Google Photos.
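The auto-trimming step is easy to picture: once you have a per-frame motion magnitude, frames at the tail with suddenly large motion are a good hint the phone was being lowered. Here's a toy version of that idea (the threshold and the motion measure are my assumptions; Google doesn't publish its exact heuristic):

```python
def trim_accidental_motion(per_frame_motion, threshold=5.0):
    """Return the index at which to cut the clip, dropping trailing
    frames whose motion magnitude suggests the phone was being put away.

    per_frame_motion: per-frame camera motion magnitudes, e.g. pixels of
    background displacement derived from the gyro/OIS metadata.
    """
    end = len(per_frame_motion)
    while end > 0 and per_frame_motion[end - 1] > threshold:
        end -= 1
    return end

# A steady shot, then a burst of large motion as the phone is lowered:
motion = [0.4, 0.6, 0.5, 0.7, 12.0, 18.0, 25.0]
print(motion[:trim_accidental_motion(motion)])  # keeps the steady frames
```

Because all of this runs on the phone and only a little metadata is stored per frame, the stabilized clip can be rendered on the fly by a GPU shader rather than being re-encoded.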

Before (left) and after (right) Motion Photos' stabilization

As you can see from the GIFs above, the end result of this process is pretty darn incredible – and all of it happens in the background using the power of software.

Motion Photos are turned on by default on the Pixel 2, and you can share them as video clips and high-resolution GIFs right within the Google Photos app.