Apple’s announcement of Live Photos is a very exciting one for me. I have tried several methods for creating cinemagraphs: still photos in which select elements move. Amateur photographers like me are always looking for new ways to make our images stand out, and cinemagraphs have recently become a popular way to draw attention to an image in a newsletter, in a website banner, or alongside ads that use ‘old school’ static images.

The technique isn’t as popular as it could be because cinemagraphs are time consuming to create. First off, if you want a high quality final result, the embedded media is actually a video rather than an image; you can embed the end result as a GIF instead to save space and reduce page loading time, but at a cost in quality. With a bit of Photoshop skill, you can create this kind of movie or GIF from a short video clip using extensive layer masking and frame looping. That’s pretty time consuming. I tried it with limited success and decided to look for another method.

In 2014, Apple gave Flixel a design award for Cinemagraph Pro. Flixel offers a full-featured app for iOS and Mac that makes it easy to create looping video cinemagraphs. I tried it and found that the final product can be saved as a high quality video or as a low quality animated GIF. The videos can be hosted on a cloud service, with the option of embedding an iframe that plays the clip in a loop without requiring the end user to press a Play button. Here’s an account of my time testing Cinemagraph Pro.

I went out in search of a good subject for a cinemagraph, and my local farmers market provided just the thing in the form of fresh roasted chiles. My process involved setting up a video camera (a Sony a6000 in my case) on a steady tripod and capturing enough frames to loop the video smoothly back on itself. This only took five minutes, but I did get some strange looks while I set up and shot several short clips. Cinemagraph Pro was fairly simple to learn, and within four hours of installing it I had this clip looping over a set still frame in the background.


My chile roasting model was moving a bit more than I’d like, so several frames in the loop don’t match up perfectly. I’m sure that with some practice I’d love what I can get out of this software. However, it isn’t cheap, and neither is the hosting plan for your embedded content. Maybe I can find a more cost-effective way to distribute cinemagraphs, but I’m going to see what I can do with Live Photos first. Here’s another simple cinemagraph I made that represents what I think Apple Live Photos will look like.

Southlands Fire Pit


I captured this one without a tripod and relied on in-camera stabilization instead, just to see what would happen. The camera did a poor job, and the excess motion was impossible to remove in processing, so there is some bad artifacting around the flames. On this count I think an iPhone and Live Photos will do a good job of keeping the frames in sync and stabilized. For better or worse, Live Photos will make shooting clips in portrait mode even more commonplace, and maybe even acceptable?

So what’s going to be possible? Here’s an excerpt from the Apple developer library:

iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object from the PHImageManager object, which is used to represent all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController.
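To make that excerpt concrete, here is a minimal Swift sketch of the fetch-and-display flow it describes: ask PHImageManager for a PHLivePhoto from a library asset, then hand it to a PHLivePhotoView. The view controller shell, the subtype predicate, and the playback style are my own illustrative assumptions, and the code assumes the app has already been granted photo library access.

```swift
import Photos
import PhotosUI
import UIKit

// A minimal sketch, assuming photo library authorization is already granted.
// Fetches the first Live Photo in the library via PHImageManager and plays
// it back in a PHLivePhotoView, as described in the excerpt above.
final class LivePhotoViewController: UIViewController {

    private let livePhotoView = PHLivePhotoView()

    override func viewDidLoad() {
        super.viewDidLoad()
        livePhotoView.frame = view.bounds
        view.addSubview(livePhotoView)
        loadFirstLivePhoto()
    }

    private func loadFirstLivePhoto() {
        // Restrict the fetch to assets that were captured as Live Photos.
        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "(mediaSubtypes & %d) != 0",
                                        PHAssetMediaSubtype.photoLive.rawValue)
        guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else {
            return
        }

        // PHImageManager vends the PHLivePhoto object asynchronously.
        _ = PHImageManager.default().requestLivePhoto(for: asset,
                                                      targetSize: view.bounds.size,
                                                      contentMode: .aspectFit,
                                                      options: nil) { [weak self] livePhoto, _ in
            DispatchQueue.main.async {
                guard let self = self, let livePhoto = livePhoto else { return }
                // Hand the Live Photo to the PhotosUI view and start full playback.
                self.livePhotoView.livePhoto = livePhoto
                self.livePhotoView.startPlayback(with: .full)
            }
        }
    }
}
```

Passing .full to startPlayback(with:) plays the whole clip with sound; .hint plays the short preview you see when you press into a Live Photo.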

Most people will likely use these as GIFs or short video clips…at least initially. But once developers get access to all of the data that’s packaged with a Live Photo, there is nothing stopping us from doing complex editing with it. We should be able to freeze part of a frame while leaving the rest in motion, add looping and crossfading, and essentially use a quick Live Photo to mimic what full-featured desktop software needed four hours to pull off. At least that’s my hope, and we will know very soon whether all of that is possible.
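The raw material for that kind of editing is the movie packaged inside each Live Photo. Here is a hedged sketch of one way to get at it using the PHAssetResource route mentioned in the excerpt: copy the paired video out to a file, where AVFoundation or any other tool could do the masking, looping, and crossfading. The function name and output path are hypothetical.

```swift
import Photos

// A sketch, not a full implementation: uses PHAssetResource (mentioned in the
// excerpt above) to copy a Live Photo's paired video out to a temporary file.
// The function name and output path are hypothetical.
func exportPairedVideo(from asset: PHAsset, completion: @escaping (URL?) -> Void) {
    // A Live Photo asset carries a still image resource plus a paired video resource.
    let resources = PHAssetResource.assetResources(for: asset)
    guard let videoResource = resources.first(where: { $0.type == .pairedVideo }) else {
        completion(nil)
        return
    }

    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent("LivePhotoPairedVideo.mov")
    try? FileManager.default.removeItem(at: destination) // writeData fails if the file exists

    PHAssetResourceManager.default().writeData(for: videoResource,
                                               toFile: destination,
                                               options: nil) { error in
        completion(error == nil ? destination : nil)
    }
}
```

From there, freezing a region would be a matter of compositing a still frame over the masked video, which is exactly the layer-masking trick Photoshop users do by hand today.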

Here’s why I’m excited about Apple’s Live Photos. I love still photos because the viewer’s imagination transports them into the scene. But as a photographer, if I want to tell my viewer more about the sound and mood of a place, those extra frames can make that happen. Every photographer knows they need to master video to really tell their stories well, and this in-between medium is a great stepping stone that shortens the learning curve. Once again it’s Apple pushing a technology forward.

Apple has a history of taking something complex and making it look dead simple. Making it easy for savvy photographers to plan and shoot Live Photos intentionally will make me reach for my iPhone more frequently as a photography tool. By planning which elements in the frame stay still and which will move, I can create a Live Photo on an iPhone 6S or 6S Plus with 1/100th of the effort it takes to make a cinemagraph with a professional camera and editing software. And with Apple’s new developer APIs for Live Photos already being adopted by Facebook, it won’t be long before Live Photos can be distributed and viewed anywhere, on any device.

Bonus! Besides cutting out the hours of effort I was planning to keep investing in cinemagraphs, I won’t be the weird guy setting up his tripod and camera at the farmers market.

Kevin Wenning

Project Manager

MartianCraft is a US-based mobile software development agency. For nearly two decades, we have been building world-class and award-winning mobile apps for all types of businesses. We would love to create a custom software solution that meets your specific needs. Let's get in touch.