Understanding the Computational Photography in the Alice Camera

The Alice Camera targets a different type of creative and camera user.

“By applying computational photography to the Micro Four Thirds system we want to give people image quality more like that of a full-frame camera, at the price, size, and weight of an MFT camera…” states Liam Donovan, CTO of Alice Camera, in an interview with us. In my 12 years of journalism, I can say that’s a very big claim, and for the future of the camera market, I’m hoping Alice Camera can hold true to it. Alice Camera is being funded on Indiegogo and promises to be completely different.

You talk about how the AI chip will help with things like color, focusing, and more. Can we assume that the focusing involves things like faces and animals? Or is it more advanced, able to track moving subjects in different ways? Can you also talk a bit more about color, please?

Liam: Autofocusing is a complicated process that consists of two stages: the first is deciding which part of the frame to focus on, and the second is deciding how far to move the lens and in which direction. Most modern mirrorless cameras use relatively primitive AI for the first stage, identifying certain features in the frame, usually human eyes, and then use either contrast- or phase-detection techniques for the second.
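To make the second stage concrete, here is a toy Python sketch of conventional contrast-detection autofocus: sweep candidate lens positions and keep the one whose frame looks sharpest. The synthetic scene, blur model, and sharpness metric are all invented for illustration; this is not Alice's pipeline.

```python
import numpy as np

def sharpness(image):
    # Gradient-energy sharpness metric: in-focus frames have stronger edges
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def synthetic_frame(lens_pos, true_focus=0.6, size=64):
    # Toy scene: a checkerboard whose contrast fades as defocus grows
    yy, xx = np.mgrid[0:size, 0:size]
    scene = ((xx // 8 + yy // 8) % 2).astype(float)
    defocus = abs(lens_pos - true_focus)  # 0 = perfectly in focus
    return scene * (1.0 - defocus) + scene.mean() * defocus

def contrast_af(lo=0.0, hi=1.0, steps=21):
    # Sweep lens positions and return the one giving the sharpest frame
    positions = np.linspace(lo, hi, steps)
    scores = [sharpness(synthetic_frame(p)) for p in positions]
    return float(positions[int(np.argmax(scores))])

best = contrast_af()  # lands on the true focus position of 0.6
```

Alice's end-to-end learned approach, described below, replaces both this sweep and the region-selection step with a network trained directly on correctly focused images.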

With Alice, we are using end-to-end trained AI techniques to do both parts of autofocusing. Our algorithms learn, from very large numbers of correctly focused images, how to find which part of the image should be in focus and how to adjust the lens to bring that part of the image into focus. We don’t program it explicitly to focus on eyes or faces; we train it to do so when appropriate by showing it lots of images where the eyes are in focus. With these techniques, the camera is able to learn how to focus a little more like the way a human learns to focus: by looking at the image directly and comparing it with prior knowledge and experience of what a focused image should look like.

With color, we are using AI techniques to replicate the transformations performed by a skilled human retoucher automatically in real-time on the camera. These algorithms learn how to make local, context-aware adjustments to the image from large numbers of before/after pairs of processed images, and produce the sort of effects that have previously only been possible with skilled use of software like Lightroom.

For photographers, you can think of it like automatic color-aware dodging and burning. For filmmakers, you can think of it almost like a locally adaptive LUT applied to the raw video on-camera before compression. While these algorithms are very powerful, we are deliberately preventing them from fabricating details or manipulating reality as some AI techniques can, and we are making sure all our algorithms follow our four guidelines: to be natural, conservative, expressive, and controllable.
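As a rough illustration of what "automatic dodging and burning" means, the sketch below pushes each pixel toward a mid-tone, with the gain driven by the average luminance of the pixel's local neighborhood rather than the pixel alone. The box-filter neighborhood, target level, and strength are assumptions made for this example; Alice's actual learned algorithms are not public.

```python
import numpy as np

def box_mean(img, k=4):
    # Local mean via an integral image (summed-area table)
    n = 2 * k + 1
    pad = np.pad(img, k, mode="edge")
    c = np.pad(pad, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    return (c[n:, n:] - c[:-n, n:] - c[n:, :-n] + c[:-n, :-n]) / (n * n)

def dodge_burn(lum, k=4, target=0.5, strength=0.6):
    # Brighten regions darker than the target and darken brighter ones,
    # adapting to the local average so adjustments are context-aware
    local = box_mean(lum, k)
    gain = (target / np.clip(local, 1e-4, None)) ** strength
    return np.clip(lum * gain, 0.0, 1.0)

# Synthetic luminance frame: dark left half, bright right half
lum = np.full((32, 32), 0.8)
lum[:, :16] = 0.2
out = dodge_burn(lum)  # shadows lifted, highlights pulled down
```

A true "locally adaptive LUT" would apply a different full tone curve per region, but the local-gain idea is the same.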

 

Can you talk to us about the build quality of Alice Camera? Is there weather resistance built-in?

Liam: The camera is made from CNC-machined aluminum and feels very solid in the hand. While it will not be fully weather-sealed, it will have some protection from water and weather damage.

 

Why did you choose Micro Four Thirds? Is the system built more for video or stills?

Liam: Micro Four Thirds sits at something of a sweet spot for what we are trying to do. MFT sensors are much larger than those found in smartphones and produce excellent image quality, certainly enough to be considered professional; however, they are much smaller and cheaper than high-end full-frame sensors. Crucially, MFT lenses are a lot smaller and lighter, and there is a huge variety of them available. By applying computational photography to the Micro Four Thirds system we want to give people image quality more like that of a full-frame camera, at the price, size, and weight of an MFT camera, and with the modern experience and connectivity of a smartphone.

While the sensor’s specs do lean towards video, it is a hybrid camera capable of excellent straight-out-of-camera (SOOC) stills perfectly suited for sharing on Instagram. In fact, I would say that Alice is a camera built for social media, and particularly for those who make a living on social media, rather than a dedicated video or stills camera.

 

Can you talk to us about the sensor technology, please? What can we expect? What will it be able to do in terms of dynamic range and video recording specs? Why the choice of an 11MP Sony sensor?

Liam: The sensor we are using is very similar to those used in the Panasonic GH5S and the Blackmagic Pocket Cinema Camera 4K. While it is 2-3 years old, it uses technology that is still considered advanced, including back-side illumination, dual native ISO, and quad-Bayer HDR. The sensor excels in low-light and high-dynamic-range scenes, which are exactly the kinds of scenes that can most benefit from computational photography. The low resolution was chosen deliberately, partly to improve 4K video quality, but also to reduce the cost of our computational photography pipeline, which scales directly with resolution.
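Quad-Bayer HDR works by exposing pixels within each same-color group for different lengths of time and merging the results. A minimal sketch of that merge idea, using an assumed 8:1 exposure ratio and a toy clipping threshold (not Sony's or Alice's actual parameters):

```python
import numpy as np

def merge_quad_hdr(long_exp, short_exp, ratio=8.0, clip=0.95):
    # Trust the cleaner long exposure where it hasn't clipped;
    # elsewhere, recover highlights from the short exposure scaled up
    return np.where(long_exp < clip, long_exp, short_exp * ratio)

# Scene radiance spanning more range than one exposure can capture
radiance = np.array([0.05, 0.5, 4.0])
long_exp = np.clip(radiance, 0.0, 1.0)         # highlight clips at 1.0
short_exp = np.clip(radiance / 8.0, 0.0, 1.0)  # 1/8 the exposure time
hdr = merge_quad_hdr(long_exp, short_exp)      # recovers [0.05, 0.5, 4.0]
```

Real quad-Bayer pipelines also have to demosaic the 2x2 color groups and blend smoothly around the clipping point, which this sketch omits.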

 

Why no hot-shoe on Alice Camera? What if I wanted to use it with flashes?

Liam: Because the sensor we are using excels in low light and our computational photography pipeline significantly improves low-light performance, many of the use cases for flash no longer apply, and we don’t believe there is a huge overlap between users of dedicated flash units and our target audience. We opted to save space, reduce complexity, and streamline the aesthetics of the camera’s top plate by building the cold shoe directly into the CNC-machined aluminum casing.

 

Will Alice Camera be able to do things like multiple exposures, timelapse, etc? Or maybe a third-party app developer program?

Liam: The camera hardware is more than capable of effects like multiple exposures and time-lapses; the bottleneck currently is the software. That is one big reason we are committing to open-sourcing as much of the code running on the camera as we can: to enable and encourage our community to customize and modify their own cameras and share their modifications with others. It is inevitable that we as a camera company will not be able to satisfy everyone’s desires for features and special effects, but by giving people the tools to make changes and additions themselves, we hope the camera will be able to grow beyond what we are able to do with it ourselves. Third-party app development is another exciting possibility to achieve this, and it is something we are actively working on.

 

Will Alice Camera be able to work with apps on your phone at all?

Liam: You will be able to save photos and videos taken with Alice directly to your phone’s gallery through our app, and you will then be able to use any other apps you have on your phone to post-process and share them. The interface between the camera and the app will be open, so app developers will be able to integrate Alice into their apps more closely if they wish, but that is of course up to them! It’s worth mentioning that you will be able to use Alice as a USB webcam for video conferencing straight out of the box.

Chris Gampat

Chris Gampat is the Editor in Chief, Founder, and Publisher of the Phoblographer. He also likes pizza.