Adobe’s New Computational iPhone Camera App Looks Incredible
Alongside new Photoshop and Lightroom updates, Adobe recently unveiled Project Indigo, a new computational photography camera app.
As spotted by DPReview, Adobe quietly introduced Project Indigo last week with a blog post on its dedicated research website. The app promises to leverage the significant advances made in computational photography over the past decade-plus to help mobile photographers capture better-quality images.
In their article, Adobe’s Marc Levoy (Adobe Fellow) and Florian Kainz (Senior Scientist) explain that, despite giant leaps forward in smartphone image sensors and optics, many hardcore photographers still lament phone photos: the images cannot compete with the much larger sensors and lenses in dedicated cameras, they carry an overly processed “smartphone look,” and phone camera apps generally don’t give dedicated photographers all the manual controls they desire.
Enter Project Indigo. The new app aims to “address some of these gaps” that photographers experience with their smartphone cameras. Project Indigo, which is available right now on the Apple App Store, offers full manual controls, a more natural “SLR-like” look to photos, and the best image quality that current computational photography technology can deliver, in both JPEG and RAW formats. Levoy and Kainz add that their new app also introduces “some new photographic experiences not available in other camera apps.”
Much Better Low-Light Mobile Photography
Starting with the computational photography side of Project Indigo, Adobe’s researchers use intelligent processing to significantly improve smartphone image quality. In the low-light examples below, the first photo was captured as a single image on an iPhone under 1/10 lux of illumination. The second is a handheld shot captured by Project Indigo. The app captured 32 frames in quick succession and merged them, so each individual frame can push the sensor less hard, and combining the frames averages away noise while still reaching an appropriate final exposure. It is effectively like keeping the shutter open for much longer without having to hold the camera steady the whole time. And yes, this also works with RAW photo output.


“What’s different about computational photography using Indigo? First, we under-expose more strongly than most cameras,” the researchers write. “Second, we capture, align, and combine more frames when producing each photo — up to 32 frames as in the example above. This means that our photos have fewer blown-out highlights and less noise in the shadows. Taking a photo with our app may require slightly more patience after pressing the shutter button than you’re used to, but after a few seconds you’ll be rewarded with a better picture.”
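To make the merging idea concrete, here is a minimal Python sketch of the general principle described above: averaging many deliberately under-exposed frames and then applying digital gain. It is purely illustrative, not Adobe’s actual pipeline, and it assumes the burst frames are already aligned.

```python
import numpy as np

def merge_burst(frames, gain=4.0):
    """Average a burst of aligned, deliberately under-exposed frames,
    then apply digital gain to reach the intended final brightness.
    Averaging N frames reduces random sensor noise by roughly sqrt(N)."""
    merged = np.stack(frames, axis=0).mean(axis=0)
    return np.clip(merged * gain, 0.0, 1.0)

# Simulate 32 noisy, under-exposed captures of the same dim scene.
rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0.02, 0.2, 256), (256, 1))      # "ground truth" gradient
burst = [np.clip(scene + rng.normal(0.0, 0.02, scene.shape), 0, 1)
         for _ in range(32)]

reference = np.clip(scene * 4.0, 0, 1)
single = np.clip(burst[0] * 4.0, 0, 1)       # one frame simply gained up
merged = merge_burst(burst, gain=4.0)
print("single-frame noise:", float((single - reference).std()))
print("merged-burst noise:", float((merged - reference).std()))
```

Running the sketch shows the merged result carrying roughly the square root of 32 times less random noise than a single gained-up frame, which is the core trade-off the researchers describe: a few extra seconds of capture and processing in exchange for cleaner shadows and fewer blown highlights.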
Project Indigo Promises a Natural Look to Your Photos
As for the “look” of smartphone photos that many photographers dislike (something some apps have worked hard to overcome), the root of the problem is that many mobile camera apps push computational processing to excess. While high-dynamic-range capture with clever tone mapping can expand the dynamic range that mobile shooters can record, it can also produce distorted, unnatural-looking images.
Adobe has already made considerable strides toward more natural-looking HDR photos with its impressive Adaptive Color profile, which uses subtle, semantically aware, mask-based adjustments to expand tonal range without making photos look strange. Project Indigo builds on this work but can achieve even better results because, as a camera app, it knows the exact capture settings and can work with them in real time.
“Our look is similar to Adobe’s Adaptive Color profile, making our camera app naturally compatible with Adobe Camera Raw and Lightroom. That said, we know which camera, exposure time, and ISO was used when capturing your photo, so our look can be more faithful than Adobe’s profile to the brightness of the scene before you,” Adobe writes.
While Adobe has provided many Project Indigo sample photos that can be properly displayed in this article, many more are best viewed in HDR. To see those, visit Adobe’s Project Indigo Lightroom album. Adobe recommends viewing the album on an HDR-compatible display using Google Chrome; it works in some other browsers, but may not display correctly in Safari.


Using Natural Camera Shake to Capture Sharper High-Zoom Photos
Many modern smartphones, like the iPhone 15 and 16 Pro and Pro Max models, feature several high-quality rear cameras with different focal lengths (fields of view). For zoom levels between and beyond those native focal lengths, however, phones rely on digital crops, meaning they use a smaller portion of the image sensor and then, in some cases, digitally upscale the image.
In Project Indigo, when the user pinches to zoom, the app uses multi-frame super-resolution, which promises to “restore much of the image quality lost by digital scaling.” It works similarly to a pixel-shift mode on a dedicated camera, taking advantage of natural hand movement to capture the same scene from a series of slightly different perspectives. The app then combines these different frames into one larger, sharper one that features more detail than a single photo. And unlike AI-based super-resolution, the extra detail is real — pulled from actual images.
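As a rough illustration of how this kind of multi-frame super-resolution can work in principle, here is a simplified shift-and-add sketch in Python. It is not Adobe’s algorithm, and it assumes the sub-pixel shifts introduced by hand shake have already been estimated by an alignment step (here they are simply supplied).

```python
import numpy as np

def shift_and_add(frames, shifts, scale=2):
    """Deposit each low-resolution pixel onto a `scale`-times finer grid
    at its shifted position, then average overlapping samples.
    `shifts` are each frame's estimated sub-pixel (dy, dx) offsets,
    which in a real app would come from an alignment step."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    count = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        hy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        hx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (hy, hx), frame)
        np.add.at(count, (hy, hx), 1)
    count[count == 0] = 1            # avoid division by zero in unsampled cells
    return acc / count

# Four captures whose half-pixel hand-shake offsets fill a 2x finer grid.
shifts = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]
frames = [np.random.rand(64, 64) for _ in shifts]     # stand-in captures
print(shift_and_add(frames, shifts, scale=2).shape)   # (128, 128)
```

Because the half-pixel offsets land samples in between the original pixel centers, the combined grid contains genuinely new measurements rather than interpolated or hallucinated detail, which is the point of the “the extra detail is real” claim.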


Professional Camera Controls
Project Indigo’s third key objective is to give mobile photographers the professional controls they get on their dedicated camera systems. The app includes pro controls built “from the ground up for a natively computational camera,” including control over focus, shutter speed, ISO, exposure compensation, and white balance. And because Project Indigo relies heavily on burst photography for some of its features, it also offers fine-grained control over the number of frames per burst.



It also includes a “Long Exposure” button that swaps the app’s usual frame-merging method for one that produces the same dreamy, smooth appearance as a long-exposure shot on a dedicated camera. This is great for photographing moving water, for example, and can also be used for creative lighting effects and traditional single-frame night photography.
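As a loose sketch of that difference (again, not Adobe’s implementation), a long-exposure look can be approximated by plainly averaging the aligned frames instead of merging them in a way that rejects motion, so moving content smears while static areas stay clean.

```python
import numpy as np

def long_exposure_look(frames):
    """Plain average of aligned frames: static regions stay sharp,
    while anything that moves between frames blurs into smooth
    streaks, mimicking a long shutter on a dedicated camera."""
    return np.stack(frames, axis=0).mean(axis=0)
```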
More Tech and What’s Next?
The complete Project Indigo article offers much more in-depth technical information, including the app’s image processing pipeline as it relates to photographic formats, demosaicing, and real-time image editing. It’s an excellent read for photo technology enthusiasts.
Project Indigo will be continually updated and may serve as a testbed for Adobe technologies in development for other apps.
“This is the beginning of a journey for Adobe — towards an integrated mobile camera and editing experience that takes advantage of the latest advances in computational photography and AI,” Adobe writes. “Our hope is that Indigo will appeal to casual mobile photographers who want a natural SLR-like look for their photos, including when viewed on large screens; to advanced photographers who want manual control and the highest possible image quality; and to anyone — casual or serious — who enjoys playing with new photographic experiences.”


Pricing and Availability
Project Indigo is available now for free on the Apple App Store. It works on all Pro and Pro Max iPhones from the iPhone 12 Pro onward and on all non-Pro models from the iPhone 14 onward. Since it is still a work in progress, it does not require an Adobe account to use. An Android version and in-app presets are in development, and the team says it is also working on exposure and focus bracketing, plus new multi-frame modes.
Users are encouraged to download Project Indigo and try it for themselves. Adobe wants their feedback, too.
Image credits: Project Indigo is an experimental camera app developed by Adobe’s Nextcam team. Contributors were Jiawen (Kevin) Chen, Zhoutong Zhang, Yuting Yang, Richard Case, Shumian Xin, Ke Wang, Eric Kee, Adam Pikielny, Ilya Chugunov, Cecilia Zhang, Zhihao Xia, Louise Huang, Lars Jebe, Haiting Lin, Lantao Yu, Florian Kainz, Mohammad Haque, Boris Ajdin, and Marc Levoy. The photographs in this blog are by Marc Levoy, Florian Kainz, Sophia Kainz, Adam Pikielny, and Lars Jebe.