There is a Japanese word, komorebi, for the rays and dappled shadows created when the sun shines through the leaves of trees. As I walk my dog around my leafy neighborhood in Washington, D.C., komorebi is what most often catches my eye, especially in these autumn weeks, when the dark green of the summer canopy begins to thin and turn golden yellow. As the sun sets and shadows stretch across the edges of the steep valley near my apartment, the leaves create a shifting pattern of warm and cool tones. I try to capture these apparitions with my iPhone’s camera, but I’m always disappointed with the results. The device’s automatic image processing treats contrast as a problem to be solved, aggressively darkening highlights and brightening shadows into a bland flatness. Little of the moody atmosphere I see in real life survives in the images.
That all changed recently, when I downloaded a new camera app. Halide, released in 2017, is an elegant program that can replace your phone’s default camera. It mimics the controls of a DSLR, letting users manually adjust settings such as the focal length. Halide is a complex app, geared toward experienced photographers (the name comes from the chemical compounds used in photographic film), but a setting added in August, called Process Zero, makes it much simpler. With the mode turned on, the camera does as little processing as possible, avoiding artificial-intelligence optimizations and other drastic photo edits. (Basic tasks like setting the white balance and correcting lens distortion are still performed.) The iPhone typically merges many separate exposures into a single composite; with Halide, you get one digital image that preserves the richness and contrast of what is actually in front of you. Shadows survive. Highlights may blow out, but the camera doesn’t conjure details the eye wouldn’t necessarily register, such as clouds in a bright sky, the way the iPhone does. And where Apple’s automatic editing irreversibly smooths away the digital grain of dim images, Halide keeps it, producing a more textured picture. By eschewing the uncanny perfection that characterizes iPhone photography, Process Zero has made me enjoy taking photos with my phone again, because it no longer feels as if I’m constantly fighting algorithmic edits that I can neither control nor predict. “We’re going to keep this as dumb as possible,” Ben Sandofsky, Halide’s co-creator, told me of the program’s ethos.
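(Halide has not published the code behind Process Zero, and nothing below comes from the app itself. As a rough sketch of what “as little processing as possible” can mean on iOS, though, here is one way a third-party camera app might ask Apple’s AVFoundation framework for a single, minimally processed frame instead of the system’s multi-frame composite. The class name, error handling, and fallback behavior are illustrative assumptions, not Halide’s implementation.)

```swift
import AVFoundation

// A minimal sketch (not Halide's code) of a capture path that sidesteps the
// system's multi-frame, computationally "optimized" photo pipeline.
final class MinimalCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    // Requires camera permission (NSCameraUsageDescription) to run on a device.
    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo   // RAW capture requires the .photo preset
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else {
            throw NSError(domain: "MinimalCapture", code: 1, userInfo: nil)
        }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureSingleFrame() {
        let settings: AVCapturePhotoSettings
        if let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first {
            // Bayer RAW skips multi-frame fusion and aggressive tone mapping;
            // demosaicking and white balance happen later, when the RAW is developed.
            settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        } else {
            // Fallback: a processed capture that asks for the least aggressive
            // quality tier the API offers.
            settings = AVCapturePhotoSettings()
            settings.photoQualityPrioritization = .speed
        }
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Delegate callback: one frame comes back, not a composite.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        print("Captured a single \(data.count)-byte frame")
    }
}
```

An app taking this route would then have to “develop” the raw sensor data itself, which is roughly where the basics mentioned above, white balance and lens-distortion correction, would come back in, on the developer’s terms rather than the system’s.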
Process Zero has proved to be a hit for Halide, which is made by Lux, a parent company that develops several niche, high-end camera apps. (Halide costs about $60, though you can also pay monthly or annually.) Since the feature launched, Halide has been downloaded more than 200,000 times, and the app has hundreds of thousands of monthly active users. “It ended up being incredibly explosive,” Sandofsky said. Alongside trends like the dumbphone and Fujifilm’s X100 digital camera, Process Zero’s popularity is another sign of the burgeoning demand for technology that resists AI’s aggressive decision-making. These new tools satisfy an urge to fall back on technology that lets us exercise our own preferences rather than imposing its own defaults.
Sandofsky, who is 42 and lives in Manhattan, always wanted to pursue visual art, but his father encouraged him to get a degree in computer science. He eventually landed a job at Twitter, in San Francisco, in 2009, and a windfall from the company’s I.P.O., in 2013, gave him the financial freedom to pursue passion projects. Apple’s iPhone cameras were getting better and better; they were “magical devices” that let you “push a button and you get a good picture,” Sandofsky said. In his telling, that convenience posed an existential question for traditional photography: “Is there a place for a manual camera in this world?” He partnered with Sebastiaan de With, a former Apple designer and amateur photographer, to build Halide, which hands users back the ability to decide for themselves what counts as “good.” For most of photography’s two-century history, Sandofsky said, images were “never surreal.” Sometimes a picture came out unexpectedly blurry, or focused on an unintended area; its colors were tinted by the chemical composition of different film stocks. Apple’s camera, by contrast, seems to bring everything into focus at once, saturating every color and exposing every part of the frame evenly. “Modern computational photography is almost in this uncanny valley,” Sandofsky said. The all-over quality of Apple’s images, for example, can be disorienting to viewers: if nothing in the picture is visually emphasized, we don’t know what to look at.
The advent of optimized digital imagery has made me realize that the “most realistic” image may not be the most appealing one, and that I prefer photos that look like photos. The composer Brian Eno wrote a line in his book A Year with Swollen Appendices that I often recall: “CD distortion, the jitter of digital video, the crap sound of 8-bit: all of these will be cherished and emulated as soon as they can be avoided.” The iPhone can scrub digital photos of their aesthetic flaws, but what if those flaws were actually features?
“The camera sees more than the photographer,” the photographer Benjamin Swett writes in The Picture Not Taken, his new book of memoiristic essays. Photos, in other words, record moods and details that may not be apparent until after the fact. You do your best to frame a shot, but you can still be surprised by what it turns out to contain. Accidental flaws can become the whole point of a photograph, unless those flaws are erased before you ever see them. With Halide, my phone photos can surprise me again. The process reminds me of tinkering with medium-format film cameras in high school, back when cell-phone photos were little more than muddy pixels. In college, I joined a photojournalism program, and during a field trip led by the photojournalist Gary Knight I learned a lesson about shooting through the viewfinder: the digital S.L.R. I was using could fire off more than a dozen frames in an instant, but what mattered was thinking carefully about the scene, the composition, and the camera settings before taking the picture. All the editing and enhancement in the world can’t fix a fundamentally bad or boring photo. Knowing that Halide won’t gloss over my mistakes forces me to slow down and dwell on the creative process a beat longer; it makes me think harder about what I’m actually seeing.
Apple’s iOS 18 lets you edit your phone’s lock screen so that a single tap opens Halide instead of the default camera app. (Halide’s logo appears in the bottom-right corner in place of the regular camera icon; Apple’s default camera is still a swipe away.) The Process Zero camera is now the only one I use. The interface closely resembles the iPhone’s own camera, with a large preview frame and a button to take the photo. The app does little on the back end, but you can dig deeper into the settings, or edit photos after the fact, if you want to. Just because a photo was taken without AI doesn’t mean you can’t apply it later.
The look of my camera roll has changed. There is less repetition, fewer bursts of nearly identical frames with slight variations. The compositions are less static and symmetrical, the colors funkier. One Halide snapshot shows leaves casting shadows on a tree trunk that fills most of the frame. Apple’s camera app tends toward warmer tones, but this image reads blue and cool, and the background behind the tree stays dim rather than searingly bright. I prefer it that way. The visual character of a particular scene remains, something to be reckoned with on its own terms. It doesn’t look like anything else, and it shouldn’t. ♦