iPhone Infrared: Why One Old Phone Sees What the New Ones Don’t

I shot these iPhone infrared photos on the Cape a couple of summers ago, and to this day they still stop me in my tracks. Not because they’re technically perfect, not because they’re groundbreaking, but because they feel different. They feel like they’re seeing something just outside the visible world, something we sense more than we see. What still baffles me—and I mean genuinely scratches at the back of my brain—is that the device responsible for those images, the iPhone 15 Pro Max, consistently outperforms both my newer iPhone 16 Pro Max and iPhone 17 Pro Max when it comes to infrared photography. Same infrared filter. Same adapter. Same workflow. Same photographer. Different results. Noticeably different results.

The 15 Pro Max files are sharper, cleaner, more defined edge to edge. The tonal separation feels richer. There’s a crispness to foliage, a glow in the highlights, and a subtlety in the shadows that just holds together better. The newer phones? Still good, still usable, but something’s off. Softer edges. Slightly muddier transitions. A kind of computational hesitation, as if the camera is second-guessing what it’s seeing. And that’s the part that fascinates me. Because in theory, the newer hardware should win. Better sensors, smarter processing, more advanced pipelines. But infrared isn’t theory. Infrared is a bit of a hack. It lives in the margins of what the camera was designed to do, and sometimes those margins shift in ways we don’t fully understand.

My gut tells me it has something to do with the IR-cut filter, the so-called hot mirror that sits in front of the sensor, and how aggressively each generation blocks infrared there. Apple is constantly refining how much infrared contamination gets through in normal photography, because for everyday shooting, infrared is noise. It messes with color accuracy, skin tones, white balance. So they engineer against it. But for those of us intentionally trying to capture infrared, those same improvements can work against us. The 15 Pro Max may simply allow a bit more IR light to slip through, or process it with less aggressive correction. The newer phones, in their quest for perfection, might actually be overcorrecting, stripping away some of the very signal we’re trying to capture. That’s speculation, of course. I don’t have access to Apple’s internal design notes. But after thousands of frames across multiple devices, the pattern is hard to ignore.

What I love about iPhone infrared, beyond the quirks and inconsistencies, is how simple the process remains. This isn’t some high-barrier, gear-heavy niche that requires a dedicated converted camera and a bag full of lenses. You buy an infrared filter, you attach it to your phone with a light-tight adapter—emphasis on light-tight, because any leak will ruin the shot—and you go out and shoot. That’s it. No menus buried three levels deep, no custom firmware, no ritual. Just point, tap, shoot. And then, back in the Photos app, you convert to black and white, maybe tweak the exposure, dial in contrast, nudge the highlights, lift or crush the shadows depending on your taste, and you’re done. It’s almost offensively simple for something that produces such otherworldly results.
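If you’d rather do that last editing step outside the Photos app, the same recipe is easy to script. Here’s a minimal sketch using the Pillow library; the contrast and brightness values are illustrative starting points I made up, not settings from my actual workflow, and `ir_shot.jpg` is a hypothetical filename.

```python
from PIL import Image, ImageEnhance, ImageOps

def ir_black_and_white(img, contrast=1.3, brightness=1.05):
    """Convert an infrared capture to high-contrast black and white.

    Mirrors the in-app edit described above: a grayscale conversion,
    then a contrast boost and a slight exposure lift. Tune the two
    parameters to taste; these defaults are just a starting point.
    """
    mono = ImageOps.grayscale(img)                         # drop the false color
    mono = ImageEnhance.Contrast(mono).enhance(contrast)   # deepen skies, punch up foliage
    mono = ImageEnhance.Brightness(mono).enhance(brightness)  # gentle exposure lift
    return mono

# Usage (hypothetical file):
# ir_black_and_white(Image.open("ir_shot.jpg")).save("ir_bw.jpg")
```

That’s the whole edit, which is rather the point: the conversion is three operations, and everything interesting happened at capture time.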

The Cape, in particular, is made for this kind of work. All that summer foliage, the grasses, the scrub pines, the hydrangeas, they light up in infrared like they’ve been dusted with snow, the classic Wood effect. Skies go dark, almost black, especially if you’re shooting under strong midday sun, and clouds pop with this dramatic, high-contrast presence. Water takes on a glassy, sometimes inky quality. And suddenly a place you’ve seen a thousand times, a place you might even take for granted, becomes unfamiliar again. That’s the real magic. Not the filter, not the phone, not even the technique, but the shift in seeing. Infrared forces you to slow down and re-evaluate what’s in front of you, because the feedback loop between what your eyes see and what the camera records is broken in the best possible way.

There’s also something refreshing about embracing a part of the iPhone that Apple isn’t actively promoting. This isn’t Night mode, or Portrait mode, or some headline feature from a keynote. This is fringe use. This is pushing the device slightly off-label and seeing what happens. And in doing so, you’re reminded that these cameras, as advanced as they are, still have personality. They’re not perfectly neutral recording devices. They interpret, they prioritize, they suppress, they enhance. And sometimes, as I’ve found with the 15 Pro Max, an older interpretation can be more interesting, more pleasing, more aligned with what you’re trying to do than the latest and greatest.

So yes, I’ve continued to shoot infrared on the 16 and 17 Pro Max. I’m not abandoning them. But when I really want that look, that particular rendering that first hooked me, I keep coming back to the 15. It’s a reminder that progress in technology isn’t always linear from a creative standpoint. Newer doesn’t automatically mean better for every use case. Sometimes it just means different. And sometimes different is enough to change the entire feel of an image.

If you’ve never tried iPhone infrared, do yourself a favor and give it a shot. Not because it’s trendy or because it will make your work stand out on social, but because it will shake up your seeing. It will pull you out of your habits, out of your default way of framing and exposing and interpreting the world. And at the end of the day, that’s what this whole thing is about. Not chasing specs, not chasing perfection, but chasing that moment where you look at a photo and think, “I’ve never seen it quite like that before.”

Clic.

Jack.

Jack Hollingsworth
Photographer