I was fortunate to test the iPhone XS Max with the FiLMiC Pro team in Barcelona. Here’s an iPhone XS Max low light video test and my initial thoughts.

The iPhone XS Max, iPhone XS and iPhone XR are truly fascinating devices. Featuring intelligent and dynamic digital signal processing, the latest generation iPhones can generate images that appear to be beyond the ability of the optics and sensor alone. Of course, that’s exactly the beauty and promise of computational imaging.

I don’t have any information about the camera or image processing beyond what is public. What follows are my own assumptions based on the behavior of the camera and observation of the recorded images.

If any Apple engineer happens to read this and wants to reach out personally, please do. You’ve got my head spinning trying to figure out what kind of voodoo you’re pulling off in this phone.

Of course, I know that conversation is not going to happen; commercial secrets are secrets for a reason.

Early iPhone XS Max Low Light Video

On September 21st 2018, I arrived in Barcelona to join the FiLMiC team for some testing and brainstorming. The iPhone XS Max hit the shelves in Apple stores the same day. Of course, the FiLMiC team had pre-ordered one. For the next few days we tested the iPhone XS Max around Barcelona.

I shot the video below handheld on my last night in Barcelona with the FiLMiC team. I hadn’t planned to shoot anything that night, only to enjoy some drinks and good food with friends.

Of course, when I picked up the iPhone XS Max we had all been testing the previous days, I couldn’t put it down. So, I shot what was happening around me.

There are two versions in this video. The first version is color graded using FilmConvert in DaVinci Resolve with some of my own adjustments. In retrospect, I feel the look is a bit heavy-handed, and there are shots that aren’t correctly matched. The ungraded version follows, which shows the original uncorrected video clips as recorded with FiLMiC Pro. This may actually be of more interest and value.

More Than Meets The Eye

While this may have “just” been an “S” year, the significance and impact of Apple’s clear direction towards sophisticated real time computational image processing should not be underestimated. Apple clearly believe that software is the future of the most popular camera in the world. I couldn’t agree more.

With the iPhone XS Max I was able to capture clean video in very low light conditions. Furthermore, it had more color information in the shadows than any previous iPhone I have shot with.

Here’s what I think is going on.

Dynamic Tone Mapping

Judging from the behavior of the camera in bright and normal lighting conditions, it appears that the luminance values of the recorded image are not determined by global gain (ISO) or a fixed gamma transform.

Something else is at play, and it is dynamic. I believe that luminance values are determined according to a combination of variables linked to a real time analysis of the scene. Furthermore, this algorithm may be making localized adjustments to different parts of the image. This would be extremely impressive if true. It may be manipulating color and saturation dynamically as well.

I am not sure how sophisticated this algorithm is. I can only observe the results, and infer the likely underlying mechanics.
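
To make the idea concrete, here is a minimal sketch of what a localized tone mapping pass could look like, assuming a frame of linear luminance values and a simple per-tile gain curve. The tile size and the curve are my own illustrative assumptions, not anything Apple have disclosed.

```swift
import Foundation

// A minimal sketch of localized tone mapping, assuming a frame stored as a
// flat array of linear luminance values in 0...1. The tile size and the gain
// curve are illustrative assumptions, not Apple's actual pipeline.
func localToneMap(frame: [Double], width: Int, height: Int, tileSize: Int = 32) -> [Double] {
    var output = frame
    for tileY in stride(from: 0, to: height, by: tileSize) {
        for tileX in stride(from: 0, to: width, by: tileSize) {
            let rows = tileY..<min(tileY + tileSize, height)
            let cols = tileX..<min(tileX + tileSize, width)

            // Measure the mean luminance of this tile.
            var sum = 0.0
            for y in rows {
                for x in cols {
                    sum += frame[y * width + x]
                }
            }
            let mean = sum / Double(rows.count * cols.count)

            // Darker tiles get a stronger lift; brighter tiles are left alone.
            // A fixed global gamma would apply one exponent everywhere instead.
            let exponent = mean < 0.25 ? 0.6 : (mean < 0.5 ? 0.8 : 1.0)
            for y in rows {
                for x in cols {
                    output[y * width + x] = pow(frame[y * width + x], exponent)
                }
            }
        }
    }
    return output
}
```

A real pipeline would blend between neighbouring tiles to avoid visible seams and would drive the curve from far more than a single mean value, but the principle is the same: the recorded value of a pixel depends on its local context, not only on a global camera setting.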

Noise Reduction

It goes without saying that Apple have implemented noise reduction. Noise reduction usually leaves some very obvious artifacts, yet here the expected artifacts are hard to find. I need to spend more time pixel peeping images captured in more varied conditions.

Whatever combination of spatial and temporal noise reduction is at work, it’s very good, and probably applied quite early in the signal chain. As with dynamic tone mapping, the application of noise reduction may be localized.
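
As a rough illustration of the temporal side, here is a sketch of recursive frame averaging with a simple motion threshold. The blend factor and threshold are assumed numbers of my own; whatever Apple actually do is unknown to me and almost certainly far more sophisticated.

```swift
import Foundation

// A minimal sketch of temporal noise reduction by recursive averaging,
// assuming each frame arrives as a flat array of linear luminance values.
// The blend factor and motion threshold are illustrative assumptions.
struct TemporalDenoiser {
    private var history: [Double]?
    let blend = 0.6              // weight given to the accumulated history
    let motionThreshold = 0.08   // above this difference, treat the pixel as motion

    mutating func process(_ frame: [Double]) -> [Double] {
        guard let previous = history, previous.count == frame.count else {
            history = frame
            return frame
        }
        var output = frame
        for i in frame.indices {
            // Only average pixels that are stable between frames; larger
            // differences are passed through untouched to avoid ghosting.
            if abs(frame[i] - previous[i]) < motionThreshold {
                output[i] = blend * previous[i] + (1 - blend) * frame[i]
            }
        }
        history = output
        return output
    }
}
```

Averaging across frames like this suppresses random noise without softening detail the way a purely spatial filter would, which is one reason the results can look so clean.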

Apple have incorporated a larger sensor with larger photosites. I speculate there could also be architectural innovations at the chip level that help make all of this possible.

Better Imaging for Consumers. Challenges for Professionals

I am an active and vocal proponent of the kind of computational imaging technology Apple have clearly employed, yet the way they have employed it has brought some consequences I didn’t expect.

I expected the target output of any image processing chain (augmented computationally or otherwise) to maintain a relationship between real-world scene values and recorded image values. This has been the target of all photographic technologies and methods since the first latent images were captured.

Instead, Apple have prioritized automatically generating an output that is likely to look good to the average viewer. This comes at the expense of the relationship of recorded image values to real-world scene values.

The consequence is that conventional post production processes which depend on consistent, accurate, scene referred source image information can no longer be applied in the same way.

As the iPhone is a consumer device, of course this makes sense. Most users are not technical, do not take their video into post for color correction and want to achieve the best looking result automatically from the camera. With the type of dynamic tone mapping I am observing, it’s obvious this is exactly what Apple have prioritized.
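
To show why this matters for post production, here is a toy comparison of a fixed transfer function against a scene-dependent one. Both curves are invented for illustration and are not the iPhone’s actual encoding; the point is only that a fixed curve can be inverted later, while a dynamic one cannot be undone without knowing the parameters the camera chose at that moment.

```swift
import Foundation

// A fixed curve is the same for every clip, so post production can invert it
// and recover values with a known relationship to the original scene.
func fixedEncode(_ sceneValue: Double) -> Double {
    pow(sceneValue, 1.0 / 2.2)
}

func fixedDecode(_ recordedValue: Double) -> Double {
    pow(recordedValue, 2.2)
}

// A dynamic curve shifts with the scene analysis, so the same subject can be
// recorded at different values in two clips, and there is no single inverse.
// The dependence on mean scene luminance here is an invented stand-in for
// whatever the real analysis does.
func dynamicEncode(_ sceneValue: Double, sceneMeanLuminance: Double) -> Double {
    let exponent = 1.0 / (1.8 + 0.8 * sceneMeanLuminance)
    return pow(sceneValue, exponent)
}
```

Two clips of the same street corner, one framed against a dark wall and one against a lit shop window, would come out of dynamicEncode with different values for an identical subject, and that is exactly the kind of inconsistency a colorist then has to chase by eye.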

The Next Steps

I will share more insights about the iPhone XS Max, iPhone XS and iPhone XR for filmmakers as testing continues. A color managed workflow is required to neutralize the inconsistencies introduced by Apple’s dynamic tone mapping. You’ll find more information right here on my website, and on my YouTube channel.

iPhone XS Max Low Light Video Still Frames

Check out a few color graded still frames. A few of these shots didn’t make it into the video for various reasons but work well as stills.

Comments
  1. Richard, which of the two versions demonstrated looks more like what you were actually seeing with your own eyes? That kind of additional information would be most helpful.

    • Hi Jeff, definitely the ungraded version is closer, although all the shots needed some technical correction to balance by the scopes before any kind of stylistic grade. I’m now regretting the grade a bit. I should have just gone with a basic color correction, matched everything and left it at that.

  2. Alexander Lüthi

    Great post! What profile were you shooting in? Did you have the noise reduction turned on in FiLMiC Pro? I get way more noise when recording in FiLMiC Pro on my XS…

    • Hi Alex, this was in the FiLMiC flat profile, with no noise reduction turned on in FiLMiC. I kept noise to a minimum just by making sure I wasn’t underexposing in the first place. In low light I always want to protect highlights, so I make sure bright things like street lights, signs etc. are not clipping, and I just let the mid-tones and shadows fall where they fall. For street scenes this usually means I end up around 1/24th or 1/48th sec shutter speed at minimum ISO. This gives minimal noise, and my bright light sources are not over-exposed, retaining detail in and around the lights. The trade-off is that mid-tones and shadows usually get very dark.

      To be honest, I leave them darker than some people would, and I get some criticism for it, but it’s the way I do low light. I’d rather work with the bright spots, look for areas where bright specular light sources reflect in surfaces or windows, or throw pools of light on walls or pavements, and let the rest just be black. For me and the style I like, that approach is exposed just fine. For others it would be considered too dark; they want more shadow and mid-tone detail, but that means increasing ISO, which means noise, plus over-exposing bright light sources, which screams “phone video”. It’s always going to be a compromise until we get much higher dynamic range out of phone cameras.

      A professional approach to these situations, with any camera that can’t see in the dark, is to light night-time exteriors: you bring in big light sources, diffuse them over a large area and raise the ambient light level. This decreases the lighting ratio, the difference in brightness between the shadows in the scene and the brightest areas such as interior-lit windows, signage, street lights etc. However, most of us don’t have access to a truck full of HMI lights to light a night-time intersection like a Hollywood film set.

      • I see. I’m pretty sure I was shooting in flat as well, with ISO under 50, but I still got quite a bit of noise in the shadows when shooting outside in daytime, cloudy… Maybe I was shooting LOG. Well well…
        I totally agree with you regarding sacrificing some detail in the darks and mids in exchange for low noise and a more filmic look.
        I think you have a very interesting point which is worth more discussion: the thing you mention about computational photography. I too feel like the software is fighting against me when I try to tweak the settings to my preferences, which makes me wonder if it’s not such a bad thing to let go of the manual controls. Sure, I would like to be able to shoot in a flat profile to make the colour grading easier, but I can live without it if I get better exposed footage. What I can’t live without, though, is the ability to change the shutter speed. Too fast a shutter speed screams “phone video” just as much as blown-out highlights and noisy shadows, I think. The problem is that the stock camera app doesn’t let me change the shutter speed, so I need to go to third party apps (FiLMiC Pro and Moment are my favourites). But then I’m afraid I don’t get all the photo magic enabled by the A12 chip, like Smart HDR in 24 and 30fps… Is the computational part of the photograph/video done “inside” the sensor, or in the camera app? This leaves me wondering when to use which app…
