The new Kino app recording ProRes Log with a custom preview LUT.

Yes, we’re still talking about shooting video on iPhones. But I also want to talk about digital cinema shooting in general, in a world where top camera makers are battling to give filmmakers everything we want in a small, affordable package. How does the DV Rebel spirit — born of camcorders and skateboard dollies — live on in a time of purpose-built digital cinema cameras that fit in your hand?

For me, it’s meant keeping my rigs small and manageable. I love gimbals and drones and lidar focus rigs, but I’m happiest when my whole rig — tripod and all — can be picked up and moved with one hand. We suffered through so many years of too little camera, but now it’s quite easy to have way too much camera. I want to live in that sweet spot where image quality is not compromised, but I can still move fast.

Turns out the same is true of camera apps.

Photo by Karen Lum

The Goldilocks of iPhone ProRes Log

The built-in iPhone camera app is too little camera for shooting ProRes Log. There are no preview LUTs, no manual adjustments, and the viewfinder is obscured by the controls. The Blackmagic Camera app is a truly wonderful gift to the Apple Log shooter, addressing all these issues. The only feature it lacks is the ability to pick it up in a rush and quickly make great-looking video. So, confession time: I almost never use it for anything other than the most controlled studio shooting. My Peru and Taiwan travel clips? Almost entirely shot with the native camera app.

There’s a massive gulf between the too-little Apple app and the too-much Blackmagic offering. And my friends at Lux, makers of the wonderful Halide still photo app for iPhone, have created a new video app that lives squarely in this sweet spot.

Kino

Choose from built-in LUTs, or load your own. Bake them in, or just preview through them as you record uncorrected log.
Kino is an app you can pop open and start shooting with right away, with just enough control to maximize the final quality of your iPhone video. It’s fast, fun, and a joy to use thanks to its thoughtful and beautiful design. Out of the (skeuomorphic) box, Kino is basically an app that shoots better-looking video on your recent iPhone. You can choose from professionally-designed LUTs to dial in the look you want. On iPhone 15 Pro and Pro Max, Kino becomes a log-shooting powerhouse. You can choose whether to bake the LUT into your footage or not, and you can add your own LUTs.

Maybe my favorite feature is Auto Motion. While Apple’s camera app prefers fast shutter speeds, Kino tries to keep you as close to a cinematic 180º shutter angle as possible.

Kino runs the spectrum from consumer app that just makes iPhone video look better to professional control and options. It’s the perfect DV Rebel video app.

Prolost Brand LUTs

Kino features LUTs created by filmmakers you’ll recognize, but the honor of supplying the most boring LUT fell to me. The Neutral LUT is none other than my Prolost “TECH” LUT that I explained here.

Point and Shoot Experience, Pro Results

Accessibility is core to the DV Rebel ethos, and along with that came a focus on tuning not-quite-professional gear to achieve cinematic results. Years ago, this was essential for the indie filmmaker, as professional gear was truly unattainable. But even when professional tools are abundant and affordable (seriously, what a time to be a filmmaker!), sometimes the right camera for the job is the one that feels great to shoot with.

And the same is true for camera apps.

SHOTWITHKINO.COM
Still from Apple’s “Let Loose” video.

Apple unveiled their new line of iPads yesterday in a pre-recorded video titled “Let Loose.” As with the previous “Scary Fast” MacBook Pro launch video, “Let Loose” ends with a tag proclaiming “Shot on iPhone” — this time adding “Edited on Mac and iPad,” and the fine print: “All presenters, locations, and aerial footage shot on iPhone.”

During the live stream I actively wondered if the iPhone acquisition of “Scary Fast” had been a one-time thing. “Let Loose” looks great, as all Apple videos do, but some shots featured a shallower depth-of-field than is possible with an iPhone-sized lens and sensor combo. At the end of the event, I wondered publicly on Threads about this. Replies speculated about additional lens rigs, some improved version of Cinematic Mode, or maybe blurring the background in post.

Panavision Lens Relay System

After Apple released a behind-the-scenes video about the production of “Scary Fast,” the Internet did its internet thing and questioned the “Shot on iPhone” claim, as if “Shot on iPhone” inherently means “shot with zero other gear besides an iPhone.” These takes were dumb and bad, and some even included assertions that Apple added additional lensing to the phones, which they did not.

But for “Let Loose,” they did. “Let Loose” was shot on iPhone 15 Pro Max, and for several shots where a shallow depth-of-field was desired, Panavision lenses were attached to the iPhones using a Panavision-developed mount called the “Lens Relay System.” This rig is publicly available for rent from Panavision today, although not currently listed on their website.

There’s just enough shallow focus in these shots to make me wonder how they could be shot on iPhone. Cinematic Mode could never.

What’s a “lens relay system?” Think of a telescope. Instead of focusing an image on a plane of film or a sensor, it creates what’s known as an “aerial image” that you capture with another lens system — your eye.
If you’ve ever smashed your phone up to a pair of binoculars successfully, you’ve made a lens relay system.

I used a Frasier lens relay system from Panavision to get small lenses into tight spaces for a Ruby Tuesday commercial I directed. The Frasier is a periscope lens, one physical unit that contains both the taking lens and the capture lens. With Panavision’s new system, the iPhone’s own lens captures the aerial image created by any Panavision lens you like. The iPhone provides the image capture, in ProRes Apple Log, of course.

There are, of course, many systems for mounting lenses to iPhone (Beastgrip is working on a new one), but it’s certainly notable that Panavision made one, as they exclusively serve a market of serious professional filmmakers.

Why Lenses?

If “Scary Fast” could be shot without add-on lenses, what does Panavision’s rig bring to the table? Apple Log allows the iPhone to capture highly malleable, 10-bit ProRes footage that fits into a professional pipeline alongside footage from high-end cinema cameras, but the one thing it can’t do is capture the shallow depth-of-field that we associate with high-end productions. The look of “Let Loose” is a collaboration between the iPhone’s clean capture, the focus control of the Panavision lenses, and top-tier color grading.

Don’t You Start, Internet

So is it fair to say “Let Loose” was “Shot on iPhone” if it was done with the help of gear the average iPhone owner could never afford? Of course it is. Feel free to re-read this: What Does and Doesn't Matter about Apple Shooting their October Event on iPhone 15 Pro Max — but in short, the fact that Apple can drop an iPhone into an otherwise-unaltered professional film production and match the look of their previous videos without anyone even noticing is meaningful.
In fact, “Let Loose” is the first Apple Event finished and streamed in HDR, pushing the iPhone’s capture abilities even further than “Scary Fast.”

You don’t need to add cinema lenses to the iPhone to make great-looking images, but the fact that you can is cool. You also don’t need to twist yourself into knots wondering why you might choose an iPhone over a professional cinema camera when you have a Panavision lens budget. Personally, I’m more excited about the run-and-gun possibilities — and a vote of confidence from the most elite cinema rental house only bolsters the story of iPhone as professional video camera.

Or think of it this way: Apple confidently intercut footage shot with the most elite cinema lenses available with footage shot with unadorned iPhone lenses.

Panavision for sure. But I’m betting no Panavision here.

The Real Story Here is Apple Log

None of this would be possible without Apple adding Log ProRes recording to the iPhone 15 Pro. Log is to video what raw is to still photography, and the story of how Apple Log transforms the definition of “Shot on iPhone” from a dalliance to a responsible, even desirable strategy for filmmaking is still ongoing.

Truly “pro” features like color-accurate OLED screens on iPads and ProRes Log on phones don’t just sell a few devices to a few filmmakers. They preserve and elevate Apple’s reputation as the choice of creative professionals in all fields. Apple hardware is vastly overpowered for most of its customers’ uses, so as Apple looks around for folks in need of their very best, they find the ZBrush artist, the Redshift renderer, and now, improbably-but-deservedly, the professional cinematographer.
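The “raw for video” analogy is easy to sketch with a little code. Here’s a toy log curve in Python; it is a generic illustration of the idea, not Apple Log’s actual transfer function (Apple publishes those constants separately). It shows how log encoding squeezes many stops of scene brightness into a narrow, flat-looking code range that a grade can later expand:

```python
import math

def toy_log_encode(linear: float, stops: float = 12.0) -> float:
    """Map linear light (1.0 = mid gray) into a 0..1 code value.

    A stand-in log curve spanning `stops` stops, centered on mid gray.
    Not Apple Log -- just the general shape shared by all log formats.
    """
    floor = 2.0 ** (-stops / 2)  # darkest encodable value
    linear = max(linear, floor)
    return math.log2(linear / floor) / stops

# Mid gray lands in the middle of the code range...
print(toy_log_encode(1.0))   # 0.5
# ...and a highlight 4 stops brighter climbs only 4/12 of the range.
# That's why ungraded log looks flat: every stop survives, but none
# of them gets much contrast until the color correction.
print(toy_log_encode(16.0))
```

The point of the sketch: the flatness is not a defect, it’s the whole dynamic range of the scene held in reserve for the grade.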
This is a blog post about a video, which is about new color-conversion LUTs for Apple Log footage from the iPhone 15 Pro and Pro Max (updated from my first set). The video is also a mini-travelogue of my recent trips to Taiwan and Peru. This post dives a bit deeper into both the LUT workflows and my state of mind about shooting digital-cinema-grade footage with a device I always have with me. There’s a lot going on here.

Conflicted in Peru

Me relaxing on vacation. Photo by Forest Key.

I always have a moment when packing for a trip: Which camera to bring? Which lenses? I know I’m always happier when I pack less, like just a single prime lens. But sometimes FOMO gets me and I pack three zooms.

For my trip to Lima, I brought my Sony a7RIV with the uninspiring-but-compact Sony 35mm F2.8 prime. I lugged it around for a few days, but wasn’t really feeling it. Meanwhile, my iPhone 15 Pro Max was calling to me with its ProRes Log video mode. “I’m 10-bit!” it would say. “Think of the fun you’ll have color grading me!”

I told my phone to shut up, and proceeded to shoot very little with it — or my Sony. Like a squirrel in the middle of the street, drawn in two different directions at once, I creatively froze. Photography, for me, is made up of a lot of habits, and shooting iPhone video with aesthetic intent is just not yet baked into my travel muscle memory.

Made in Taiwan

A month later, I took a family trip to Taiwan, one of my favorite places in the world. I’d had some time to process my Peru deadlock, and decided to stop judging my own creative impulses, and let inspiration guide me in which camera I pulled out. I wound up shooting a lot of video.

Me relaxing on vacation. Photo by Josh Locker.

I loved shooting ProRes Log in Taiwan with the iPhone 15 Pro Max. I’d occasionally reach for Blackmagic Camera, but I often just used the default camera app.
I stuck my phone (with its crumbling case) out of taxi sunroofs and skyscraper windows, held it above teeming crowds and shoved it between chain-link fences. Seeing the broad dynamic range I was capturing in scenarios from noontime sun to neon-lit nights got me excited about grading the footage later. It’s exactly the way I feel about shooting raw stills with my Sony, knowing that I’ll be able to go crazy on them in Lightroom. The photographing act is just half of the process.

Step through the frames below to see how color transforms a single shot from the video above:

Uncorrected Apple Log: Straight out of the camera. I mean, phone.

Look & LUT: An overall correction applied under PL-HERO LUT.

Local Corrections: Various windowed corrections under the Look help guide the viewer’s focus and give the natural-light capture a cinematic feel.

LUTs, Looks, and Magic Bullets

There’s been a bit of a gold rush of people hawking creative LUTs that apply a particular “look” to iPhone Log footage. My day job is, in part, helping make color tools like Magic Bullet Looks, which can do so much more than any LUT. Creative LUTs are great, and by all means support the folks making them — but that’s not what my iPhone LUTs were or are. The Prolost iPhone LUTs convert Apple Log to various other color spaces, and support three kinds of workflow:

Grade Under a Display Transform LUT

Apple Log is a totally decent color space to work in, so color correcting Apple Log can be as simple as applying Magic Bullet Colorista and choosing one of my Monitor & Grade LUTs. That’s what you see me doing in the video above. Colorista (set to Log mode) does its work on the native Apple Log pixels, and the LUT converts the result to look nice on video. Many other systems work like this, including LumaFusion, which ships with Prolost Apple Log LUTs. The key is color correcting under the LUT.

Bring Apple Log into an Existing Workflow

Color work is often done in some intermediate color space.
This is usually some kind of wide-gamut log format, such as DaVinci Wide Gamut/Intermediate, or one of the ACES log spaces. The Prolost ACES LUTs simply convert Apple Log to either ACEScc or ACEScct log, allowing you to grade your iPhone footage alongside any other professional camera, and output them all through the same pipeline.

Shooting Through a LUT

The Blackmagic Camera app allows you to load any LUTs you want and view through them without baking them into your footage. With my LUTs, you can shoot with the same LUTs you grade under later, for a truly professional (no joke!) workflow.

The real stars of this update, though, are the FC LUTs. They add an informative False Color overlay to the Shoot/Grade LUTs, making sure you always nail your exposure. Watch the video to see them in action. I already can’t imagine shooting without them. These LUTs work well in Blackmagic Camera or even on an external HDMI monitor.

Adjusting exposure with a variable ND filter until the 18% gray card lights up yellow, for perfect exposure. PL-HERO-FC LUT in Blackmagic Camera.

Gathering Resolve

I’ve never edited a whole actual thing in Resolve before. As if this video wasn’t enough work already (I shot the A-roll in mid-December), I decided to use it as a test case for creative editorial in DaVinci Resolve.

It’s the ACES LUTs that allowed me to incorporate Magic Bullet Looks into my Resolve color workflow. Maxon just shipped a really nice update to Magic Bullet Looks, with simplified color management made possible by more and more apps we support doing darn fine color management at the timeline level. So in Resolve, I can use my LUT to convert Apple Log to ACEScc, and then apply Magic Bullet Looks, which can now be set to work in ACEScc with a single click.

The new streamlined color options in Magic Bullet Looks. Choose Custom to get the full manual control.

I can sneak additional Resolve corrector nodes in between those two for local corrections.
Resolve is great at this, and Looks is great at creative look development, so this is a match made in heaven.

A little face lift.

Then, at the end, I use an ACES Transform node to convert to Rec. 709 video.

Get to the chopper.

An expert Resolve user could replace my LUTs with Resolve’s built-in Color Space Transform nodes, but the LUTs make this process easier and more reliable.

Gear Inspires

Every photographer knows the feeling of lusting after new gear. We know it so well that we remind ourselves constantly that “next camera syndrome” is debilitating, and that “most cameras are better than most photographers.” Gear is not the answer. Go shoot.

There is, however, a counterpoint to these truths: As shooters, we take inspiration where we can get it. And sometimes a new technique, a new locale, or even, yes, a new bit of gear is what provides it. The key is to listen for that inspiration, and don’t judge it. Even if it’s coming from your phone.

Jiufen village, Taiwan

Get the Free LUTs
A still from Apple’s “Behind the scenes: An Apple Event shot on iPhone” video

Apple Shot Their “Scary Fast” October Event Video on iPhones And We Had Feelings

You’re somewhere on the spectrum from occasionally shooting video on your iPhone to being a professional-ish video maker with some gear, and you see at the end of Apple’s October “Scary Fast” event announcing new Macs with M3 silicon that the entire event was “Shot on iPhone.” This makes you feel a certain way.

Actual footage of me watching the event.

Then Apple posts a behind-the-scenes video showing how this was done, revealing a rare and imposing glimpse into the scale and scope of their industry-leading launch videos. At the center of it all, instead of their customary Arri Alexa (a digital cinema camera costing $35–150K before you even add a lens, used to shoot everything from Avengers: Endgame to Barbie), was an off-the-shelf iPhone 15 Pro Max, gripped into truckloads of professional support gear.

At this point, some folks felt differently about what was implied by “Shot on iPhone.” There have been bad takes on this, and good takes on those bad takes. Anyone who knows the tiniest bit about video production knows that the camera is a small, but important, but small, part of the overall production picture. “Shot on iPhone” doesn’t promise “and you can do it too” any more than Stanley Kubrick lighting Barry Lyndon with candlelight means anyone with candles can make Barry Lyndon.

But when the camera is the least expensive piece of gear on the set after clothespins and coffee, it does feel strange. I’ve been on a lot of productions like this, having played an active role in the DV filmmaking revolution of the late ’90s-to-early-2000s. It was an odd feeling to scrounge for the adapter plates required to mount a $3,000 DV camcorder purchased at Circuit City to a Fisher dolly that literally has no purchase price.
Apple, of course, has no burden of best-practices logic for their decision to shoot their “Scary Fast” event on iPhone — it’s a marketing ploy, a cool stunt, and a massive flex. A thing to do for its own sake. In the filmmaking community, it was the mic drop of the year. We greedily soaked up all the details in the behind-the-scenes video, and made a hundred tiny calculations about which aspects of this lavish production actually mattered to the question of the iPhone 15’s validity as a professional camera, and which did not.

With all that gear and production support, which aspects of the event really matter to you, the iPhone-curious filmmaker? What can you learn, and which aspects can you safely ignore? Let’s take it one at a time:

That They Did It: Does Matter

As camera features have played a larger and larger role in Apple’s marketing for new iPhones over the years, you might have begun to feel a bit of cognitive dissonance. Apple tells you about how great, and even “pro,” these new iPhone cameras are — but would never have dreamt of using them to shoot their own videos or product stills. Apple was effectively saying “pro enough for you, but not for us.” Valid, but a bit dissatisfying.

A still frame from Apple’s October 30 “Scary Fast” event video, shot on iPhone 15 Pro Max.

Apple has set the aesthetic bar impossibly high with these pre-recorded events. They’re not just executives teleprompting in front of Keynote slides — they feature “slice of life” moments shot on huge sets and real locations. Elaborate visual effects transition between locations and settings that might be partially virtual. These videos have looked great ever since Covid pushed Apple to find an alternative to executives-on-stage-in-front-of-slides, and even as Apple is now once again able to welcome guests to in-person product launches, these lavishly-produced videos are the new gold standard in pitching the world on a new iThing.
With “Scary Fast,” Apple repeated their now well-established high-production-value playbook, but yoinked out the professional cameras and lenses, and dropped in a commodity consumer telephone in their place. And crucially, none of us noticed. It’s a big deal.

They Shot ProRes Log: Matters So Much It Would Be Impossible Without It

There is one single feature of the iPhone 15 Pro that made this stunt possible: Log. As I detailed here in words and video, the “iPhone video look” is designed to win consumer popularity contests, not mimic Apple’s own marketing videos, nor plug into professional workflows. It may be hard to imagine that a slightly different bit of signal processing when recording a video file from a tiny sensor can make the difference between consumer birthday-cam and professional viability, but that is exactly the power of log. Apple Log has catapulted the iPhone into filmmaking legitimacy.

Apple Log footage looks flat and low-contrast, until you color correct it, as I’ve done here using my free Prolost Apple Log LUTs.

They Used Big Lights: Does Matter, With An Asterisk

Apple’s event was set at night, with a dark, Halloween-inspired look. It takes a lot of professional lighting gear to illuminate a wide shot of Apple’s campus, and professional skill to balance this lighting with the practical sources on the building itself. You can do this for cheaper than it looks.

Lighting matters more than any camera, more than any lens. As I wrote in 2009: Photos are nothing but light — it’s literally all they are made of. Timmy’s birthday and Sally’s wedding are reduced to nothing but photons before they become photographs. So getting the light right is more meaningful to a photo than anything else.

Should you look at the giant lights in Apple’s video and feel dejected that your own productions will never afford this level of illumination?
I say no, because a) you’re probably not lighting up the whole side of an architectural marvel, and b) you’re probably not designing your production around one of the world’s highest-paid CEOs.

For Tim Cook’s appearance, Apple’s production had their giant LED light panels on camera dollies, which is not typical. The two reasons I can imagine they did this are to be low-impact on the campus itself (rubber wheels instead of metal stands), and to be able to adjust the lighting quickly out of respect for Cook’s valuable time. It makes the lighting rigs seem more complex than they really are. What they really are is big, bright, and soft. And rather minimalistic — mostly key, a bit of fill.

Big, soft LED lighting is actually quite affordable these days. I have two medium-power bi-color lights from Aputure, and together they cost less than my iPhone 15 Pro Max. I couldn’t cover Cook’s opening wide shot with them, but I could get close. That big softbox overhead? Now that’s expensive. I might also be willing to compromise on my ISO settings to work with a smaller lighting package, where Apple seemingly was not. More on this below.

So the lighting is important, but the quantity of it and the support gear it’s on is specific to this rarefied type of time-is-money, night-exterior production. Don’t be distracted by the extra equipment; focus on the fact that the lighting itself is actually rather spare.

They Attached the iPhone to Cranes and Gimbals and Drones and Dollies: Does Not Matter, Except for One Little Thing

The behind-the-scenes video is almost comical in its portrayal of the iPhone gripped into all manner of high-end support gear. You do not need any of this stuff. I mean, every filmmaker needs a crane shot — but this is why small cameras are so empowering: everything is a crane when your camera weighs less than a Panavision lens cap! Check out this video from filmmaker Brandon Li.
He uses a gimbal on a hand-held pole to create a perfect crane shot for the opening of his action short. Toward the end, he achieves a nifty top-down shot by... standing on a railing. All with a camera substantially more cumbersome than a phone.

Director Brandon Li is his own crane. Get your kicks without sticks.

Apple used cranes and remote heads designed for big cameras because that’s how they know how to shoot these videos. Apple’s marketing department is large, and knows exactly what they need on these productions. One thing they need is for a dozen people to watch the camera feed, making sure everything is committee-approved perfect.

This is just the DIT cart, not even the client monitor.

This kind of client-driven support gear compounds on its own requirements. As Tyler Stalman points out in his excellent breakdown video, some of what’s bolted to the iPhones is simply a metal counterweight so that a gimbal head, designed for a much larger camera, can be properly balanced. You can plug an external drive into the USB-C slot on the iPhone 15 Pro Max, or you can plug in an HDMI adapter for a clean client feed. If you want to do both, you need a USB-C hub, which at that point requires power. So now you’ve got an Anton-Bauer battery pack mounted to this tangle of gear.

When you don’t have clients, you can skip all that and just shoot. This means you can replace most of the gear you see here with a cheap consumer gimbal — or a tripod. And here’s the key takeaway for this point: Apple achieved optimal image quality from the iPhone in a number of ways, and one, I’m betting, was by turning off image stabilization — which is only advisable when this tiny camera is physically stabilized.

So you don’t need all the stuff Apple used, but if you want comparable results, you need a way to mount your iPhone to something solid. Maybe not a whole powered cage, but certainly a simple tripod mount.
Then you can eke out that last bit of extra image quality by turning off image stabilization — which brings us to our next point:

They Used the Blackmagic Camera App: Matters as Much as Log

I don’t think this exact rig was used to capture what we saw in the video, but I believe the settings are representative.

The Blackmagic Camera app has the option to turn off image stabilization, yes, and also like a million other features. Manual, locking exposure control is the top of the list, but there’s a ton more. The app includes a false-color mode to help match exposure from shot to shot. It can load a preview LUT, so you can shoot log but view something closer to what the audience will see. It’s silly to be grumpy with Apple for not offering this power in their own camera app when they clearly worked with Blackmagic Design to have this app available day-and-date with the iPhone 15. Oh, and it’s free.

They Used a 180º Shutter: Matters More Than You Think

One slick feature of the Blackmagic Camera App is that you can choose to express the shutter speed in degrees, like a cinema camera, rather than fractions of a second, which is more typical in stills. A 180º shutter — where the shutter is open for half the duration of a single frame, e.g. 1/60th of a second at 30 fps — is important for a pro look. Anything slower and you get smeary blur and a camcorder look. Anything faster and your footage looks like you shot it on a phone, because 99% of the time our iPhones are using insanely fast shutter speeds to handle typical daylight. Look at any of your own daytime iPhone video — I’d be surprised if you see any motion blur at all.

Compare Apple’s motion blur to mine. 180º shutter vs. probably something more like 1º.
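The degrees-to-seconds conversion is simple arithmetic, sketched here in Python (nothing app-specific, just the definition of shutter angle):

```python
def shutter_seconds(fps: float, angle_deg: float) -> float:
    """Shutter open time for a given frame rate and shutter angle.

    A 360 degree shutter is open for the whole frame duration (1/fps);
    180 degrees is open for half of it, and so on proportionally.
    """
    return (angle_deg / 360.0) / fps

print(shutter_seconds(30, 180))  # 1/60 s, the example from the text
print(shutter_seconds(24, 180))  # 1/48 s, the classic film cadence
```

Run it the other way and you can see why daylight phone footage looks stuttery: a 1/2000 s exposure at 30 fps works out to a shutter angle of only 5.4 degrees.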
Relatedly:

They Shot at ISO 55: Matters to Apple’s Goal of Maximum Image Quality

Here’s where the level of professional control over the lighting starts to really matter: If Apple decided that they must shoot at ISO 55 (the lowest, although possibly not the native ISO of the 1x camera) for the highest image quality, and with a 180º shutter for the most pro-camera look, that means they have no other control over exposure. The iPhone 15 Pro 1x lens does not have a variable aperture, so shutter speed and ISO are your only exposure controls.

When shooting in uncontrolled environments, the typical method of limiting the amount of light entering the lens is via ND filters, sometimes variable ND filters. I don’t see any evidence that Apple used filters on this shoot, which would fit with their overall prioritization of image quality over all else. So this goes back to lighting — Apple’s team controlled that lighting perfectly, because they opted out of any exposure control they might have had in-camera.

I’m curious to learn more about this setting, though. YouTubers Gerald Undone and Patrick Tomasso did some tests and found that the best dynamic range from the iPhone 15 Pro came from ISO 1100–1450, with 1250 being their recommended sweet spot. Did Apple prioritize low noise over dynamic range?

They shot 30p: Doesn’t Matter

Apple has used 30 frames-per-second for these pre-recorded keynotes since they started in September of 2020. They’re not trying to be “cinematic,” they’re trying to make a nice, clean video that can take the place of a live event. 30p is a choice, and a fine one for an on-stage presentation. You might choose 24 or 25 fps for a more narrative look, and that’s great too. Note that Apple’s native Camera app offers 30.0 and 24.0 fps, but the Blackmagic Camera app adds support for 29.97 and 23.976 fps, which are the actual broadcast frame rates Apple uses for their productions.
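To put rough numbers on the earlier point that locking ISO and shutter leaves lighting (or ND) as your only exposure control, here’s a back-of-envelope Python sketch. The scene brightness and the f/1.78 aperture are my own illustrative assumptions, not figures from Apple’s shoot:

```python
import math

def nd_stops_needed(scene_ev100: float, iso: float, shutter_s: float,
                    f_number: float) -> float:
    """Stops of ND needed to correctly expose a scene at locked settings.

    scene_ev100: metered scene brightness as EV at ISO 100.
    The locked settings' equivalent EV is log2(N^2 / t) - log2(ISO / 100);
    whatever the scene exceeds that by has to be absorbed by ND (or by
    dimming the lights).
    """
    settings_ev = math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)
    return scene_ev100 - settings_ev

# A sunny exterior (around EV 15) at ISO 55 and 1/60 s (a 180-degree
# shutter at 30 fps), through a fixed f/1.78 aperture, needs about six
# and a half stops of ND. At night, with lighting you control, you can
# skip the filters entirely -- which is what Apple appears to have done.
print(round(nd_stops_needed(15, 55, 1/60, 1.78), 1))
```

The takeaway matches the text: once ISO and shutter angle are non-negotiable, exposure has to be solved in front of the lens.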
They Focused Manually: Doesn’t Matter

The Blackmagic Camera app truly has a dizzying set of features, some seemingly part of an attempt to win some kind of bizarre bet. Like support for wireless external follow-focus controls? I mean, wow, but also, really? Sure makes for a cool behind-the-scenes shot, but I bet you can live without this.

They Used a Matte Box: Does Matter

While Apple did not attach any additional lenses to their production iPhones, they did put stuff in front of the built-in lenses — notably teleprompters, of course, and comically-large matte boxes. Matte boxes might feel like affectations in this context, but shielding that tiny lens from glare is actually a significant way to improve overall image quality. Luckily, you don’t really need a full-on matte box to do this. A French flag will do it, as will your hand. If a bright light is dinging your little lens, you’re leaving a ton of image quality on the floor.

They Exclusively Used the 1x Camera: Does Matter — to Apple

The 1x camera, with image stabilization turned off, gives the highest-quality image available from the iPhone 15 Pro Max. Are you detecting a theme here? Apple imposed a number of limitations on how they used the iPhone camera, seemingly always in the name of maximizing image quality. As we’ll discuss below, you may or may not share this priority.

They Edited in Premiere Pro? Doesn’t Matter

Real editor, fake set.

Eagle-eyed viewers noticed that an Adobe Premiere Pro timeline appears behind the editor of “Scary Fast,” not Apple’s own Final Cut Pro. But we’re not in a real edit suite here — we’re actually on one of the sets from the production. Did Apple really edit in Premiere? I have a feeling that this and Stefan Sonnenfeld’s interview were staged on Apple’s standing sets rather than in real studio environments to keep the production close to home — for cost, control, and secrecy reasons. So let’s assume Apple really did cut in Premiere. This means next to nothing.
All editing software does the same job, and it’s unlikely Apple would impose a workflow on the production company they hired to both shoot and post-produce the video. Other than of course to ensure that it be cut on a Mac.

It’s interesting to note that Apple’s Pro Workflows Group, representatives from which are interviewed in the behind-the-scenes video, is a part of the hardware division at Apple. Their charter is to promote and support professional use of Apple devices, regardless of which software they’re running.

Should Final Cut Pro users be nervous that Apple might send it to live on a farm with Shake and Aperture? It’s hard to regain our trust here, but Apple did just release a very nice version for iPad a few months ago, and substantial updates to that and the Mac version just yesterday. So there’s really nothing to see here. Move along.

They Colored in Resolve: Doesn’t Matter

Apple hired Company 3 to produce this video. Company 3 is best-known as a color house. In my visual effects and commercial directing career, I’ve worked with several amazing colorists there, from Dave Hussey to Siggy Ferstl to their CEO, Stefan Sonnenfeld, who is prominently featured in the behind-the-scenes. Stefan is one of the most prolific, talented, and well-known colorists working today. I modeled half the presets in Magic Bullet Looks after his famous grades on films like Transformers, 300, John Wick, Man on Fire, and hundreds more.

Real colorist, not his real office.

If you can get Stefan to color your video, that’s what matters — not the tool he uses. Resolve is “free” (with many asterisks), but a session at Company 3 is four figures per hour. Whether you use Resolve, Magic Bullet, or whatever else, what matters here is that shooting log means color grading is not just possible, but essential, and great care was taken with this part of the process.

This All Makes Sense. Why Do I Still Feel Weird About it?
As much as I might disagree with an accusation that Apple was disingenuous to say “Shot on iPhone” about a massive production with seemingly unlimited resources, I understand where this feeling comes from.

“Shot on iPhone” carries with it the implication of accessibility. We are meant to be inspired by this phrase to use a tool that we already carry with us every day to capture videos that might transcend mere records of our life’s moments, and become part of our artistic pursuits. And we should absolutely feel that way about the iPhone 15 Pro and Pro Max.

The reason we feel slightly disconnected from Apple’s impressive exercise is not that they were dishonest — it’s that their priorities were different from ours. Apple wants to sell iPhones, and to accomplish this, they spared no expense to put the very highest image quality on the screen. As a filmmaker, you care about image quality, but you care about other things too — probably more.

Sticking to the 1x lens gives the best image quality, sure, but choosing the right lens for the story you’re telling might matter more to a filmmaker. You’ll probably gleefully use all the focal lengths Apple supplied.

As much as you might value the clean image that comes from shooting at ISO 55, you might get more production value for your budget by using smaller lights (or available light!) and accepting some noise at higher ISOs.

You might truly appreciate the value of a 180º shutter, and simply not always have a camera case/cage that allows you to mount a variable ND filter to your telephone. I ordered one from PolarPro the day the iPhone 15 was released and it still hasn’t shipped, so I’ve shot next to no 180º shutter footage so far.

You might well understand that turning off image stabilization will improve your image — unless your shot is now wobbly, because you’re shooting handheld out the window of a moving car. So maybe you’ll leave stabilization on, and be a human crane, or gimbal, or dolly, or all of the above.
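The 180º shutter rule, and the ND filters it demands in daylight, come down to simple arithmetic: shutter time is the shutter angle's fraction of the frame interval, and each stop of ND doubles the shutter time you can afford at a fixed ISO and aperture. A minimal Python sketch (function names and the example values are mine, for illustration):

```python
import math

def shutter_seconds(fps: float, angle_deg: float = 180.0) -> float:
    """Shutter duration implied by a shutter angle at a given frame rate.
    A 180-degree shutter exposes for half of each frame interval."""
    return (angle_deg / 360.0) / fps

def nd_stops_needed(current_shutter_s: float, target_shutter_s: float) -> float:
    """Stops of ND required to slow exposure from the shutter speed the camera
    wants to the one you want, holding ISO and aperture fixed."""
    return math.log2(target_shutter_s / current_shutter_s)

# At 24 fps, a 180-degree shutter means a 1/48-second exposure.
print(shutter_seconds(24))
# If bright daylight is forcing roughly 1/1000 s, you need about 4.4 stops of ND
# to get down to 1/48 s.
print(nd_stops_needed(1 / 1000, shutter_seconds(24)))
```

This is why a variable ND (typically 2–5 stops or more) covers most daylight situations: the gap between a sunny-day shutter speed and 1/48 s usually falls in that range.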
Never use the 5x lens. Never use image stabilization. Never shoot through a dirty windshield. And never get this dope-ass shot.

Let’s be honest: If image quality is your top priority, there are much better options for your next production than a consumer telephone. You’d probably choose the iPhone more for its accessibility, nimbleness, ubiquity, and low cost. Those are great reasons, and when you pair them with image quality that can be mistaken for high-end cinema camera footage by a veteran colorist, you’ve got something magic.

Give It To Me In Bullets You Long-winded Monster

So we may well ignore much of Apple’s implied advice, but we would do well to follow some of it if we can:

- Use camera support. Not crazy camera support, but some.
- Use lights. Not crazy lights, but some.
- Use a camera app that allows manual control, like Blackmagic Camera.
- Use a 180º shutter, if you can (ND filters will help).
- Keep light off of the lens using an $8,000 matte box or a bit of black tape.
- Hire literally the world’s most famous colorist. Or just do some color correction.
- And most importantly, shoot in log, with a good preview LUT.

The incredible production apparatus of my helicopter tunnel shot from my last post.

Shot on iPhone Means What It Means to You

All of this is academic if you don’t go put it into practice. If you got this far and feel empowered to wring the most out of your iPhone 15 Pro with just the right amount of gear, that’s great. If you actively forget all of this and occasionally flip on the Log switch so you can play with the color of your iPhone videos in post, that’s great too.

Because here’s the thing: movies have already been shot on phones. No production’s decisions validate any camera for all other production needs. You decide what “Shot on iPhone” means to you, if anything. And the way you decide is by getting out there and shooting something.

I went to Peru with nothing but my iPhone 15 Pro Max.
I shot some stuff with zero gear, and I’m having a blast color grading it various ways.