The new Kino app recording ProRes Log with a custom preview LUT.

Yes, we're still talking about shooting video on iPhones. But I also want to talk about digital cinema shooting in general, in a world where top camera makers are battling to give filmmakers everything we want in a small, affordable package. How does the DV Rebel spirit — born of camcorders and skateboard dollies — live on in a time of purpose-built digital cinema cameras that fit in your hand?

For me, it's meant keeping my rigs small and manageable. I love gimbals and drones and lidar focus rigs, but I'm happiest when my whole rig — tripod and all — can be picked up and moved with one hand. We suffered through so many years of too little camera, but now it's quite easy to have way too much camera. I want to live in that sweet spot where image quality is not compromised, but I can still move fast.

Turns out the same is true of camera apps.

Photo by Karen Lum

The Goldilocks of iPhone ProRes Log

The built-in iPhone camera app is too little camera for shooting ProRes Log. There are no preview LUTs, no manual adjustments, and the viewfinder is obscured by the controls.

The Blackmagic Camera app is a truly wonderful gift to the Apple Log shooter, addressing all these issues. The only feature it lacks is the ability to pick it up in a rush and quickly make great-looking video. So, confession time: I almost never use it for anything other than the most controlled studio shooting. My Peru and Taiwan travel clips? Almost entirely shot with the native camera app.

There's a massive gulf between the too-little Apple app and the too-much Blackmagic offering. And my friends at Lux, makers of the wonderful Halide still photo app for iPhone, have created a new video app that lives squarely in this sweet spot.

Kino

Choose from built-in LUTs, or load your own. Bake them in, or just preview through them as you record uncorrected log.
Kino is an app you can pop open and start shooting with right away, with just enough control to maximize the final quality of your iPhone video. It's fast, fun, and a joy to use thanks to its thoughtful and beautiful design.

Out of the (skeuomorphic) box, Kino is basically an app that shoots better-looking video on your recent iPhone. You can choose from professionally-designed LUTs to dial in the look you want. On iPhone 15 Pro and Pro Max, Kino becomes a log-shooting powerhouse. You can choose whether or not to bake the LUT into your footage, and you can add your own LUTs.

Maybe my favorite feature is Auto Motion. While Apple's camera app prefers fast shutter speeds, Kino tries to keep you as close to a cinematic 180º shutter angle as possible.

Kino runs the spectrum from consumer app that just makes iPhone video look better to professional control and options. It's the perfect DV Rebel video app.

Prolost Brand LUTs

Kino features LUTs created by filmmakers you'll recognize, but the honor of supplying the most boring LUT fell to me. The Neutral LUT is none other than my Prolost "TECH" LUT that I explained here.

Point and Shoot Experience, Pro Results

Accessibility is core to the DV Rebel ethos, and along with that came a focus on tuning not-quite-professional gear to achieve cinematic results. Years ago, this was essential for the indie filmmaker, as professional gear was truly unattainable. But even when professional tools are abundant and affordable (seriously, what a time to be a filmmaker!), sometimes the right camera for the job is the one that feels great to shoot with. And the same is true for camera apps.

SHOTWITHKINO.COM
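The 180º shutter that Auto Motion chases boils down to a one-line formula: a 180º shutter angle exposes each frame for half the frame interval, so shutter speed = 1 / (2 × frame rate). Here's a minimal sketch of that arithmetic (my own illustration of the rule, not Kino's actual implementation):

```python
def shutter_for_180(fps: float) -> float:
    """Shutter speed in seconds for a 180-degree shutter angle.

    A 180-degree shutter exposes each frame for half the frame
    interval: speed = (angle / 360) / fps = 1 / (2 * fps).
    """
    return 1.0 / (2.0 * fps)

# 24 fps -> 1/48 s, the classic "cinematic" motion blur
assert abs(shutter_for_180(24) - 1 / 48) < 1e-12
# 30 fps -> 1/60 s
assert abs(shutter_for_180(30) - 1 / 60) < 1e-12
```

This is why auto-exposure that cranks the shutter to 1/2000 s in bright sun looks "video-ish": the per-frame motion blur all but disappears.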
Still from Apple's "Let Loose" video.

Apple unveiled their new line of iPads yesterday in a pre-recorded video titled "Let Loose." As with the previous "Scary Fast" MacBook Pro launch video, "Let Loose" ends with a tag proclaiming "Shot on iPhone" — this time adding "Edited on Mac and iPad," and the fine print: "All presenters, locations, and aerial footage shot on iPhone."

During the live stream I actively wondered if the iPhone acquisition of "Scary Fast" had been a one-time thing. "Let Loose" looks great, as all Apple videos do, but some shots featured a shallower depth of field than is possible with an iPhone-sized lens and sensor combo. At the end of the event, I wondered publicly on Threads about this. Replies speculated about additional lens rigs, some improved version of Cinematic Mode, or maybe blurring the background in post.

Panavision Lens Relay System

After Apple released a behind-the-scenes video about the production of "Scary Fast," the Internet did its internet thing and questioned the "Shot on iPhone" claim, as if "Shot on iPhone" inherently means "shot with zero other gear besides an iPhone." These takes were dumb and bad, and some even included assertions that Apple added additional lensing to the phones, which they did not.

But for "Let Loose," they did. "Let Loose" was shot on iPhone 15 Pro Max, and for several shots where a shallow depth of field was desired, Panavision lenses were attached to the iPhones using a Panavision-developed mount called the "Lens Relay System." This rig is publicly available for rent from Panavision today, although not currently listed on their website.

There's just enough shallow focus in these shots to make me wonder how they could be shot on iPhone. Cinematic Mode could never.

What's a "lens relay system?" Think of a telescope. Instead of focusing an image on a plane of film or a sensor, it creates what's known as an "aerial image" that you capture with another lens system — your eye.
If you've ever smashed your phone up to a pair of binoculars successfully, you've made a lens relay system. I used a Frazier lens relay system from Panavision to get small lenses into tight spaces for a Ruby Tuesday commercial I directed. The Frazier is a periscope lens, one physical unit that contains both the taking lens and the capture lens. With Panavision's new system, the iPhone's own lens captures the aerial image created by any Panavision lens you like. The iPhone provides the image capture, in ProRes Apple Log, of course.

There are, of course, many systems for mounting lenses to iPhone (Beastgrip is working on a new one), but it's certainly notable that Panavision made one, as they exclusively serve a market of serious professional filmmakers.

Why Lenses?

If "Scary Fast" could be shot without add-on lenses, what does Panavision's rig bring to the table? Apple Log allows the iPhone to capture highly malleable, 10-bit ProRes footage that fits into a professional pipeline alongside footage from high-end cinema cameras, but the one thing it can't do is capture the shallow depth of field that we associate with high-end productions. The look of "Let Loose" is a collaboration between the iPhone's clean capture, the focus control of the Panavision lenses, and top-tier color grading.

Don't You Start, Internet

So is it fair to say "Let Loose" was "Shot on iPhone" if it was done with the help of gear the average iPhone owner could never afford? Of course it is. Feel free to re-read this: What Does and Doesn't Matter about Apple Shooting their October Event on iPhone 15 Pro Max — but in short, the fact that Apple can drop an iPhone into an otherwise-unaltered professional film production and match the look of their previous videos without anyone even noticing is meaningful.
In fact, "Let Loose" is the first Apple Event finished and streamed in HDR, pushing the iPhone's capture abilities even further than "Scary Fast." You don't need to add cinema lenses to the iPhone to make great-looking images, but the fact that you can is cool. You also don't need to twist yourself into knots wondering why you might choose an iPhone over a professional cinema camera when you have a Panavision lens budget. Personally, I'm more excited about the run-and-gun possibilities — and a vote of confidence from the most elite cinema rental house only bolsters the story of the iPhone as a professional video camera.

Or think of it this way: Apple confidently intercut footage shot with the most elite cinema lenses available with footage shot with unadorned iPhone lenses.

Panavision for sure.

But I'm betting no Panavision here.

The Real Story Here is Apple Log

None of this would be possible without Apple having added Log ProRes recording to the iPhone 15 Pro. Log is to video what raw is to still photography, and the story of how Apple Log transforms the definition of "Shot on iPhone" from a dalliance to a responsible, even desirable strategy for filmmaking is still ongoing.

Truly "pro" features like color-accurate OLED screens on iPads and ProRes Log on phones don't just sell a few devices to a few filmmakers. They preserve and elevate Apple's reputation as the choice of creative professionals in all fields. Apple hardware is vastly overpowered for most of its customers' uses, so as Apple looks around for folks in need of their very best, they find the ZBrush artist, the Redshift renderer, and now, improbably but deservedly, the professional cinematographer.
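The depth-of-field gap the Panavision rig bridges can be reasoned about with a common rule of thumb: for equivalent framing, depth of field behaves as if the marked f-stop were multiplied by the sensor's crop factor. A tiny sketch of that arithmetic — the specific iPhone numbers below are illustrative approximations on my part, not figures from this post:

```python
def full_frame_equiv_aperture(f_stop: float, crop_factor: float) -> float:
    """Depth-of-field rule of thumb: the marked f-stop times the
    crop factor gives the full-frame-equivalent aperture, i.e. how
    the depth of field 'feels' compared to a full-frame camera."""
    return f_stop * crop_factor

# Illustrative only: a main phone camera around f/1.8 with a crop
# factor in the 3.5-4x neighborhood renders depth of field like a
# fairly slow full-frame lens, hence the craving for real glass.
equiv = full_frame_equiv_aperture(1.78, 3.7)
assert 6.0 < equiv < 7.0
```

A Panavision prime relayed onto the phone contributes its own, much shallower focus to the aerial image, which the iPhone then simply records.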
This is a blog post about a video, which is about new color-conversion LUTs for Apple Log footage from the iPhone 15 Pro and Pro Max (updated from my first set). The video is also a mini-travelogue of my recent trips to Taiwan and Peru. This post dives a bit deeper into both the LUT workflows and my state of mind about shooting digital-cinema-grade footage with a device I always have with me. There's a lot going on here.

Conflicted in Peru

Me relaxing on vacation. Photo by Forest Key.

I always have a moment when packing for a trip: Which camera to bring? Which lenses? I know I'm always happier when I pack less, like just a single prime lens. But sometimes FOMO gets me and I pack three zooms.

For my trip to Lima, I brought my Sony a7R IV with the uninspiring-but-compact Sony 35mm F2.8 prime. I lugged it around for a few days, but wasn't really feeling it. Meanwhile, my iPhone 15 Pro Max was calling to me with its ProRes Log video mode. "I'm 10-bit!" it would say. "Think of the fun you'll have color grading me!"

I told my phone to shut up, and proceeded to shoot very little with it — or my Sony. Like a squirrel in the middle of the street, drawn in two different directions at once, I creatively froze. Photography, for me, is made up of a lot of habits, and shooting iPhone video with aesthetic intent is just not yet baked into my travel muscle memory.

Made in Taiwan

A month later, I took a family trip to Taiwan, one of my favorite places in the world. I'd had some time to process my Peru deadlock, and decided to stop judging my own creative impulses and let inspiration guide me in which camera I pulled out. I wound up shooting a lot of video.

Me relaxing on vacation. Photo by Josh Locker.

I loved shooting ProRes Log in Taiwan with the iPhone 15 Pro Max. I'd occasionally reach for Blackmagic Camera, but I often just used the default camera app.
I stuck my phone (with its crumbling case) out of taxi sunroofs and skyscraper windows, held it above teeming crowds, and shoved it between chain-link fences. Seeing the broad dynamic range I was capturing in scenarios from noontime sun to neon-lit nights got me excited about grading the footage later. It's exactly the way I feel about shooting raw stills with my Sony, knowing that I'll be able to go crazy on them in Lightroom. The photographing act is just half of the process.

Step through the frames below to see how color transforms a single shot from the video above:

Uncorrected Apple Log: straight out of the camera. I mean, phone.

Look & LUT: an overall correction applied under the PL-HERO LUT.

Local Corrections: various windowed corrections under the Look help guide the viewer's focus and give the natural-light capture a cinematic feel.

LUTs, Looks, and Magic Bullets

There's been a bit of a gold rush of people hawking creative LUTs that apply a particular "look" to iPhone Log footage. My day job is, in part, helping make color tools like Magic Bullet Looks, which can do so much more than any LUT. Creative LUTs are great, and by all means support the folks making them — but that's not what my iPhone LUTs were or are. The Prolost iPhone LUTs convert Apple Log to various other color spaces, and support three kinds of workflow:

Grade Under a Display Transform LUT

Apple Log is a totally decent color space to work in, so color correcting Apple Log can be as simple as applying Magic Bullet Colorista and choosing one of my Monitor & Grade LUTs. That's what you see me doing in the video above. Colorista (set to Log mode) does its work on the native Apple Log pixels, and the LUT converts the result to look nice on video. Many other systems work like this, including LumaFusion, which ships with Prolost Apple Log LUTs. The key is color correcting under the LUT.

Bring Apple Log into an Existing Workflow

Color work is often done in some intermediate color space.
This is usually some kind of wide-gamut log format, such as DaVinci Wide Gamut/Intermediate, or one of the ACES log spaces. The Prolost ACES LUTs simply convert Apple Log to either ACEScc or ACEScct log, allowing you to grade your iPhone footage alongside any other professional camera, and output them all through the same pipeline.

Shooting Through a LUT

The Blackmagic Camera app allows you to load any LUTs you want and view through them without baking them into your footage. With my LUTs, you can shoot with the same LUTs you grade under later, for a truly professional (no joke!) workflow.

The real stars of this update, though, are the FC LUTs. They add an informative False Color overlay to the Shoot/Grade LUTs, making sure you always nail your exposure. Watch the video to see them in action. I already can't imagine shooting without them. These LUTs work well in Blackmagic Camera or even on an external HDMI monitor.

Adjusting exposure with a variable ND filter until the 18% gray card lights up yellow, for perfect exposure. PL-HERO-FC LUT in Blackmagic Camera.

Gathering Resolve

I've never edited a whole actual thing in Resolve before. As if this video wasn't enough work already (I shot the A-roll in mid-December), I decided to use it as a test case for creative editorial in DaVinci Resolve. It's the ACES LUTs that allowed me to incorporate Magic Bullet Looks into my Resolve color workflow.

Maxon just shipped a really nice update to Magic Bullet Looks, with simplified color management made possible by more and more of the apps we support doing darn fine color management at the timeline level. So in Resolve, I can use my LUT to convert Apple Log to ACEScc, and then apply Magic Bullet Looks, which can now be set to work in ACEScc with a single click.

The new streamlined color options in Magic Bullet Looks. Choose Custom to get the full manual control.

I can sneak additional Resolve corrector nodes in between those two for local corrections.
Resolve is great at this, and Looks is great at creative look development, so this is a match made in heaven.

A little face lift.

Then, at the end, I use an ACES Transform node to convert to Rec. 709 video.

Get to the chopper.

An expert Resolve user could replace my LUTs with Resolve's built-in Color Space Transform nodes, but the LUTs make this process easier and more reliable.

Gear Inspires

Every photographer knows the feeling of lusting after new gear. We know it so well that we remind ourselves constantly that "next camera syndrome" is debilitating, and that "most cameras are better than most photographers." Gear is not the answer. Go shoot.

There is, however, a counterpoint to these truths: As shooters, we take inspiration where we can get it. And sometimes a new technique, a new locale, or even, yes, a new bit of gear is what provides it. The key is to listen for that inspiration, and don't judge it. Even if it's coming from your phone.

Jiufen village, Taiwan

Get the Free LUTs
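Mechanically, a conversion LUT like the ones discussed above is just a 3D lookup table, commonly stored as a plain-text .cube file. Here's a minimal sketch of loading and sampling one — my own simplified illustration, using nearest-neighbor sampling where real color pipelines use trilinear or tetrahedral interpolation:

```python
def parse_cube(text: str):
    """Parse a minimal 3D .cube LUT (the common plain-text format).

    Returns (size, table): table is a flat list of (r, g, b) output
    triples, with the red index varying fastest.
    """
    size, table = 0, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.upper().startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif line[0].isdigit() or line[0] in "+-.":
            r, g, b = (float(x) for x in line.split()[:3])
            table.append((r, g, b))
    return size, table

def apply_lut(rgb, size, table):
    """Nearest-neighbor 3D LUT lookup for one (r, g, b) pixel in 0-1.

    Real graders interpolate between lattice points; this is the
    simplest sampling that demonstrates the data structure.
    """
    r, g, b = (min(size - 1, round(c * (size - 1))) for c in rgb)
    return table[r + size * g + size * size * b]

# A 2x2x2 identity LUT: each lattice point maps to its own coordinates.
identity = "LUT_3D_SIZE 2\n" + "\n".join(
    f"{r}.0 {g}.0 {b}.0"
    for b in range(2) for g in range(2) for r in range(2)
)
size, table = parse_cube(identity)
assert apply_lut((1.0, 0.0, 1.0), size, table) == (1.0, 0.0, 1.0)
```

A real Apple Log-to-ACEScc conversion LUT works the same way, just with a denser lattice (typically 33 or 65 points per axis) whose values encode the color-space math.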
Still frame from Hello! by Goro Fujita, created in VR using Quill.

Today's the day to pre-order Apple Vision Pro, Apple's first "spatial computing" device. It's an expensive VR headset that represents either an opportunity to beta-test the future, or a doubling down on the past failings of VR promises.

I'm a VR skeptic in many ways. I don't want anything to do with "the metaverse," and I'm beyond annoyed by fake demos of people pretending to do creative work by waving their hands around like a wannabe Tony Stark. But I've owned several Oculus/Meta headsets, and I've had some experiences in VR that keep me coming back.

Will Apple Vision Pro become an indispensable productivity tool? I have no idea. But I do have a short list of experiences that I'd like to have with it.

Tranquil Games that Reward Acrophilia

My favorite game on Meta Quest is Daedalus from Vertical Robot. In it, you loft yourself through massive brutalist architectural spaces, with lightweight puzzles standing between you and the end of each level. There's just enough problem-solving to keep you occupied, and a wonderful sense of flying through beautiful spaces. I've played through it twice and look forward to playing it again.

Daedalus

Land's End was an early VR game from UsTwo Games, famously the creators of Monument Valley, maybe my all-time favorite iOS/iPadOS game. Land's End was designed to require no controllers, offering a unique gaze-based interaction model that would be a good fit for the controller-free Vision Pro.

These games are not photo-realistic, or action-packed. They don't have any people in them, or any elaborate effects. In fact, both were designed for early VR systems with limited graphics, controllers, and range of motion. But they create a profound sense of place with their elegant graphics and sound. And they both play with the fun sensation of vertigo — maybe VR's best party trick. I think of them as places I enjoyed visiting, and to which I'd like to return.
Just Toot Around Blade Runner a Bit

I've never "played" Aircar, but just look at it. There was a Blade Runner 2049 promotional app for the Oculus Go that was never ported to the Quest line, and I still miss it in all its clunky glory. Who doesn't want to fly around aimlessly through a stormy cyberpunk cityscape?

Aircar by Giant Form Entertainment

Be In Star Wars

Lucasfilm's Industrial Light & Magic has done some great work in VR out of their ILM Immersive group (formerly ILMxLAB). Vader Immortal is a three-part series that promises a virtual lightsaber duel with the Sith himself, but delivers far more than that, especially in its quieter moments. VR is very good at imparting scale, and Star Wars is full of big, cool stuff. Seeing a full-sized Star Destroyer fly over your head brings the scope of the Star Wars universe to life in a simple but powerful way.

As a lifelong Star Wars fan, of all the rides and experiences at Disneyland Park in California, my favorite thing to do is just be in "Galaxy's Edge" — the official name for what I always just call "Star Wars Land." There's a full-sized Millennium Falcon there, and just being near it is better than any ride. I'll grab a snack and sit where I can see it, and just be in Star Wars for a bit. It's lovely.

In ILM Immersive's follow-up Star Wars title Tales from the Galaxy's Edge, there's a moment where you're in that exact fictional spaceport, but seeing it from a vantage not afforded at Disneyland. It's a neat bit of world-building, and also a nice semi-direct comparison of experiences. I can only go to Disneyland so often — but with a VR headset, I can just be in Star Wars, standing near the Millennium Falcon, any time I want.

That's me eating my Ronto Wrap in the background.

But honestly, one of my favorite moments is a simple and stupid one that comes early in the game. You're walking down a hallway of a rusty space freighter, looking out the window at the massive planet below.
A sound brings your gaze down to see that a "Gonk Droid" has brushed past you. I went from marveling at the scope of the world to giggling with delight at seeing this dear old friend from my childhood, who has about 20 seconds of screen time in A New Hope, and probably an original series coming soon on Disney Plus.

For all the effort ILM Immersive has put into Vader Immortal and Tales from the Galaxy's Edge, I would actually just love a "Star Wars Studio Tour" app. Let me walk through virtual editions of the sets from the original trilogy — and then wave my hand and have them seamlessly become the fully-fleshed-out fictional settings.

Be in Real (High) Places

BRINK Traveler. It's great.

Save Money on Contractors with SketchUp

I went too far.

We recently remodeled a bathroom in my house, and of course I built the whole thing in SketchUp, then imported it into Cinema 4D and rendered it in Redshift. What I kept wanting to do, however, was walk through the space in VR. Pretty much the day we finished construction, SketchUp released a VR viewer app. It's not even that great, but it's great. In fact, I used it to create an interactive maze "game" for my family. Nothing moves or happens in it, but it's a cool little experience that was easy to create.

Movies, Sure, But Not Many 3D Movies

3D movies will be amazing on Vision Pro, they say. Finally we can watch 3D movies without any of the technical limitations of uncomfortable glasses or dim projection. Except we have to watch them alone, with heavy goggles. 3D doesn't make movies more immersive. It's usually distracting and dumb, and most filmmakers do it wrong. Eliminating the technical presentation issues won't change that. But I'll definitely watch some regular old movies on this thing. $4,000 is cheap for a home theater — even one you have to use alone.

Appreciate (and Maybe Make?) Immersive Art

I'm not interested in VR as a filmmaker.
I love movies the way they are, and every experience I've had with narrative VR video has reinforced that opinion. But when 360º storytelling is taken seriously as its own art form, more akin to interactive theater than to movies or games, it can be quite compelling.

An artist I admire in this arena is Goro Fujita. He's done some beautiful pieces using the Quill app, which was briefly owned by Meta. In the early days of the Meta Quest, Quill Theater was a neat way to see fresh, fun creative expression in your headset on a regular basis. As with the games I mentioned above, many of the Quill pieces I've experienced stick with me to a surprising degree. My memory treats them like places I've visited, and occasionally calls me to return.

Rusty Ships by Goro Fujita

Now that Quill is once again in the hands of its original developer, I hope it, or something like it, can thrive on Apple Vision Pro. I'd like to create with it, but I'm also excited to see what others make. If there's a fresh, compelling immersive experience created by an artist, not a mega-brand, awaiting me every time I put on the Vision Pro, I'll reach for it every day.

Curious to learn more? Fujita gave a long and excellent talk (seemingly conducted in VR) on interactive storytelling: https://youtu.be/7YVUqcIJU6w

The Biggest Smallest Product Release in Apple History

The Vision Pro is too expensive to be popular, and too important not to command our attention. Major companies are opting out — for now. Indie developers are jumping in. It could wind up feeling like the early days of the iPad, where we all figure out how we want to use this new thing together. I'm excited to have some memorable experiences in Apple Vision Pro. But the creative tools have to be there to make these experiences personal.