Apple Shot Their “Scary Fast” October Event Video on iPhones And We Had Feelings
You’re somewhere on the spectrum from someone who occasionally shoots video on their iPhone to a professional-ish video maker with some gear, and at the end of Apple’s October “Scary Fast” event announcing new Macs with M3 silicon, you see that the entire event was “Shot on iPhone.”
This makes you feel a certain way.
Then Apple posts a behind-the-scenes video showing how this was done, offering a rare glimpse into the imposing scale and scope of their industry-leading launch videos. At the center of it all, instead of their customary Arri Alexa (a digital cinema camera costing $35–150K before you even add a lens, used to shoot everything from Avengers: Endgame to Barbie), was an off-the-shelf iPhone 15 Pro Max, gripped into truckloads of professional support gear.
At this point, some folks felt differently about what was implied by “Shot on iPhone.” There have been bad takes on this, and good takes on those bad takes.
Anyone who knows the tiniest bit about video production knows that the camera is a small, but important, but small, part of the overall production picture. “Shot on iPhone” doesn’t promise “and you can do it too” any more than Stanley Kubrick lighting Barry Lyndon with candlelight means anyone with candles can make Barry Lyndon.
But when the camera is the least expensive piece of gear on the set after clothespins and coffee, it does feel strange. I’ve been on a lot of productions like this, having played an active role in the DV filmmaking revolution of the late ’90s-to-early-2000s. It was an odd feeling to scrounge for the adapter plates required to mount a $3,000 DV camcorder purchased at Circuit City to a Fisher dolly that literally has no purchase price (they’re rental-only, never sold).
Apple, of course, has no burden of best-practices logic for their decision to shoot their “Scary Fast” event on iPhone — it’s a marketing ploy, a cool stunt, and a massive flex. A thing to do for its own sake. In the filmmaking community, it was the mic drop of the year. We greedily soaked up all the details in the behind-the-scenes video, and made a hundred tiny calculations about which aspects of this lavish production actually mattered to the question of the iPhone 15’s validity as a professional camera, and which did not.
With all that gear and production support, which aspects of the event really matter to you, the iPhone-curious filmmaker? What can you learn, and which aspects can you safely ignore?
Let’s take it one at a time:
That They Did It: Does Matter
As camera features have played a larger and larger role in Apple’s marketing for new iPhones over the years, you might have begun to feel a bit of cognitive dissonance. Apple tells you about how great, and even “pro,” these new iPhone cameras are — but would never have dreamt of using them to shoot their own videos or product stills. Apple was effectively saying “pro enough for you, but not for us.” Valid, but a bit dissatisfying.
Apple has set the aesthetic bar impossibly high with these pre-recorded events. They’re not just executives teleprompting in front of Keynote slides — they feature “slice of life” moments shot on huge sets and real locations. Elaborate visual effects transition between locations and settings that might be partially virtual. These videos have looked great ever since Covid pushed Apple to find an alternative to executives-on-stage-in-front-of-slides, and even as Apple is now once again able to welcome guests to in-person product launches, these lavishly-produced videos are the new gold standard in pitching the world on a new iThing.
With “Scary Fast,” Apple repeated their now well-established high-production-value playbook, but yoinked out the professional cameras and lenses, and dropped in a commodity consumer telephone in their place.
And crucially, none of us noticed.
It’s a big deal.
They Shot ProRes Log: Matters So Much It Would Be Impossible Without It
There is one single feature of the iPhone 15 Pro that made this stunt possible: Log. As I detailed here in words and video, the “iPhone video look” is designed to win consumer popularity contests, not mimic Apple’s own marketing videos, nor plug into professional workflows.
It may be hard to imagine that a slightly different bit of signal processing when recording a video file from a tiny sensor can make the difference between consumer birthday-cam and professional viability, but that is exactly the power of log. Apple Log has catapulted the iPhone into filmmaking legitimacy.
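To build intuition for why that matters, here’s a toy sketch in Python. This is emphatically not Apple’s actual Log transfer function, just a generic log-style curve, but it shows the core idea: a log encode spends the file’s limited code range where the image needs it, preserving highlight detail for the grade.

```python
import numpy as np

# Toy curve only, NOT Apple Log: the point is that a log-style encode
# compresses bright values, so highlight detail survives into the file
# instead of being crushed by a display-ready curve at capture time.

def toy_log_encode(linear, a=8.0):
    """Map linear scene light in [0, 1] to [0, 1] with a log-ish curve."""
    return np.log1p(a * np.asarray(linear)) / np.log1p(a)

def toy_log_decode(encoded, a=8.0):
    """Invert the curve in post, where the grade happens."""
    return np.expm1(np.asarray(encoded) * np.log1p(a)) / a

values = np.array([0.05, 0.5, 1.0])
print(toy_log_encode(values))                  # shadows lifted, highlights get headroom
print(toy_log_decode(toy_log_encode(values)))  # round-trips back to the original
```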
They Used Big Lights: Does Matter, With An Asterisk
Apple’s event was set at night, with a dark, Halloween-inspired look. It takes a lot of professional lighting gear to illuminate a wide shot of Apple’s campus, and professional skill to balance this lighting with the practical sources on the building itself.
Lighting matters more than any camera, more than any lens. As I wrote in 2009:
Photos are nothing but light — it’s literally all they are made of. Timmy’s birthday and Sally’s wedding are reduced to nothing but photons before they become photographs. So getting the light right is more meaningful to a photo than anything else.
Should you look at the giant lights in Apple’s video and feel dejected that your own productions will never afford this level of illumination? I say no, because a) you’re probably not lighting up the whole side of an architectural marvel, and b) you’re probably not designing your production around one of the world’s highest-paid CEOs.
For Tim Cook’s appearance, Apple’s production had their giant LED light panels on camera dollies, which is not typical. The two reasons I can imagine they did this are to be low-impact on the campus itself (rubber wheels instead of metal stands), and to be able to adjust the lighting quickly out of respect for Cook’s valuable time. It makes the lighting rigs seem more complex than they really are.
What they really are is big, bright, and soft. And rather minimalistic — mostly key, a bit of fill.
Big, soft LED lighting is actually quite affordable these days. I have two medium-power bi-color lights from Aputure, and together they cost less than my iPhone 15 Pro Max. I couldn’t cover Cook’s opening wide shot with them, but I could get close.
I might also be willing to compromise on my ISO settings to work with a smaller lighting package, where Apple seemingly was not. More on this below.
So the lighting is important, but the quantity of it and the support gear it’s on are specific to this rarefied type of time-is-money, night-exterior production. Don’t be distracted by the extra equipment; focus on the fact that the lighting itself is actually rather spare.
They Attached the iPhone to Cranes and Gimbals and Drones and Dollies: Does Not Matter, Except for One Little Thing
The behind-the-scenes video is almost comical in its portrayal of the iPhone gripped into all manner of high-end support gear.
You do not need any of this stuff.
I mean, every filmmaker needs a crane shot — but this is why small cameras are so empowering: everything is a crane when your camera weighs less than a Panavision lens cap!
Check out this video from filmmaker Brandon Li. He uses a gimbal on a hand-held pole to create a perfect crane shot for the opening of his action short. Toward the end, he achieves a nifty top-down shot by... standing on a railing. All with a camera substantially more cumbersome than a phone.
Apple used cranes and remote heads designed for big cameras because that’s how they know how to shoot these videos. Apple’s marketing department is large, and knows exactly what they need on these productions. One thing they need is for a dozen people to watch the camera feed, making sure everything is committee-approved perfect.
This kind of client-driven support gear compounds on its own requirements. As Tyler Stalman points out in his excellent breakdown video, some of what’s bolted to the iPhones is simply a metal counterweight so that a gimbal head, designed for a much larger camera, can be properly balanced.
You can plug an external drive into the USB-C port on the iPhone 15 Pro Max, or you can plug in an HDMI adapter for a clean client feed. If you want to do both, you need a USB-C hub, which at that point requires power. So now you’ve got an Anton-Bauer battery pack mounted to this tangle of gear.
When you don’t have clients, you can skip all that and just shoot. This means you can replace most of the gear you see here with a cheap consumer gimbal — or a tripod.
And here’s the key takeaway for this point: Apple achieved optimal image quality from the iPhone in a number of ways, and one, I’m betting, was by turning off image stabilization — which is only advisable when this tiny camera is physically stabilized.
So you don’t need all the stuff Apple used, but if you want comparable results, you need a way to mount your iPhone to something solid. Maybe not a whole powered cage, but certainly a simple tripod mount. Then you can eke out that last bit of extra image quality by turning off image stabilization — which brings us to our next point:
They Used the Blackmagic Camera App: Matters as Much as Log
The Blackmagic Camera app has the option to turn off image stabilization, yes, and also like a million other features. Manual, locking exposure control is at the top of the list, but there’s a ton more. The app includes a false-color mode to help match exposure from shot to shot. It can load a preview LUT, so you can shoot log but view something closer to what the audience will see.
It’s silly to be grumpy with Apple for not offering this power in their own camera app when they clearly worked with Blackmagic Design to have this app available day-and-date with the iPhone 15.
Oh, and it’s free.
They Used a 180° Shutter: Matters More Than You Think
One slick feature of the Blackmagic Camera app is that you can choose to express the shutter speed in degrees, like a cinema camera, rather than in fractions of a second, as is typical in stills. A 180° shutter — where the shutter is open for half the duration of a single frame, e.g. 1/60th of a second at 30 fps — is important for a pro look. Anything slower and you get smeary blur and a camcorder look. Anything faster and your footage looks like you shot it on a phone, because 99% of the time our iPhones are using insanely fast shutter speeds to handle typical daylight. Look at any of your own daytime iPhone video — I’d be surprised if you see any motion blur at all.
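If you want to sanity-check that math, the conversion is one line. A quick sketch (the 1/60th-at-30fps example above falls right out of it):

```python
def shutter_seconds(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time for a given frame rate and shutter angle.

    A 360-degree shutter stays open for the entire frame;
    180 degrees is open for exactly half of it.
    """
    return (shutter_angle / 360.0) / fps

print(shutter_seconds(30))       # 0.01666... -> 1/60 s, the example above
print(shutter_seconds(24))       # 0.02083... -> 1/48 s
print(shutter_seconds(30, 360))  # 1/30 s: smeary camcorder territory
print(shutter_seconds(30, 45))   # 1/240 s: choppy "phone video" territory
```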
Relatedly:
They Shot at ISO 55: Matters to Apple’s Goal of Maximum Image Quality
Here’s where the level of professional control over the lighting starts to really matter: If Apple decided that they must shoot at ISO 55 (the lowest, although possibly not the native ISO of the 1x camera) for the highest image quality, and with a 180º shutter for the most pro-camera look, that means they have no other control over exposure. The iPhone 15 Pro 1x lens does not have a variable aperture, so shutter speed and ISO are your only exposure controls.
When shooting in uncontrolled environments, the typical method of limiting the amount of light entering the lens is via ND filters, sometimes variable ND filters. I don’t see any evidence that Apple used filters on this shoot, which would fit with their overall prioritization of image quality over all else. So this goes back to lighting — Apple’s team controlled that lighting perfectly, because they opted out of any exposure control they might have had in-camera.
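If you do want a 180° shutter at a locked ISO in daylight, the ND strength you need is a simple stops calculation. A rough sketch, using illustrative daylight numbers rather than measurements:

```python
import math

def nd_stops_needed(metered_shutter_s: float, target_shutter_s: float) -> float:
    """Stops of light an ND filter must cut so the shutter can slow from
    the metered time to the target time, with ISO and aperture locked."""
    return math.log2(target_shutter_s / metered_shutter_s)

# Illustrative numbers, not measurements: say bright daylight meters at
# 1/2000 s, but a 180-degree shutter at 30 fps wants 1/60 s. A locked ISO
# and a fixed aperture mean the ND has to absorb the entire difference.
stops = nd_stops_needed(1 / 2000, 1 / 60)
print(f"~{stops:.1f} stops")  # ~5.1 -> roughly an ND32 (5-stop) filter
```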
I’m curious to learn more about this setting though. YouTubers Gerald Undone and Patrick Tomasso did some tests and found that the best dynamic range from the iPhone 15 Pro came from ISO 1100–1450, with 1250 being their recommended sweet spot. Did Apple prioritize low noise over dynamic range?
They Shot 30p: Doesn’t Matter
Apple has used 30 frames-per-second for these pre-recorded keynotes since they started in September of 2020. They’re not trying to be “cinematic,” they’re trying to make a nice, clean video that can take the place of a live event. 30p is a choice, and a fine one for an on-stage presentation. You might choose 24 or 25 fps for a more narrative look, and that’s great too.
Note that Apple’s native Camera app offers 30.0 and 24.0 fps, but the Blackmagic Camera app adds support for 29.97 and 23.976 fps, which are the actual broadcast frame rates Apple uses for their productions.
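For the curious, those broadcast rates are NTSC-legacy rational frame rates, not rounded decimals:

```python
from fractions import Fraction

# The "broadcast" rates are exact rationals inherited from NTSC timing:
print(float(Fraction(30000, 1001)))  # 29.97002997... ("29.97 fps")
print(float(Fraction(24000, 1001)))  # 23.97602397... ("23.976 fps")
```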
They Focused Manually: Doesn’t Matter
The Blackmagic Camera app truly has a dizzying set of features, some seemingly part of an attempt to win some kind of bizarre bet. Like support for wireless external follow-focus controls? I mean, wow, but also, really?
Sure makes for a cool behind-the-scenes shot, but I bet you can live without this.
They Used a Matte Box: Does Matter
While Apple did not attach any additional lenses to their production iPhones, they did put stuff in front of the built-in lenses — notably teleprompters, of course, and comically-large matte boxes.
Matte boxes might feel like affectations in this context, but shielding that tiny lens from glare is actually a significant way to improve overall image quality. Luckily, you don’t really need a full-on matte box to do this. A French flag will do it, as will your hand.
They Exclusively Used the 1x Camera: Does Matter — to Apple
The 1x camera, with image stabilization turned off, gives the highest-quality image available from the iPhone 15 Pro Max. Are you detecting a theme here? Apple imposed a number of limitations on how they used the iPhone camera, seemingly always in the name of maximizing image quality.
As we’ll discuss below, you may or may not share this priority.
They Edited in Premiere Pro? Doesn’t Matter
Eagle-eyed viewers noticed that an Adobe Premiere Pro timeline appears behind the editor of “Scary Fast,” not Apple’s own Final Cut Pro. But we’re not in a real edit suite here — we’re actually on one of the sets from the production. Did Apple really edit in Premiere?
I have a feeling that this and Stefan Sonnenfeld’s interview were staged on Apple’s standing sets rather than in real studio environments to keep the production close to home — for cost, control, and secrecy reasons. So let’s assume Apple really did cut in Premiere. This means next to nothing. All editing software does the same job, and it’s unlikely Apple would impose a workflow on the production company they hired to both shoot and post-produce the video.
Other than, of course, to ensure that it be cut on a Mac. It’s interesting to note that Apple’s Pro Workflows Group, representatives from which are interviewed in the behind-the-scenes video, is part of the hardware division at Apple. Its charter is to promote and support professional use of Apple devices, regardless of which software they’re running.
Should Final Cut Pro users be nervous that Apple might send it to live on a farm with Shake and Aperture? It’s hard to regain our trust here, but Apple did just release a very nice version for iPad a few months ago, and substantial updates to that and the Mac version just yesterday.
So there’s really nothing to see here. Move along.
They Colored in Resolve: Doesn’t Matter
Apple hired Company 3 to produce this video. Company 3 is best-known as a color house. In my visual effects and commercial directing career, I’ve worked with several amazing colorists there, from Dave Hussey to Siggy Ferstl to their CEO, Stefan Sonnenfeld, who is prominently featured in the behind-the-scenes. Stefan is one of the most prolific, talented, and well-known colorists working today. I modeled half the presets in Magic Bullet Looks after his famous grades on films like Transformers, 300, John Wick, Man on Fire, and hundreds more.
If you can get Stefan to color your video, that’s what matters — not the tool he uses. Resolve is “free” (with many asterisks), but a session at Company 3 is four figures per hour.
Whether you use Resolve, Magic Bullet, or whatever else, what matters here is that shooting log means color grading is not just possible, but essential, and great care was taken with this part of the process.
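You don’t need Resolve to see what a grade does, either. Here’s a toy lift/gain/gamma sketch (arbitrary numbers, nothing like a Company 3 session) just to show that shaping log footage into a viewable image is ordinary math:

```python
import numpy as np

def toy_grade(frame, lift=0.02, gain=1.15, gamma=1.3):
    """Bare-bones lift/gain/gamma: a toy stand-in for a real grade.

    frame: float RGB array in [0, 1], e.g. decoded log footage that
    looks flat and washed-out until a curve like this shapes it.
    """
    shaped = np.clip(gain * (np.asarray(frame) + lift), 0.0, 1.0)
    return shaped ** (1.0 / gamma)  # an exponent below 1 lifts midtones

flat_frame = np.random.rand(4, 4, 3)  # stand-in for a real log frame
print(toy_grade(flat_frame).shape)    # (4, 4, 3): same image, new tone curve
```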
This All Makes Sense. Why Do I Still Feel Weird About It?
As much as I might disagree with an accusation that Apple was disingenuous to say “Shot on iPhone” about a massive production with seemingly unlimited resources, I understand where this feeling comes from. “Shot on iPhone” carries with it the implication of accessibility. We are meant to be inspired by this phrase to use a tool that we already carry with us every day to capture videos that might transcend mere records of our life’s moments, and become part of our artistic pursuits.
And we should absolutely feel that way about the iPhone 15 Pro and Pro Max.
The reason we feel slightly disconnected from Apple’s impressive exercise is not that they were dishonest — it’s that their priorities were different from ours. Apple wants to sell iPhones, and to accomplish this, they spared no expense to put the very highest image quality on the screen.
As a filmmaker, you care about image quality, but you care about other things too — probably more.
Sticking to the 1x lens gives the best image quality, sure, but choosing the right lens for the story you’re telling might matter more to a filmmaker. You’ll probably gleefully use all the focal lengths Apple supplied.
As much as you might value the clean image that comes from shooting at ISO 55, you might get more production value for your budget by using smaller lights (or available light!) and accepting some noise at higher ISOs.
You might truly appreciate the value of a 180° shutter, and simply not always have a camera case/cage that allows you to mount a variable ND filter to your telephone. I ordered one from PolarPro the day the iPhone 15 was released and it still hasn’t shipped, so I’ve shot next to no 180° shutter footage so far.
You might well understand that turning off image stabilization will improve your image — unless your shot is now wobbly, because you’re shooting handheld out the window of a moving car. So maybe you’ll leave stabilization on, and be a human crane, or gimbal, or dolly, or all of the above.
Let’s be honest: If image quality is your top priority, there are much better options for your next production than a consumer telephone. You’d probably choose the iPhone more for its accessibility, nimbleness, ubiquity, and low cost. Those are great reasons, and when you pair them with image quality that can be mistaken for high-end cinema camera footage by a veteran colorist, you’ve got something magic.
Give It To Me In Bullets You Long-winded Monster
So we may well ignore much of Apple’s implied advice, but we would do well to follow some of it if we can:
- Use camera support. Not crazy camera support, but some.
- Use lights. Not crazy lights, but some.
- Use a camera app that allows manual control, like Blackmagic Camera
- Use 180° shutter, if you can (ND filters will help)
- Keep light off the lens using an $8,000 matte box or a bit of black tape
- Hire literally the world’s most famous colorist. Or just do some color correction.
- And most importantly, shoot in log, with a good preview LUT
Shot on iPhone Means What It Means to You
All of this is academic if you don’t go put it into practice. If you got this far and feel empowered to wring the most out of your iPhone 15 Pro with just the right amount of gear, that’s great. If you actively forget all of this and occasionally flip on the Log switch so you can play with the color of your iPhone videos in post, that’s great too.
Because here’s the thing: movies have already been shot on phones. No production’s decisions validate any camera for all other production needs. You decide what “Shot on iPhone” means to you, if anything. And the way you decide is by getting out there and shooting something.