
Tuesday, Sep 4, 2007

The Film Industry is Broken

The film industry has a tremendous need right now for an open standard for communicating color grading information—a Universal Color Metadata format.

There are those who are attempting to standardize a "CDL" (Color Decision List) format, but it would communicate only one primary color correction. There are those trying to standardize 3D LUT formats, but LUTs cannot communicate masked corrections that are the stock in trade of colorists everywhere. There are those tackling color management, but that's a different problem entirely.

Look at the core color grading features of Autodesk Lustre, Assimilate Scratch, Apple Color, and just about any other color grading system. You'll see that they are nearly identical:

• Primary color correction using lift, gamma, gain, and saturation
• RGB curves
• Hue/Sat/Lum curves
• Some number of layered "secondary" corrections that can be masked using simple shapes, splines, and/or an HSL key

Every movie, TV show, and commercial you've ever seen has been corrected with those simple controls (often even fewer, since the popular Da Vinci systems do not offer spline masks). It's safe to say that the industry has decided that this configuration of color control is ideal for the task at hand. While each manufacturer's system has its nuances, unique features, and UI, they all agree on a basic toolset.
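As a sketch of how little math that shared toolset actually involves, here is a minimal primary correction in Python. The formulas and names are my own simplifications for illustration; every vendor implements these operations (and their order) differently, which is exactly part of the standardization problem.

```python
def apply_primary(rgb, lift=0.0, gamma=1.0, gain=1.0, saturation=1.0):
    """Apply a simple lift/gamma/gain/saturation primary to a normalized RGB triple.

    A toy model only: real grading systems differ in their exact math
    and order of operations.
    """
    graded = []
    for c in rgb:
        c = c * gain + lift                # gain scales the signal, lift raises the blacks
        c = max(c, 0.0) ** (1.0 / gamma)   # gamma bends the midtones
        graded.append(c)
    # Saturation: push each channel toward or away from a Rec. 709-style luma
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return tuple(luma + saturation * (c - luma) for c in graded)
```

With defaults the function is an identity, and saturation=0 collapses any color to gray, which is the behavior you'd expect from every system listed above.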

And yet there is no standardized way of communicating color grades between these systems.

This sucks, and we need someone to step in and make it not suck. Autodesk, Apple, Assimilate, Iridas: this means you. One of you needs to step up and publish an open standard for communicating and executing the type of color correction that is now standard on all motion media. This standard needs to come from an industry leader, someone with some weight in the industry and a healthy install base. And the others need to support this standard fully.

Currently the film industry is working in a quite stupid way when it comes to the color grading process, especially with regards to visual effects. An effects artist creates a shot, possibly with some rough idea of what color correction the director has in mind for a scene, but often with none. Then the shot is filmed out and approved. Only once it is approved is it then sent to the DI facility, where a colorist proceeds to grade it, possibly making it look entirely unlike anything the effects artist ever imagined.

Certainly it is the effects artist's job to deliver a robust shot that can withstand a variety of color corrections, but working blind like this doesn't benefit anyone. The artist may labor to create a subtle effect in the shadows only to have the final shot get crushed into such a contrasty look that the entire shadow range is now black.

But imagine if early DI work on the sequence had begun sometime during the effects work on this shot. As the DI progresses, a tiny little file containing the color grade for this shot could be published by the DI house. The effects artist would update to the latest grade and instantly see how the shot might look. As work-in-progress versions of the shot are sent from the effects house to the production for review, they would be reviewed under this current color correction. As the colorist responded to the new shots, the updated grade information would be re-published and immediately available to all parties.
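The artist's side of that update loop could be almost trivially small. A sketch, assuming a hypothetical published grade file (the filename, format, and function here are all my invention, not any existing tool's API):

```python
import json
import os


def load_grade_if_updated(path, last_mtime):
    """Reload the colorist's published grade file only if it has been re-published.

    Returns (grade_dict, new_mtime) when the file is newer than last_mtime,
    or (None, last_mtime) when nothing has changed.
    """
    mtime = os.path.getmtime(path)
    if mtime > last_mtime:
        with open(path) as f:
            return json.load(f), mtime
    return None, last_mtime
```

The effects application would poll (or watch) the published file and re-render its preview under the current grade whenever a new version appears.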

Result? The effects artist is no longer working blind. The director and studio get to approve shots as they will actually look in the movie rather than in a vacuum. Everyone gets their work done faster and the results look better. All of this informed by a direct line of communication between the person who originally created the images (the cinematographer) and the person who masters them (the colorist).

Oh man, it would be so great.

I've worked on movies where the DI so radically altered the look of our effects work that I wound up flying to the DI house days before our deadline to scribble down notes about which aspects of which shots should be tweaked to survive the aggressive new look. I've worked on movies that have been graded three times—once as dailies were transferred for the edit, once in HD for a temp screening, and again for the final DI. Please trust me when I say that the current situation is broken. We need an industry leader to step in and save us from our own stupidity.

And this industry leader should do so with their kimono open wide. Opening up a standard will involve giving away some of your secret sauce. Maybe there's something about your system that you think is special, or proprietary. Some order of operations that you feel gives you an advantage. Well, you could "advantage" yourself right into obscurity if your competition beats you to the punch and creates an open standard that everyone else adopts. The company that creates the standard that gets adopted will have a huge commercial advantage. You can learn about the business advantages of "radical transparency" from much more qualified people than myself.

Of course, there will be challenges. Although each grading system has nearly identical features, they probably all implement them differently. It's not obvious how much information should be bundled with a grading metadata file. Should an input LUT be included? A preview LUT? Should transformations be included? Animated parameters? It will take some effort to figure all that out.
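To make those open questions concrete, here is what such a grade-metadata bundle might look like. Every field name in this sketch is my own invention for illustration; no such standard exists, which is the point of the post:

```python
import json

# A hypothetical grade-metadata bundle. All field names are invented
# for illustration; this is not any published or proposed standard.
grade = {
    "shot": "vfx_010",
    "primary": {"lift": 0.02, "gamma": 1.1, "gain": 0.95, "saturation": 1.05},
    "secondaries": [
        {
            "mask": {"type": "shape", "points": [[0.1, 0.2], [0.4, 0.2], [0.3, 0.6]]},
            "primary": {"gain": 1.2},
        },
    ],
    # The open questions from the text: bundle these, reference them, or omit them?
    "input_lut": None,     # optional input LUT, inline or by reference
    "preview_lut": None,   # optional display/preview LUT
    "animated": False,     # whether parameters are keyframed over time
}

serialized = json.dumps(grade, indent=2)
```

Even this toy example surfaces the hard decisions: LUTs balloon the file size if inlined and break portability if referenced, and animated parameters multiply every value by a timeline.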

But the company that does it will have built the better mousetrap, and they'd better be prepared for the film industry to beat a path to their door. So who's it going to be?

Until you step up, we will keep trudging along, making movies incorrectly and growing prematurely gray because of color.

Wednesday, Aug 15, 2007

VFX: Easier than you think, harder than you think


I love this breakdown clip from Ryan vs. Dorkman 2 (which, if you haven't seen it, is totally worth watching). Based on the so-simple-it's-brilliant notion of showing Star Wars Lightsabers doing things that "we personally think would be fun to see," these guys staged a Lightsaber battle in a factory between, well, two regular guys. The effects work is excellent, and one reason why is that they shot a lot of practical elements.

When you're just getting into effects, it's easy to get stuck thinking that you have to do everything with your computer. These guys wanted to create a realistic reflection, smoke, and sparks. So you know what they did? They shot something that would create a reflection. Then they filmed some smoke. Then they filmed some sparks.

Easy, right? Well, maybe not. To some people it's easier to sit in front of a computer for hours trying to get particles to look like smoke than it is to black out a space and heat up a metal rod with a blowtorch. But the latter is worth the extra effort, because the results will look better and ultimately take less time to create. Sometimes making something look photo-real is just as easy—and as difficult—as shooting something real.

Sunday, Aug 5, 2007

Taming the Toy

I mentioned in my previous post that the Canon HV20 has poor manual control. While this has been amply documented elsewhere, here's a brief summary of why, in the form of a description of the Sex Positive workflow.

We used Cinema Mode. Consult your HV20 user's manual for the customary misunderstanding about camera modes with the term "cine" in their name: "Give your recordings a cinematic look by using the CINE MODE." No, that's not what it does. What it does is remove the clippy, contrasty gamma that makes consumer video look good to Ma and Pa Birthday Cam, and replace it with a clean, low-contrast curve that extracts as much dynamic range as possible from the CCD. CINE MODE is important with this camera—don't leave home without it.

We always used a 1/48th shutter. As readers of The Guide know well, a 180 degree shutter is not just a good idea, it's the law. Violate it and you've got audiences who've never heard of "Viper" or "Genesis" walking out of Apocalypto saying "What was with the scenes shot on video?"
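The 180-degree rule is just arithmetic: exposure time is the shutter angle's fraction of a full rotation, divided by the frame rate. A quick sketch (the function is mine, not from any camera SDK):

```python
def shutter_speed(fps, shutter_angle=180.0):
    """Exposure time in seconds for a given frame rate and shutter angle.

    A 180-degree shutter exposes each frame for half its duration,
    which is where 24p's 1/48th comes from.
    """
    return shutter_angle / (360.0 * fps)
```

At 24 fps with a 180-degree shutter this gives 1/48 s; crank the angle to 360 degrees and you get the smeary 1/24 s "video" look the post warns about.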

We didn't have a ton of light. We had two 650W lights and one smaller one, and we blacked out windows to make a night interior out of a day shoot. The main light on the talent was a 650W diffused by a muslin. This meant that the HV20 needed to be "wide open" at 1/48th. Here begins the wrestling:

  • In CINE MODE, shooting 24p, aim the camera at something plenty bright, like the muslin.
  • Use the joystick to enable manual exposure.
  • Give the PHOTO button a half-press to check the f-stop and shutter speed. You need a MicroSD card [EDIT: Actually it's a MiniSD card, thanks Mike!] in the camera for this to work, even though you don't intend to actually snap a still.
  • Adjust the exposure up until you see the magical combo of F2.4, 1/48. Best to overshoot to F2.8, 1/40 and then toggle one notch back. The HV20 can open up wider than 2.4 (to 1.8), but not when it's zoomed in. So the amount of zoom necessary to frame up the adaptor's groundglass is a factor in reducing light sensitivity.

You have to re-do that dance every time your camera auto powers down, or every time you return from checking playback. It's not fun. I'd been getting myself used to it leading up to the shoot, and then had the alarmingly refreshing experience of dusting off the old DVX100a for the first day's shoot. On the DVX, if you want to change the shutter, you change the shutter. If you want to change the aperture, you change the aperture. And you can run with any gain you like at any of these settings. The HV20 will always open up more shutter before allowing the gain to increase, which makes sense for consumers but not for filmmakers. A day with the HV20 after a day with the DVX was a stark reminder of the filmmaker-friendly features we were giving up in order to go 1080p for less than a G.

We monitored in HD. Using a Noga arm, I mounted an Ikan V8000HD to various places on the camera depending on our configuration. Mounting this LCD upside-down allowed me to see the image right-side-up, and since the Ikan is HD, I could actually see if my subject was in focus, which is a constant fight at f1.4! The Ikan runs on Sony camcorder batteries, but we just powered it with AC, mostly to keep weight down.

We shot to tape. In a tight apartment with a critical mass of gear, we shot to tape. You can eke an "uncompressed" signal out of this camera's HDMI output, but the last thing I wanted to do was drag a computer around with this rig. I've been experimenting with using Re:Vision Effects's new DE:Noise plug-in to reduce compression artifacts and noise in my HDV footage, and the results are very promising.

We boomed to a box. Rather than pipe the input from the boom mic into the HV20, we recorded to an M-Audio MicroTrack Recorder (another good choice would have been the Zoom H4). This took audio level management off my plate (unlike the DVX100a, the HV20 has no convenient audio input level knobs) and ensured a high-quality, interference-free signal. We slated manually and kept the camera mic audio to help us sync our dailies in post.

As you can see, there are trade-offs with this setup. One of the great things about basing a DV Rebel shoot around a prosumer camera such as the DVX100b, HVX200, or Canon XH A1 is that your rig grows with your capabilities and never gets in your way. It actually prepares you for a future of shooting with a Varicam or a Viper. With the HV20, on the other hand, you're off the reservation. You've got no leg to stand on when your camera fails to support your cinematic needs, because you bought it in the toy aisle. And yet, if you hop on one foot and wave the rubber chicken just right, you can make amazing images with the little guy.

Friday, Aug 3, 2007

Two Days, Two Rigs

Eric Escobar hit me up to DP his latest short, Sex Positive, last weekend. It was a two-day shoot, the first of which was a standard-def affair using my venerable DVX100a. The second day we shot with the prototype M2 rig seen here earlier.

The back-to-back experience of using the DVX100a, with its ample manual control, familiar ergonomics, and dual XLR inputs, followed by the Canon HV20, with its barely-adequate controls but oh-so-lovely 1080p24 images, was a great education in what's terrific and what's still sorely needed in the DV Rebel's arsenal.

The M2 rig is experimental, so I won't review it except to say that both Eric and I would jump at the chance to use it again. He wants to shoot a feature on it, and I can't blame him. It worked well and we are channeling our feedback on its finer points directly to Redrock.

Enough typing—how about some pics? Here are two color corrected frames from day 2. In the second, that's day 1's footage playing on the TV in the background. Both of these frames happen to be made with that amazing Nikon 50mm f1.4 prime.

More later on the experience, the rig, the cameras, and our post path!