iPad for Filmmaking, Day Six Report

I’ve had my iPad for six days now, as anyone following me on Twitter knows. I realize that some of my Twitter followers find the iPad chatter to be a divergence from my usual filmmaking tweets seasoned with occasional missives about coffee and photography (both of which are, for me, filmmaking tools)—but that’s not the way I see it. My iPad has been quite busy over the first near-week of its life as a filmmaking tool.

First and foremost, I hoped that I would enjoy reading screenplays on my iPad, and I am happy to report that I do, very much. I read a ton of screenplays, many in PDF format. I hate reading them on my computer screen, especially my laptop. Not because of the backlit screen, but because of the psychological association I have with my computers. They are devices for doing work. They are a constant and cacophonous source of distraction. Reading screenplays, even well-written ones, is weirdly not easy. If you’re susceptible to distraction, reading a screenplay on a laptop can be like trying to count ceiling tiles at a Victoria’s Secret fashion show.

Printed screenplays are much better, but I hate wasting the paper myself. If they come to me printed, great—but even when printed double-sided (which, happily, is now the industry standard), they add up in meatspace. My 17” MacBook Pro, an extra battery, a power adapter, and three screenplays crammed into a bag is a recipe for a very sore shoulder.

The minute the iPad apps started flooding the iTunes Store, I began a search for a good PDF reading app. I flirted briefly with converting the screenplays to the ePub format used by Apple’s iBooks app, with disastrous results. Here’s a cool article by someone more persistent than I—but while I certainly gave up in part due to laziness, it was also because I realized that ePub is not ideal for screenplays. ePub books can be re-flowed and re-paginated on the fly by the device, and that’s not a good thing for scripts, where white space, formatting, and page numbers matter.

I didn’t just want to be able to read screenplays, I wanted to be able to make notes on them. There is a full-blown PDF annotation app called iAnnotate PDF, but I skipped it due to its complexity, and to be honest, because it is about the ugliest app I’ve seen yet on the iPad app store. I don’t need a ton of functionality, I just need to make little margin notes, like one can easily do in Apple’s under-appreciated Preview app on the OS X desktop.

That requirement sadly ruled out the simple, elegant, and bargain-priced (for now) GoodReader, which has a number of fans, including writer/director John August.

I found my sweet spot with ReaddleDocs. It is fairly priced at $4.99, and while not a standout in UI design (the icon is unfortunate, and the mechanics of organizing files are convoluted), it somehow has nailed exactly the amount of information I want on my screen when reading a script.

Some iPad periodicals have been criticized for failing to provide a sense of place within the larger document. Readdle is doing two things to subtly combat that here. Obviously the current page number and total page count are at the top of the document, but what I really love is the black dot on the right. When holding a printed screenplay, you always have an intuitive sense of how far through the document you are. The dot provides that perfectly. Wonderful.

Tap that dot and you can rapidly move to any page. The refresh rate is standard-issue iPad-awesome.

ReaddleDocs allows you to set as many bookmarks as you like, and name them. This is the capability that I have bastardized into a basic margin notes feature. Brevity is warranted, lest you type right off the edge of the screen (a forgivable bug for a day-one app). Another reason not to go too crazy with the bookmarks/notes is that there is no way to export them.

The last thing I’ll say about Readdle is that, like GoodReader, it knows that the default iPhone OS PDF reading service is unsatisfactory, and replaces its scrolling model with a page-turning one. Here I have another minor complaint (which echoes August’s about GoodReader)—the page-turning gesture in ReaddleDocs is too stubborn, and the redraw is not as slick as the rest of the app. Again, I forgive this as a version-one issue that would be hard to test for without an actual device in hand. I don’t expect (or want) fancy iBooks-like page flipping animation, just something simple and smooth (and left-to-right) like what’s in the excellent Amazon Kindle app.

Readdle and GoodReader can both grab your PDFs from the web, Dropbox, email accounts, and computers on a shared Wi-Fi network. There is a seemingly never-ending flow of classic screenplays available at mypdfscripts.com.

So that’s reading screenplays—how about writing them? Final Draft is working on something for the iPad, as are the developers of iPhone screenwriting apps Screenplay and ScriptWrite. Until those options materialize, though, the clever duo of Joke and Biagio have created a template for Apple’s Pages app that achieves screenplay formatting via Styles, which allow some automation (hitting Return after a character name takes you to a dialogue element automatically), but not much (no Tab to advance through elements).

Adobe has a cloud-based, collaborative screenwriting web app called Adobe Story, currently categorized as a “free preview version” at Adobe Labs. Who would have thought that Adobe would provide the Google Docs of screenplays? There’s even a standalone AIR app. If ever there was a screenwriting app that wanted to be on the iPad, it’s Adobe Story. And with Adobe running AIR apps on iPads on day one, maybe there’s hope.

If you plan on writing anything long on the iPad, you may want to consider a physical keyboard. I like the Apple Bluetooth Keyboard because it allows flexibility in how you position and orient the device, and because Bluetooth was named after a Viking.

There are several movies that provide endless sources of inspiration to me, and since I own them all on DVD, I have no compunctions at all about ripping them with Handbrake and storing them on the iPad. Sex them up with cover art from this search engine (in iTunes, File > Get Info, Artwork tab, Add).

I used Apple Compressor to make iPad-friendly versions of my demo reel, my short films, and various other inspirational videos found around the web.

I’m using the new Publish functionality in Lightroom 3 Public Beta 2 to fill my iPad with portfolio images, along with color reference stills, reference images for projects in development, and the usual family photos. Since I don’t use iPhoto, I just tell iTunes to sync my iPad with a specific folder I’ve created. Sub-folders become iPad “albums.”

So I have a dozen screenplays, a half-dozen feature films (with commentary tracks), my entire photography portfolio, and the ability to watch anything Netflix streams, all tucked neatly in my new murse. Not bad for less than a week into things. I bought the iPad with specific (and, so far, not very adventurous) ideas about how it could instantly become a useful filmmaking tool, and so far it has met and exceeded my expectations.

Free iPad Wallpapers

I know it’s April first, but I assure you, I am deadly serious about this: Free ProLost wallpapers for your new iPad. Right-click to download the 1024x1024 originals one at a time, or download them all in a zip archive.

They’re square so they can work in both portrait and landscape modes—the iPad crops them on the fly. Some are definitely on the busy side, but you can set a different image as your lock screen, so maybe they have a place there.

All images were processed in the new Lightroom 3 Public Beta 2.

Since I don’t have an iPad yet, I don’t know which of these will work the best—please let me know in the comments which work out well for you this weekend!

These images are copyrighted, but free for you to use as wallpaper and lock screen images on your personal iPad. If you want to share them, please link to this page.

Magic Bullet PhotoLooks

Yesterday Red Giant Software announced the release of Magic Bullet PhotoLooks. It’s the same Magic Bullet Looks you know and love, re-engineered for use on high resolution stills in Adobe Photoshop.

In case you don’t know, Looks, and now PhotoLooks, is a creative toolset for giving your images an overall cinematic look. It’s based on the model of an actual camera, with filters, lens characteristics, and film processing tricks. By accurately simulating the physics of light, glass, and celluloid, it creates a fun, creative environment for experimenting with your shots. Start with one of 100 presets, see how they’re put together, then modify them to taste—or design your own and share them with friends.

Longtime Magic Bullet Looks users will recognize the interface, presets, and tools—so much so that they might even wonder what’s new about this new version. A lot has changed under the hood, but all in ways designed not to be noticed. Here are some examples:

  • That PhotoLooks is a native Photoshop plug-in means that not only can you use it directly from within Photoshop, but you can also apply it as a Smart Filter to keep PhotoLooks as a non-destructive adjustment that you can tweak again and again, even after closing and re-opening the file. Aharon Rabinowitz shows you how to do this in the above tutorial.
  • PhotoLooks contains the beginnings of a Color Management solution, so that your color-managed Photoshop workflow will match what you see in the PhotoLooks UI. Future versions will refine and enhance this feature to work with any popular color space you might care to use for your photography workflow.
  • The last one is the biggest change and hopefully the most invisible: The Looks rendering engine has been re-written completely to work on high-resolution stills. While working on your look, you get the fluid, GPU-accelerated experience Looks has always provided, but when you press OK, your look is rendered by the new CPU render engine that can handle the gigantic image sizes common to current-generation cameras. If you’ve used the “secret” stills feature of Magic Bullet Looks, you may have run up against limitations in resolution. That won’t happen with PhotoLooks.

PhotoLooks is $199, or $99 if you already have Magic Bullet Looks or Quick Looks.

What’s fun for me, as the guy who designed it, is to see a whole new legion of creative professionals exposed to the power and creativity of Magic Bullet Looks. Here are some of their impressions:

I am not exaggerating when I say that Magic Bullet PhotoLooks will re-invent the way people think about filters in Photoshop—I have never seen anything like it.

-Deke McClelland, award-winning Photoshop author and trainer

Another favorite feature of mine is the Look Theater. I get creatively stumped with my photography occasionally, and it is so cool to be able to just sit and watch my photographs take on a new persona without me having to lift a finger.

-Justin Seeley, Photoshop trainer and graphic designer

Magic Bullet PhotoLooks is a fantastic tool, with absolutely no adoption curve.

-Thorsten Meyer, Photographer

To make a perfect look for a photo [using Photoshop’s built-in tools] can be an arduous process of changing levels, curves, diffusion, glows, spot exposure, color correction, vignetting, edge softness, etc. However, the thumbnail for each of the 100+ presets in Magic Bullet PhotoLooks instantly updates to show its effect on your photo, making it really easy to compare the effect of each one.

-Jack Tunnicliffe, Java Post Production

You can read more testimonials here.

Converting 30p to 24p

As the long-awaited 24p firmware update for the Canon 5D Mark II draws near, I joined Mike Seymour on episode 57 of the Red Centre podcast to talk about how excited I am that it marks the end of painful workarounds for the 5D’s no-man’s-land frame rate of 30.0 frames per second.

For as long as I’ve had my 5D Mark II, I’ve avoided using it for any projects that I could not shoot 30-for-24, i.e. slowing down the footage to 23.976 fps, using every frame. My 5D has been a gentle-overcrank-only camera. There are plenty of occasions to shoot 30 frames for 24 frame playback — we do it all the time in commercials to give things a little “float,” or to “take the edge off” some motion. I still do this often with my 7D. Whatever frame rate I shoot — 24, 30, 50 or 60 — I play it back at 24. Just like film.
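If the arithmetic behind 30-for-24 isn’t obvious, here’s a minimal sketch of it in plain Python (nothing here is specific to any camera or NLE; the numbers are just the two frame rates, and the 10-second take is hypothetical):

    # The 30-for-24 conform: keep every captured frame, play them all back at 24p.
    shoot_fps = 30.0
    play_fps = 24000 / 1001              # 23.976... fps, the NTSC-flavored "24p"

    slowdown = shoot_fps / play_fps      # about 1.25, i.e. roughly 25% slow motion
    print(f"playback is {slowdown:.4f}x slower than real time")

    take_seconds = 10.0                  # a hypothetical 10-second take
    print(f"a {take_seconds:.0f}-second take plays back in {take_seconds * slowdown:.2f} seconds")

Every frame the camera captured ends up on screen; nothing has to be invented, which is the whole appeal.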

Folks often ask me about 30p conversions. Twixtor from RE:Vision Effects is a popular tool for this, as is Apple’s Compressor. Adobe After Effects has The Foundry’s well-regarded Kronos retiming technology built-in. All of these solutions are variations on optical flow algorithms, which track areas within the frame, try to identify segments of the image that are traveling discretely (you and I would call these “objects”), and interpolate new frames based on estimating the motion that happened between the existing ones.

This sounds impressive, and it is. Both The Foundry and RE:Vision Effects deservedly won Technical Achievement Academy Awards for their efforts in this area in 2007. And yet, as Mike and I discuss, this science is imperfect.
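To make the problem concrete, here’s a tiny illustration in plain Python (not any vendor’s actual algorithm, and I’m using round 30 and 24 rates; the exact NTSC rates change the numbers slightly but not the picture) of why new frames have to be synthesized at all:

    # Where does each 24p output frame land on the 30p source timeline?
    src_fps, dst_fps = 30.0, 24.0

    for out_frame in range(8):
        src_pos = out_frame * src_fps / dst_fps   # position measured in source frames
        lower = int(src_pos)
        blend = src_pos - lower                   # fraction of the way to the next source frame
        print(f"output {out_frame}: {blend:.2f} of the way from source frame {lower} to {lower + 1}")

Three out of every four output frames fall between frames the camera actually captured, and those in-between images are exactly what the optical flow engine has to invent.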

In August of 2009 I wrote:

I’m not saying that you won’t occasionally see results from 30-to-24p conversions that look good. The technology is amazing. But while it can work often, it will fail often. And that’s not a workflow. It’s finger-crossing.

On a more subtle note, I don’t think it’s acceptable that every frame of a film should be a computer’s best guess. The magic of filmmaking comes in part from capturing and revealing a narrow, selective slice of something resonant that happened in front of the lens. When you use these motion-interpolated frame rate conversions, you invite a clever computer algorithm to replace your artfully crafted sliver of reality with a best-guess. This artificiality accumulates to create a feeling of unphotographic plasticness.

Of course, it’s often much worse than a subtle sense that something’s not right. Quite often, stuff happens in between frames that no algorithm could ever guess. Here’s a sequence of consecutive 30p frames:

Nothing fancy, just a guy running up some stairs. But his hand is moving fast enough that it looks quite different from one frame to the next.

Here’s that same motion, converted to 24p using The Foundry’s Kronos:

Blech.

Again, don’t get me wrong—these technologies are great, and can be extremely useful (seriously, how amazing is it that the rest of the frame looks as good as it does?). But they work best with a lot of hand-holding and artistry, rather than as unattended conversion processes.

(And they can take their sweet time to render too.)

I’m so glad we’re getting the real thing.