

Real Men Comp With Film

Be careful having dinner with Mike Seymour.

He was sharing with me his nerd-joy over being able to interview Jon Alexander at ILM about the history of optical compositing. I offhandedly mentioned that I had once, out of pure lifelong curiosity, re-created the optical bluescreen extraction workflow using After Effects.

Oops. The next day Mike was in my office with a camera. Watch this whole video. My bit is nerdelicious, Jon’s is wonderfully insightful and grounding, and it all adds up to a great taste of what fxphd members get to enjoy heaping tons of.

Read the companion article here.


Realistic Lens Flares

When technology and art intersect, it’s often the case that those who know how have no idea why, and those who know why have no idea how.

I can’t stop thinking about this SIGGRAPH video. It shows realistic lens flares being computed in real time using a ray tracing technique. There are some lens artifacts shown here of a complexity and beauty that I’ve never seen faked convincingly before.

Lens flare is caused by light passing through a photographic lens system in an unintended way. Often considered a degrading artifact, it has become a crucial component for realistic imagery and an artistic means that can even lead to an increased perceived brightness.

Jesus nerds. “Increased perceived brightness?” That was the best sales pitch you could give on why being able to synthesize realistic lens flares is worthwhile?

Lens flares are awesome because they are fricking crazy. They are completely unreal. They increase the veil of unreality between the audience and the movie. They are beautiful. They are tiny imperfections magnified by orders of magnitude. They are aliens. And scary buildings. We give them sound effects and music cues. They make movies bigger than life because they have nothing to do with life.

And I want yours.

In the Vimeo comments the poster said:

Anamorphic optics are currently not supported, but this is not a principal limitation of the rendering scheme.

If they had put anamorphic examples in this video I think I’d be standing on their lawn with a boom box right now.

Physically-Based Real-Time Lens Flare Rendering — Hullin, Eisemann, Seidel, Lee


Use Dropbox to Remotely Monitor After Effects Renders

Ever since I’ve had a computer, I’ve had long render times. Whether it was ray-traced checkerboard spheres on my Amiga 1000 or The Last Birthday Card on my blue G3 tower, I’ve always managed to find ways to keep my computer busy while I’m off pursuing other hobbies, such as sleeping, long walks on the beach, or (most likely) staring at the screen chanting “faster, faster!”

On those rare occasions that I decide to leave the computer alone with its thoughts, I sometimes wish I had a way to check in on the render progress from afar. Adobe After Effects ships with a handy script called “Render and Email” that can send you a simple email to announce the completion of a render. If you have push email on your phone, or know how to send emails that arrive as text messages (here’s how), this can be a suave way to leave your render cooking with the confidence that you’ll know precisely when to return from your three martini lunch.
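For the curious, the email-to-SMS trick is simple enough to sketch in a few lines of Python. Everything specific here is a placeholder, the carrier gateway domain, the SMTP host, the addresses, so treat it as a shape, not a recipe:

```python
import smtplib
from email.message import EmailMessage

def build_render_notice(comp_name, to_addr):
    """Build the short message a carrier's email-to-SMS gateway will relay."""
    msg = EmailMessage()
    msg["Subject"] = "Render complete"
    msg["From"] = "renderbox@example.com"  # placeholder sender address
    # to_addr is your phone number at your carrier's gateway domain,
    # e.g. "5551234567@txt.att.net" (the domain varies by carrier)
    msg["To"] = to_addr
    msg.set_content(f"After Effects finished rendering {comp_name}.")
    return msg

def send_notice(msg, smtp_host="smtp.example.com"):
    """Hand the message to an outgoing mail server (host is a placeholder)."""
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

Point the To address at your carrier’s gateway and the email arrives on your phone as a plain text message.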

But that’s not quite the same as an actual visual confirmation of a successful render. In a world of iPhones, augmented reality, and non-fat yogurt that actually tastes good, we deserve more.

I recently figured out a couple of nifty ways to get remote, visual updates on my epic After Effects renders, thanks to the insanely useful and free service known as Dropbox, AKA What Apple’s iDisk Should Have Been. Dropbox is a directory on your hard drive that is constantly syncing in the background to a remote server. You can share subfolders with specific people or groups of people (whether they be on Mac, Windows, or Linux), and these folders truly are shared in the sense that anyone to whom you grant access can add, remove, or edit files therein. I use it to collaborate with other writers, with my post-production crews, and even to remotely add photos to the screen saver loop on my parents’ iMac.

Did I mention that all of this is free, for up to 2GB of storage?

Dropbox also offers a free iPhone app [iTunes link] that allows browsing your Dropbox folders and limited file viewing. Two of the file types that can be viewed on the iPhone screen are JPEG and QuickTime.

You can set up After Effects to render to your Dropbox, and view the results on your iPhone.

Of course, it’s not exactly that simple. There’s a limit to the size of file that can be viewed on the iPhone, and you wouldn’t want to be pulling 2K DPX files across AT&T’s network even if you could do something with them once you got them. So there are a couple of things you can do to streamline the process. Unfortunately it’s a bit of work to set up.

The simplest thing to do is to configure your Render Queue item to have two Output Modules: the one you were planning on rendering anyway, and a second one set up as a JPEG sequence with the “Stretch” option enabled to scale the images down to an iPhone-friendly size. It’s this second Output Module that you’ll render to your Dropbox folder. Every time a frame completes, an iPhone-optimized JPEG of it will be automatically uploaded to your secure Dropbox storage.

The result is that every time you open the Dropbox app on your iPhone, you not only see how many frames have been rendered, but you can visually flip through the frames themselves. Sweet!
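And because Dropbox is just a folder that happens to sync, you don’t even need the app to check in; any script on a machine sharing the folder can count frames. A minimal sketch, with the folder path and frame-naming pattern as assumptions you’d match to your own Output Module:

```python
import glob
import os

def render_progress(folder, total_frames, pattern="*.jpg"):
    """Report how many frames AE has written to the synced render folder.

    folder and pattern are assumptions; point them at wherever your
    JPEG Output Module is writing, with whatever naming it uses.
    """
    frames = sorted(glob.glob(os.path.join(folder, pattern)))
    done = len(frames)
    latest = os.path.basename(frames[-1]) if frames else None
    return done, total_frames, latest

# Example (hypothetical path):
# done, total, latest = render_progress(
#     os.path.expanduser("~/Dropbox/renders/shot_010"), 240)
```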

Of course, what you can’t do is view the animation at speed, so that’s where the second option comes into play. You can create a third Output Module that writes out a small (not more than 480 pixels wide or 360 pixels tall) H.264 QuickTime movie.

Now you can both check your frames as they finish, and watch the end result at speed.

If you configure that Render and Email script and use it to launch your render, you’ll also have a push notification that the render is complete.

It’s not quite the same thing as fully administering your render from your phone, but it’s still pretty cool.


Passing the Linear Torch

I used to show you weird crap like this all the time

Back in the day I blogged a lot about how compositing and rendering computer graphics in “linear light,” a color space in which pixel values equate to light intensities, can produce more realistic results, cure some artifacts, and eliminate the need for clever hacks to emulate natural phenomena. Along with Brendan Bolles, who worked with me at The Orphanage at the time, I created eLin, a system of plug-ins that allowed linear-light compositing in Adobe After Effects 6 (at the mild expense of your sanity). I also created macros for using the eLin color model in Shake and Fusion. Along the way I evangelized an end to the use of the term linear to describe images with a baked-in gamma correction.

Then Adobe released After Effects 7.0, which for the first time featured a 32-bit floating point mode, along with the beginnings of ICC color management, which could be used to semi-automate a linear-light workflow. The process was not exactly self-explanatory though, so I wrote a series of articles (1, 2, 3, 4, 5, 6) on how one might go about it.

Then I rambled endlessly on fxguide about this, and in the process managed to cast a geek spell on Mike Seymour and John Montgomery, who republished my articles on their fine site with my blessing.

This week Mike interviewed Håkan “MasterZap” Andersson of Mental Images about the state of linear workflows today on that same fxguide podcast.

Which is so very awesome, because I don’t want to talk about it anymore.

It’s just no fun going around telling people “Oh, so you put one layer over another in After Effects? Yeah, you’re doing it wrong.” Or “Oh, you launched your 3D application and rendered a teapot? Yeah, you’re totally doing it wrong.”

You are doing it wrong. And I spent a good few years trying to explain why. But now I don’t have to, because Mike and MasterZap had a great conversation about it, and nailed it, and despite the nice things they said about ProLost you should listen to their chat instead of reading my crusty old posts on the subject.

Because it has gotten much, much simpler since then.

For example, there’s Nuke. Nuke makes it hard to do anything but work in linear color space. Brings a tear to my eye.

And the color management stuff in After Effects has matured to the point that it’s nearly usable by mortal men.

Since I’ve seen a lot of recent traffic on those crusty old posts, here’s my linear-light swan song: a super brief update on how to do it in AE CS4:

In your Project Settings, select 32 bpc mode, choose sRGB as your color space, and check the Linearize Working Space option:

When importing video footage, use the Color Management tab in Interpret Footage > Main to assign a color profile of sRGB to your footage:

Composite your brains out (not pictured).

When rendering to a video-space file, use the Color Management tab in your Output Module Settings to convert to sRGB on render:
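If you want to convince yourself the Linearize Working Space checkbox matters, the math fits in a few lines of plain Python (no After Effects required). A 50/50 mix of black and white light is much brighter than the naive average of their sRGB code values:

```python
def srgb_to_linear(v):
    """sRGB electro-optical transfer function (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Inverse transfer function, back to display-referred sRGB."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Averaging the encoded pixel values, what a non-linearized comp does:
naive = (black + white) / 2

# Averaging actual light intensities, then re-encoding for display:
physical = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(naive, 3))     # 0.5
print(round(physical, 3))  # 0.735 — the light mix is brighter than it looks
```

That gap between 0.5 and 0.735 is the difference you see in every cross-dissolve, motion blur streak, and defocused highlight when you composite in linear light.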

That’s the how. For the why, well, those crusty old articles could possibly help with that, especially this one on color correction in linear float, and this one on when not to use linear color space. Part 6 is still pretty much accurate in describing how to extract a linear HDR image from a single raw file using Adobe Camera Raw importer in After Effects, and this article covers the basics fairly well, although you should ignore all the specifics about the Cineon Emulation mode, which never should have existed. This little bit of evangelism is still a good read.

But the ultimate why answer is, and has been for a while now, within the pages of a book you should have anyway: Adobe After Effects CS4 Visual Effects and Compositing Studio Techniques (deep breath). Brendan Bolles guest-authored a chapter on linear light workflow, and not only does he explain it well, he gives many visual examples. And unlike me, Mark Christiansen keeps his book up-to-date, so Brendan’s evergreen concepts are linked directly to the recent innovations in After Effects’s color manglement.

OK, that’s it. Let us never speak of this again.