
TANK

TANK is an animated short film that I have been working on for a year and a half. It's up today on Red Giant's site.

The Making of Tank

The way I made TANK is a little crazy. I made it entirely in Adobe After Effects, with equal parts animation elbow grease and nerdy expressions madness. This video is part behind-the-scenes, part After Effects tutorial, and part therapy session.

A complete list of all the tools I used to make TANK is available at Red Giant's blog.

VectorKit

Want to try making 3D vector graphics in After Effects? No? Well, if you change your mind, I packaged up the basic workings of my TANK vector graphics rig into an After Effects template that you can download for free from the Prolost Store.
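
If you're wondering what "3D vector graphics in a 2D compositor" even means, the core trick is perspective projection: turning 3D points into 2D screen positions that vector shapes can be drawn through. Below is a minimal, hypothetical sketch of that math in Python. The actual VectorKit template does this with After Effects layers and expressions, not Python, and the focal length and comp-center values here are illustrative stand-ins, so treat this purely as a picture of the idea.

    # Hypothetical illustration of the math behind a 3D-vector-in-a-2D-compositor
    # rig: project camera-space 3D points onto 2D "comp" pixels. This is NOT the
    # actual VectorKit template, which is built from AE layers and expressions.

    def project_point(point3d, focal_length=1000.0, center=(960.0, 540.0)):
        """Perspective-project a camera-space (x, y, z) point to 2D pixels."""
        x, y, z = point3d
        if z <= 0:
            raise ValueError("point is behind the camera")
        scale = focal_length / z
        return (center[0] + x * scale, center[1] + y * scale)

    # A 100 px square sitting flat in 3D space, 2000 px in front of the camera:
    square = [(-50, -50, 2000), (50, -50, 2000), (50, 50, 2000), (-50, 50, 2000)]
    print([project_point(p) for p in square])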

Visual Effects are Not the Answer

From the time I saw Star Wars, almost all of my favorite movies have been full of visual effects. My love of film and of visual effects developed simultaneously. I ultimately worked at Industrial Light & Magic for four years, even putting in some time on Star Wars itself, for better or worse. After that, I co-owned a visual effects company for ten years.

Visual effects can contribute enormously to a film. Few of us will ever forget the AT-AT attack on the rebel stronghold in The Empire Strikes Back, or the T-rex stepping out from the paddock in Jurassic Park.

But while visual effects can lend support to a movie, they are incapable of holding up its full weight on their own. I bet you can think of a few recent films that effectively demonstrate this.

With very few exceptions, visual effects cannot contribute something to a movie that isn’t there already.

They must augment and support the fundamental building blocks of film: story, performance, kinetic mise-en-scène, and even old-fashioned visual trickery.

Here are some examples of visual effects failing to solve filmmaking problems:

  • There’s no bad wire-fu that can be saved by painting out the wires. The wires are not the problem.
  • There’s no morph that tells a story better than a simple cross-dissolve. If it doesn’t work as a cross-dissolve, it won’t work as a morph.
  • I watched a team of incredibly talented compositors work for weeks to “improve” on a hacky Avid speed-change effect in Star Wars: Episode I - The Phantom Menace. Ultimately, all the CGI and compositing was shelved in favor of a frame-by-frame copy of the Avid frame-blend effect.
  • In Jaws, a malfunctioning effect forced Spielberg to replace many planned shark appearances with clever filmmaking, resulting in one of the greatest and most influential movies of all time.

And today's example:

  • Your talking animal movie will not be any funnier with computer-generated mouths.

We’re back to the trailer embedded at the top of this post. Maybe you think it’s funny, maybe you don’t. But what I love about it is that someone finally realized that this kind of movie would be not one tenth of a percent better with animated cat mouths.

Lightroom for Your Camera

The Most Important New Photo App Has a Fatal Flaw

Adobe launched Creative Cloud 2014 today, and along with it several new mobile apps. Photoshop Mix lets you blend layers using a touch-optimized version of Photoshop’s powerful Quick Select tool. Line and Sketch are drawing apps designed to work with Adobe’s own stylus and ruler accessories.

But the one I’m most excited about wasn’t a surprise at all. As promised, Lightroom mobile is now available for iPhone. Like the iPad version, it’s free, but requires a Creative Cloud subscription. (The bundle of Lightroom and Photoshop for $9.99/month is now a permanent pricing plan, by the way.)

Despite having the exact same features, Lightroom for iPhone is a very different thing from the iPad version. Because this is Lightroom running on your camera.

I’m a “serious” photographer. I have cameras with red dots and lenses with red rings. But I also take a ton of photos with my telephone. Having the power of Lightroom running on your actual camera is a major change to mobile photography. Snap a shot on your iPhone (or, more likely, a series of shots), upload it easily (even automatically) to your Lightroom catalog, then edit, flag, and now rate it, with every change synced to your master catalog. That speed and power will have you rethinking your iPhone’s role as a “casual” camera.

A New and Promising Workflow

Effortlessly getting your fresh iPhone snaps into Lightroom is great in a few ways. I started out wishing for nothing more than metadata management in a mobile Lightroom app, and I have used that functionality productively, but now I’m hooked on having the power of Lightroom’s editing controls in my pocket. Lightroom’s exposure, contrast, color temperature, clarity, and shadow/highlight controls produce significantly better results than those of any other mobile photo editing app.

But the real power comes when you launch Lightroom on the desktop and see all of your photos there, with their edits stored as nondestructive metadata. You can continue making your photos look their best, and the edits will be synced back to your iPhone, even adjustments that Lightroom mobile can’t make itself. This means Lightroom mobile renders with the full Adobe Camera Raw engine, which bodes well for expanded editing capability in the future.

iPhone photos thoughtfully processed in Lightroom can look shockingly good. Here are a few examples from my recent trip to Taiwan.

If you choose to automatically upload all your iPhone photos to Lightroom (the aggressive default when you first run the app), you could even dispense with syncing your photos to your computer the old-fashioned way (if it weren’t for pesky video).

The Fatal Flaw

Lightroom mobile is a work in progress, and it’s not perfect. I’d kill for user-created presets synced from desktop Lightroom. You don’t have control over where on your computer your synced photos are stored. And when sorting through photos, you have to switch between flagging mode and star-rating mode (the latter new in version 1.1, on iPad as well), rather than having both available at once. You can’t even see both flags and stars at the same time, even though there’s plenty of space on the screen.

But the biggest flaw represents a fundamental misunderstanding of mobile photography. Lightroom mobile strips important metadata from your photos, including time/date and location. That’s right, Lightroom mobile kills one of your iPhone’s best camera features—the always-on GPS.

This means that if you edit a photo in Lightroom for iPhone, save it back to your Camera Roll, and then share it, the social media service you share to won’t know when or where the photo was shot. Apps like Facebook and Instagram use this info to make sharing better. If you’re more privacy-minded, like me, maybe you use a personal diary app like Day One. Day One uses photo metadata to automatically create a journal entry with the correct date stamp and GPS location. But if you try this with a shot saved from Lightroom mobile, no such information is found, and the journal entry is created using the current time and location.
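
To see exactly what gets lost, you can inspect a photo’s EXIF data yourself. Here’s a small, hypothetical Python sketch (using the Pillow library, with made-up file names) that prints the capture time and GPS tags of an original Camera Roll photo next to the copy saved back from Lightroom mobile:

    # Illustrative sketch (not part of Lightroom): compare the EXIF metadata of
    # an original Camera Roll JPEG with the copy saved from Lightroom mobile.
    # Requires Pillow; the file names below are hypothetical.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def describe_metadata(path):
        exif = Image.open(path)._getexif() or {}
        named = {TAGS.get(tag, tag): value for tag, value in exif.items()}
        print(path)
        print("  Captured:", named.get("DateTimeOriginal", "missing"))
        gps = named.get("GPSInfo")
        if gps:
            print("  GPS:", {GPSTAGS.get(t, t): v for t, v in gps.items()})
        else:
            print("  GPS: missing")

    describe_metadata("camera_roll_original.jpg")   # capture time and GPS present
    describe_metadata("saved_from_lightroom.jpg")   # both stripped on save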

Every 99¢ (or free) photo app gets this right. That Lightroom doesn’t is an embarrassing omission.

Just The Beginning

I have high hopes that Adobe will address these shortcomings. We’re only at version 1.1 of Lightroom mobile. It’s almost my go-to mobile photography app (competing with Mattebox, an awesome app that offers custom filter building and sharing, and that leaves my metadata alone).

With proper metadata handling, user presets, and the ability to customize where synced shots are stored, Lightroom mobile could become a must-have for anyone who uses their telephone as a camera, which is approximately everybody in the world.

Lightroom mobile is available on the iTunes App Store for iPhone and iPad. It requires Creative Cloud, which is $9.99/month for Lightroom and Photoshop. Or get a year of full Creative Cloud membership for $50 off from B&H until June 20.