Jeffery Harrell

Apr 5

Yet another DSLR workflow

It’s 2012 — though I’m still writing 1997 on all my checks — and it seems like everybody and his sister has a DSLR workflow. Well, here’s mine, inspired by a recent DSLR-shot short film I finished and a lot of twiddling and tweaking.

Here’s the list of ingredients, so you can go shopping and follow along at home:

  • Avid Media Composer, for the offline edit
  • Magic Bullet Grinder (or QtChange), for striping timecode
  • DaVinci Resolve (the free Lite version is plenty), for the conform
  • Adobe After Effects, plus the Automatic Duck Pro Import AE plugin, for the online
  • Final Cut Pro, for marrying picture to the final mix
  • Optional extras: Telestream Episode or MPEG Streamclip for batch transcoding, A Better Finder Rename for clip renaming, and the free Avid codecs package

The basic workflow, though, should be pretty easily adapted to other tools. You’ll just need to play with it.

I’m gonna break this down into sections and go into some detail. Feel free to skim. I know I would. But lemme start with kind of a 10,000-foot view of the whole process, just so you know where we’re heading.

Overview

The basic idea here is to pretend we’re talking about film. You remember film? Film workflows are both ancient and venerable, because my God, they have to be. Film’s expensive to deal with, and a feature will tend to generate miles and miles of it, so staying organized and being consistent isn’t really optional. Rather than being all rogue about things, we’re just going to pretend we understand film workflows very well, and take the central ideas of film post and adapt them for our purposes.

That means we’re doing an offline-online workflow, for starters. For those of you who might not know the jargon, an offline-online workflow is one in which you do a quick-and-dirty — and above all, cheap — transfer of all your raw footage to a highly compressed format that your computer can throw around really easily, do all your creative editing on that compressed footage, then output some kind of machine-readable timeline file that goes through a process called conforming. Conforming is where you take the shots you chose in your offline edit — and only those shots — and create a new timeline with them that matches your offline frame-for-frame. It’s this new timeline, with all the high-resolution media in it, that you use to do your visual effects (if any) and color correction.

Now why, you might be asking yourself, would you ever choose to do such a bizarre thing as that. Why not just work exclusively with your high-resolution media from the get go, so you can be done as soon as your edit is finished?

Well, there are a couple reasons why you wouldn’t want to do that. First, film transfers at full resolution are expensive, so you only want to transfer just the frames you absolutely need and not any others. Second, film transfers at full resolution generate a lot of data — anywhere from six to twelve megabytes per frame — and even a beefy computer struggles to move that data around at anything like real time. So doing a quick and cheap transfer of everything to a highly compressed format means you can mind your nickels and dimes, but it also means you can work more creatively when you’re editing.

Now, of course we aren’t talking about film here, but those two principles still apply: it’s really hard to work creatively with the H.264-compressed media files DSLRs spit out, because they’re just so heavily compressed. Yes, the files are small, but your computer has to do a lot of math to deal with them, so things get boggy and slow down on you (and that’s if you’re lucky enough not to just outright crash your NLE in the process). But converting them all to a high-resolution online format is really expensive — in time. You might shoot an hour of rushes for a ten-minute short film — and that’s if you’re a really good director who nails everything in the first few takes. If you’re not so lucky, you might end up with ten hours of rushes for a ten-minute short … or even more. And batch-transcoding all that to your finishing format will take way longer than you want to invest in sitting around twiddling your thumbs.

So we go back to the old ways: an offline-online workflow essentially the same as what you’d use on a film project. Batch-transfer everything to a crappy, low-resolution but lightweight offline format as fast as possible, then edit, then do the time- and hard-drive-consuming high-resolution, full-quality transfer later.

Our workflow’s basically gonna look like this:

  • Prep all your media in sensible, sane ways
  • Batch-transcode to an offline format that’s suitable for editing
  • Edit using the offline media until the picture’s locked
  • Conform your timeline to the original media files
  • Render out ProRes versions of your selects (with handles) for onlining
  • Bring the whole shebang into After Effects for VFX, color grading, the adding of titles and whatever else needs to be done at full resolution
  • Output a full-resolution master file that you can marry up with the mastered audio files

So that’s the big picture. Now let’s dive into the details, starting with the part that has to happen before step one.

Preflight

If you’re out there making your own movie with a Canon 7D or whatever, then for God’s sake get organized on set. Pretend you’re getting paid and run your set like the studio’s gonna audit you at any minute. Adopt some basic DIT workflow practices, like naming conventions for mags and clips and such, and follow them assiduously. It’ll make your life easier in the long run.

But of course, you’re probably not out there making your own movie. You’re probably sitting there with a stack of hard drives with God-knows-what on them, and maybe, if you’re incredibly lucky, a shooting script. Maybe. (And even if you have one, it’s probably on white pages anyway, meaning it’s never been revised since before shooting began. Good luck finding any correspondence at all between the script and the rushes on those hard drives in that case.)

If that’s how it is — you’re coming in after principal photography has wrapped and you’re kind of thrown to the wolves — then for God’s sake, get things organized before you begin. I know, I know, editing is the fun part, and it’s natural to want to rush into it. But seriously, you’re just going to create headaches for yourself if you don’t impose some kind of order on the chaos in which you find yourself.

Bin and clip naming conventions

There are as many different schools of thought about bin and clip naming as there are editors in the world. If you’ve already got ideas on this topic and they work for you, great. Skip ahead to the next section. But since I’m typing anyway, I’ll go ahead and put my ideas down here.

I use two different systems for bin and clip organization, depending on what the externalities are on any given job. Either I use a camera-oriented system, or I use a script-oriented system.

Camera-oriented bin and clip naming

My camera-oriented system looks like this: Every magazine that comes out of the camera — read “CF card” here, since we’re talking about DSLRs — gets a name that indicates which camera it came from and which magazine it is. Since you’re unlikely to have a thousand different camera mags on any one DSLR project, I stick with this:

XNNN_YYMMDD

Here the X stands for the camera identifier: A for the first (if multi-camera) or only (if single-camera), B for the second, and so on. The NNN is a three-digit number that starts at 001 and goes up. So the first mag for the A camera would be A001, the ninth mag off the C camera would be C009, and so on.

The YYMMDD part should be obvious: two-digit year, two-digit month, two-digit day on which that magazine was recorded. So if the first day of photography was March 13, 2011, the first mag would be:

A001_110313
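
If you’re script-minded, by the way, composing these names is a one-liner. Here’s a tiny Python sketch; the function name is mine, not part of any tool:

    import datetime

    def mag_name(camera: str, index: int, shot_on: datetime.date) -> str:
        # Camera-oriented mag name: XNNN_YYMMDD
        return f"{camera}{index:03d}_{shot_on:%y%m%d}"

    print(mag_name("A", 1, datetime.date(2011, 3, 13)))  # -> A001_110313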

Now, some people like to go even further with it. If you deal with more than one project that’s shooting at the same time, you might want to stick a couple more digits on the back of that to distinguish between the A camera on this project and the A camera on that project. But I’ve never needed to do that, personally, so I don’t bother.

So say you’re working on a single-camera short film that shot over the course of two weekends — July 16-17 and 23-24 of 2011 — shooting one mag per day (because CF cards hold like a million hours of footage or something). Make the following folders:

A001_110716
A002_110717
A003_110723
A004_110724

(Why do it like this instead of just A001, A002, etc.? Cause you will have multiple A001 folders in your life — one for each project you do — and it’s always better to keep them separate, just to avoid future confusion.)

Now that you’ve got your mag folders set up, it’s time to start populating them with clips. Each clip shot on a DSLR comes out with a name that looks like this:

MVI_2317.MOV

That sucks, because it doesn’t tell you anything at all. It’s just a three-letter prefix that’s always the same, plus a number the camera pulls out of its ear. So that won’t do at all.

What we’re gonna do is rename these files, giving them unique clip IDs. It’s not a complicated scheme; for each mag, we’re just going to start with C001 and count up. But here’s where it gets fun: We’re going to include the mag information in the clip names. Like so:

A001_C001_110716.MOV

That’s camera A, mag 001, clip 001, on July 16, 2011. As distinct from:

A003_C001_110723.MOV

which is camera A, mag 003, clip 001, on July 23, 2011. Totally different shots, see.

Now, if this sounds like a giant pain in the butt — having to manually rename each and every clip one by one — well, it is. Believe me, it is. But there are some shortcuts that can help. I like using a little utility called A Better Finder Rename, which lets you do nice things like apply regular expressions and stuff. I’ve got a whole workflow cooked up. You can roll your own or whatever you like.
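
For what it’s worth, here’s a rough Python sketch of the same renaming idea, in case you’d rather script it than click through a utility. It assumes your mag folders are already named per the scheme above, and it only prints what it would do until you flip DRY_RUN. It’s my hypothetical stand-in, not A Better Finder Rename:

    import pathlib

    DRY_RUN = True
    mag = pathlib.Path("A001_110716")       # a mag folder, named as above
    cam_mag, date = mag.name.split("_")     # "A001", "110716"

    # Sorting the camera's MVI_NNNN numbers keeps clips in shooting order.
    for i, clip in enumerate(sorted(mag.glob("MVI_*.MOV")), start=1):
        new_name = f"{cam_mag}_C{i:03d}_{date}.MOV"   # A001_C001_110716.MOV
        print(f"{clip.name} -> {new_name}")
        if not DRY_RUN:
            clip.rename(clip.with_name(new_name))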

(Incidentally, if something tickles your hindbrain about this little naming scheme, it’s probably the fact that I stole it pretty shamelessly from the emerging standard in the industry for identifying tapeless media. Red and Arri both use essentially this same standard.)

But all that presupposes that what you get is a collection of folders, one for each CF card recorded during the shoot, and you have only basic information about each clip. It’s a good organizational scheme … but it’s not the best organizational scheme for editing. The best scheme for editing is one that’s oriented toward the script rather than the camera.

Script-oriented bin and clip naming

Every take in a production can be uniquely identified by a small collection of letters and numbers: the camera (or angle), the scene, the shot (or setup), and the take.

The camera a particular take was filmed on is identified just as above: A, B, whatever. The scenes are numbered in the shooting script — possibly discontinuously, as scenes are often omitted between writing and shooting.

Each scene is broken down into shots — we’re going to shoot these lines in close-up, then these lines in a wide shot, then get a medium of all the lines together for coverage, et cetera. These are often noted with letters: 13A is the first shot of scene 13 (say it’s a wide), 13B is the second shot of that scene (a close-up), and 13CD is a shot that starts out in close up (shot C) but then dollies out to a medium (shot D). That kind of thing.

And then, of course, you do multiple takes of each setup of each scene. Takes are numbered, but you knew that already.

So that means you can identify each individual clip of a project this way:

13A_2_A

That’s scene 13, shot A, take 2 on the A camera. (Why do we put the camera ID last instead of first like before? Bin sorting. It’s better if 13A_2_A and 13A_2_B are right next to each other in your bin, rather than finding A_13A_2 and then having to scroll down for a month to find B_13A_2.)
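
You can watch that sorting argument play out in a couple of lines of Python, if you’re so inclined:

    takes = ["13A_2_B", "13A_1_A", "13B_1_A", "13A_2_A"]
    print(sorted(takes))
    # ['13A_1_A', '13A_2_A', '13A_2_B', '13B_1_A']
    # Camera-last keeps both cameras' coverage of take 2 side by side;
    # camera-first would sort every A-camera clip before any B-camera clip.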

Whether you use a camera-oriented scheme or a script-oriented scheme depends on what kind of project you’re doing and what you’re given. For instance, documentary projects are rarely thought of in terms of scenes and takes during production; in that case, it makes more sense to just think about cameras and mags and clips, ‘cause that’s information you have available to you. On the other hand, a scripted drama project might lend itself more readily to organizing by scene, shot and take. Which one you use is entirely up to you. The point here is for the love of all that’s good and holy, pick one!

Don’t forget sound

DSLRs are capable of recording sound right there in the camera body itself, in sync with picture. But you shouldn’t ever do that, because you get crappy, crappy sound that way. Instead, you want to record dual-system sound, using some kind of external sound recorder that takes input from good mics.

What you get out of such a recorder is most likely going to be 24-bit, 48 kHz WAV files. These files are named no more usefully than the files DSLR camera bodies spit out:

080522-002.WAV

Maybe that’s a date, I dunno. It looks like it could be: May 22, 2008? Who knows. That’s the name of one file I have here in my project archive, and your guess is as good as mine as to what it signifies. (The fact that that particular audio recorder had not yet been invented in 2008 argues against its being a date, but whatever.)

Point is, you need to get your sound files organized just like you organize your picture files. Whatever naming scheme you use to organize your clips, apply it to your sound files. This is where script-oriented naming works well, cause you can just have these files:

16DE_2_A.MOV
16DE_2_B.MOV
16DE_2.WAV

Scene 16, shot DE, take 2, A-camera, B-camera and sound. That’s all your coverage for that particular take.

Anyway, again, the point is you gotta pick a system and stick with it.

Transfer for offline (or “what’s a one-light?”)

If we were talking about film here — cause remember, that’s where we find our inspiration — it’d be time to discuss the mystical and magical process of one-light film transfer.

The short version is this: When you transfer film to video or a data file format, you have to set the exposure on the film scanner. Since a roll of film has many takes on it, and in fact may include takes from different scenes shot at different locations at different times of day, the guy who does your transfer really ought to set the exposure differently for each scene, to compensate for underexposure or overexposure or whatever. Trouble is, that’s work, and work costs money. So when you’re getting your film rushes scanned for your offline, you typically ask for a one-light transfer, which means the colorist sets one light — that is, sets the exposure just once — and runs the whole reel. You get back dailies that are mostly-okay, generally, but more important you got ‘em back cheap.

Converting DSLR media files — you know, those H.264 QuickTimes — to a finishing format is a lot less complicated than scanning film. But it still takes time, and we want to minimize the time we put into it, both in terms of time we ourselves spend and in terms of time we have to dedicate our computers to the transcoding process.

My personal workflow for handling this involves two steps: Adding timecode to the camera media, and batch-transcoding the media to DNxHD.

The timecode thing I just said

Timecode is more important than you might realize. It’s a good idea, just in general, for each frame in your project to have unique timecode. I elaborate a bit on why this is true here, but in addition to the basic principles involved, you should be aware that the conforming process I’m going to describe here will literally break if you don’t have timecode. So just drink the timecode Kool-Aid already, okay?

There are a couple different ways to put timecode on DSLR media; you can use a utility called QtChange — google it — which adds timecode to your media files directly, or you can use Magic Bullet Grinder which copies your media files and then adds timecode to them. I go back and forth, but right now, as I write this, I think I prefer Grinder. Both tools are useful, but Grinder does more things, and like Alton Brown, I’m disinclined toward unitaskers by nature.

So here’s what you do: Organize your clips into folders by reels. “What’s a reel?” you ask? It’s a collection of clips that we’re going to put in order by timecode. We’re going to have a set of clips, for example, that all have timecode in the one-hour range, and then another set in the two-hour range, and so on. You can do this by mag if you like, or you can do it by scene; whenever possible, I choose to do it by scene, because of all the reasons I talked about in the section on bin and clip organization.

Anyway, once you get all your clips divided into folders, set Grinder up this way:

[Screenshot: Grinder settings]

Things to note: The timecode start option is set to “Continuous” (that’s the default), which means each clip’s start timecode will be one frame after the previous clip’s end timecode. The timecode setting is dialed in to 07:00:00:00, which as you can see corresponds to the fact that these shots are all from scene 7. Finally, the main format option is set to “Original + Timecode,” which means Grinder is going to copy these files from their original directory into the directory of my choosing, adding timecode to them in the process. They will not be transcoded; they’re going to stay in their camera-native H.264 format. That means this process will go real fast, basically as fast as your hard drives let it.

Do this once for each set of shots you want to stripe with continuous timecode.
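
In case you’re curious what “Continuous” is actually computing under the hood, here’s a Python sketch of the arithmetic at 24 fps. The clip names and durations are made up for illustration:

    FPS = 24

    def to_frames(tc: str) -> int:
        h, m, s, f = map(int, tc.split(":"))
        return ((h * 60 + m) * 60 + s) * FPS + f

    def to_tc(frames: int) -> str:
        f = frames % FPS
        s = frames // FPS
        return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

    clips = [("7A-1-A.MOV", "00:01:26:15"), ("7A-2-A.MOV", "00:00:48:02")]
    start = to_frames("07:00:00:00")            # scene 7 -> seven-hour range
    for name, duration in clips:
        end = start + to_frames(duration) - 1   # timecode of the last frame
        print(name, to_tc(start), to_tc(end))
        start = end + 1                         # next clip starts one frame later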

Once Grinder’s done, you’ll end up with a bunch of clips that all have “_main” appended to their file names. I personally do not care for this, though I haven’t figured out yet how to get Grinder not to do it. So I just let it do its thing, then use Automator to strip the “_main” suffix off.
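
If you don’t feel like building an Automator workflow, a few lines of Python will do the same thing. Hypothetical, as ever; adjust the folder and extension to match what Grinder actually wrote out:

    import pathlib

    for clip in pathlib.Path("striped").glob("*_main.MOV"):
        # A001_C001_110716_main.MOV -> A001_C001_110716.MOV
        clip.rename(clip.with_name(clip.name.replace("_main", "")))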

Transcoding

The next step is to convert all those now-striped files to the offline editing format of your choice. I use Avid here, so I obviously want DNxHD 36. (You don’t even have to use HD here; you could use SD instead to make things even quicker, but that introduces complexities in the conform, so I don’t bother. I just stick with the same frame size and frame rate at which I’m finishing.)

Now, you can do this with just Media Composer. AMA your striped H.264 clips into a bin, select-all, then use the “consolidate/transcode” function to transcode to DNxHD 36 on your media drive of choice. Media Composer will convert the sound and picture and write them into your Avid Media Files folder.

But I prefer not to do it that way. See, I run Media Composer on my personal laptop, and transcoding takes time — even when you use all the shortcuts available to you — and I don’t want to tie up my laptop for hours or even days just for that. So I choose to use a program called Episode, from Telestream, to do the job, running it on a little Mac mini I have just for stuff like this. I have a custom encoder set up that converts whatever sources I feed into the job into DNxHD 36-format QuickTime movies. As you probably know, Avid doesn’t work with QuickTime movies; it stores all its media in MXF format. But Avid can do what’s called a “fast import” in situations where the source file is in the right codec. It simply copies the frames out of the QuickTimes and right onto your media drive. It’s fast and easy. (Well, the fast-importing is fast. The transcoding takes as long as it takes on the hardware you have. Since my hardware is just unbelievably modest, it takes a really long time for me, but I don’t care, because it’s a fire-and-forget kind of thing, and also I can’t afford anything better.)

Another key benefit there, in addition to getting your transcoding off your Avid and onto another system, is that you still have your DNxHD-format offline QuickTimes just sitting there in a folder. Because the format’s highly compressed, they don’t take up that much room — not in this era of terabyte-plus FireWires — and if you ever end up without your media drive, due to a drive crash or just inconvenience, you can batch import the files right back in again. Avid even remembers the paths to the original QuickTimes, so you don’t even have to find them. Just do a batch import and hit “okay,” and you’re laughing.

Of course, this extra convenience comes at a price. Episode is either five hundred or a thousand bucks depending on which one you buy. I already had the Pro version — bought it a few years back for a paying job, passing the cost on to my client — so I get to use it for free, essentially. If you don’t like that, you can just use MPEG Streamclip instead, though you’ll want to spend some time diddling around with the settings to make sure it’s writing out correctly formatted DNxHD QuickTimes to fast import.
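
There’s one more free option if you don’t mind the command line: ffmpeg has a DNxHD encoder, and a small Python driver can push a whole folder through it. Consider this a sketch rather than a tested Episode replacement; verify that the resulting QuickTimes fast-import cleanly, and that your timecode survives the trip, before you trust it with a real job:

    import pathlib, subprocess

    SRC = pathlib.Path("striped")    # timecode-striped H.264 camera files
    DST = pathlib.Path("offline")    # DNxHD 36 QuickTimes for fast import
    DST.mkdir(exist_ok=True)

    for clip in sorted(SRC.glob("*.MOV")):
        subprocess.run([
            "ffmpeg", "-y", "-i", str(clip),
            "-c:v", "dnxhd", "-b:v", "36M",         # the DNxHD 36 flavor
            "-vf", "scale=1920:1080", "-r", "24",   # match your finishing format
            "-pix_fmt", "yuv422p",                  # 8-bit 4:2:2
            "-c:a", "pcm_s16le",                    # uncompressed audio
            str(DST / clip.name),
        ], check=True)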

(Oh, another note: Download and install the Avid codecs package. It’s free, and lets you both play back and encode DNxHD on any Mac.)

However you do it, get all your sources transcoded to offline format, then import them into your bins. If you’re not using script-oriented file names, you’ll want to take the extra step of logging the scene/shot/take information in Media Composer so you can stay organized. You can do that while you’re reviewing the footage.

Edit

Do I really have to explain this? Do your damn job. Create a rough cut by picking takes and laying them out in script order, then refine it until you’re happy. Just go … I dunno. Be an editor.

However, bear this in mind: You are editing right now. You’re not mixing sound. You’re not doing visual effects or compositing. You’re particularly not doing any color grading. By the time you’re done, you should have only one video track on your timeline (unless you’re using a gap effect on V2 for your timecode and metadata burn-in for review and approval, which you should be doing, but that’s another conversation). Keep your eye on the ball here. This is the part of the workflow where you’re an editor and not anything else.

Export a linked AAF for conforming

Once you’re done with the edit and have a locked timeline — for whatever working definition of “locked” applies in your situation — you aren’t done with the job. Your next step is to output an AAF. An AAF is kind of like an EDL; it’s a machine-readable timeline that you can import into other programs to do other things. In our case, we’re going to import it into Resolve to do our conform. More on that later. For right now, we’ll just focus on getting it exported correctly.

The first thing you need to do is commit any group edits you might have on your timeline. I mention this specifically because it will screw you up in the next step if you’re not careful. Subclips are fine, such as those created by AutoSync, but group and multigroup edits must be committed with the “commit multicam edits” function before you export.

Media Composer comes with about a zillion export presets, but they confuse and scare me, so I always create my own. I just call it “AAF (edit protocol),” because it creates an AAF with the edit protocol option enabled. I’m not creative.

This is how I set up the video part of my export preset:

[Screenshot: AAF export settings, video tab]

And here’s how I set up the audio part:

[Screenshot: AAF export settings, audio tab]

The key details there are that I turn all of the rendering options off. I don’t want Avid doing any rendering at all, because I’m going to deal with all that stuff later.

Exporting an AAF with these settings takes like zero time. It’s effectively instant. Which is good, because we don’t like waiting, do we?

Export an AAF for sound

Now is the time to talk about the small matter of sound. I’m going to assume that you, like me, are an editor who sucks at sound. Seriously, I’m just terrible at it. Every time I try to do any kind of audio sweetening or add filters or effects I end up making everything sound like it’s being heard over a telephone line. Underwater. During the Battle of Britain. I’m just awful at it.

That’s why I always send any real sound work out to the experts. And what the experts always want is either an OMF or an AAF, but AAFs are easier to deal with. Here’s how I export an AAF for use in Pro Tools:

[Screenshot: AAF export settings for Pro Tools]

But note: These may not actually be correct or sensible! I just do what’s worked in the past. Talk to your audio guy about what he or she needs. Because seriously, your audio guy is your best friend. We couldn’t survive without those folks. So be nice, be accommodating, and send ‘em Christmas cards or something, jeez.

Conform in Resolve

Okay, so now that we’ve locked picture and output AAFs for picture and for sound (and hopefully said please and thank you when delivering the audio AAF to the sound guy), it’s time to conform our timeline to our high-res media.

But hang on a second. We didn’t actually make high-res media. All we did was stripe our original H.264s with timecode, then convert them to low-res DNxHD files for offline editing. How can we conform to the high res when we didn’t actually make high res?

That’s where Resolve comes in.

See, Resolve is a full-featured system for doing DI — that’s “digital intermediate,” which is now the jargon term of choice for conforming, grading and just generally finishing the picture on a project. We’re going to criminally underutilize it … but that’s okay, because it turns out Resolve is free.

That’s right, Blackmagic Design, the company that makes the product, perhaps unwisely chooses to give away a “Lite” version — no 2K or up, no multiple coprocessor boards, etc. — for literally no money, and you can download and install it on any Mac you have. Can’t beat that with a stick.

But of course, it couldn’t just be that easy. It turns out Resolve is actually quite tricky to use. It’s got a learning curve. I’m not going to try to tell you everything there is to know about Resolve here; I’m just going to tell you what works for me. Refer to the manual for more information, cause I literally don’t know anything other than what I’m about to say.

The first step — after you’ve set up your project and all that; see the manual — is to go to the Browse screen and load your original, striped media files into Resolve’s media pool. This is pretty straightforward; you just point the program at the files — which is not straightforward, but again, see the manual — and tell it to load all the clips into your pool. Poof, done.

Next, you go to the Conform screen and load the AAF you exported earlier. Resolve will bring in the timeline and automatically link it — based on the source file names — to the clips you loaded into your media pool.

Now, there are a couple gotchas here that you should take pains to avoid. First and foremost, Resolve will not be able to link your timeline up to your clips correctly if your clips don’t have timecode on them. That’s why we put timecode on way back in the beginning, with Grinder. It’s essential.

Secondly, Resolve cannot import AAF files that include grouped clips. I mentioned this before. So commit your multicam edits before exporting your AAF.

And third, Resolve really wants the camera media files to have the same file names as your offline media files. If you used the AMA consolidate/transcode method of importing your media to offline, you don’t have to worry about this; Avid is smart enough to keep the source file parameters in sync for you. But if you batch-transcoded, like I prefer to, you need to take extra care that your offline and online media clips all have the same file names before you get working. Otherwise you’ll have a headache when it’s time to conform.
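
Because one mismatched name can quietly wreck a conform, it’s worth a quick sanity check before you fire up Resolve. Something like this hypothetical little script, which just compares the two folders (adjust the paths and extensions to your setup):

    import pathlib

    offline = {p.stem for p in pathlib.Path("offline").glob("*.MOV")}
    camera  = {p.stem for p in pathlib.Path("striped").glob("*.MOV")}

    for name in sorted(offline - camera):
        print("offline clip has no camera original:", name)
    for name in sorted(camera - offline):
        print("camera original never made it to offline:", name)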

Assuming you avoided all those pitfalls, you should basically be done conforming. Resolve does it all for you automatically, as long as your clips have unique names, as long as they have timecode, and as long as your AAF is sensible to the program.

Render ProRes files

At this point, if you were a colorist, you’d do color … stuff. You know, making things pretty and whatnot. You still can if you want, and more power to you if you do, but that’s not actually why we’re here. We’re here to render these conformed clips out to ProRes files that we can then work with in After Effects.

A little math here: The media files you get off a DSLR are compressed with the H.264 codec and hover somewhere around the 50-megabits-per-second range. ProRes 422 LT is a QuickTime codec that hovers around the 100-megabits-per-second range. In essence, this means we can take the frames out of that 50 Mbps sack and put them in a 100 Mbps sack with no practical loss of quality; ProRes 422 LT has more than enough headroom to preserve everything your camera gave you.
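
If you want numbers on that, the back-of-the-envelope version goes like this:

    for name, mbps in [("H.264 out of the camera", 50), ("ProRes 422 LT", 100)]:
        gb_per_hour = mbps * 3600 / 8 / 1000   # megabits/sec -> gigabytes/hour
        print(f"{name}: ~{gb_per_hour:.0f} GB per hour")
    # H.264 out of the camera: ~22 GB per hour
    # ProRes 422 LT: ~45 GB per hour

Double the bitrate, double the disk. Again, the key point is the headroom: the ProRes sack is simply bigger than what’s in the H.264 sack.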

That would not be true if your camera were spitting out something like R3D media, just to pick one example. R3D media files have more data in them than ProRes 422 LT files can hold, so you can’t convert R3D to ProRes 422 LT without suffering some loss of quality. (Whether that loss of quality matters to you, or is even noticeable to you, is a conversation for another day.)

ProRes 422 LT is also fast, fast as lightning even on a modest system. That means we’ll get to spend more time working and less time waiting when we get into After Effects for our online.

So we’re going to render out ProRes 422 LT files of all our selects. Resolve makes this trivially easy. I mean seriously, it’s almost one-button easy. But we’re going to push a few extra buttons just to smooth some things out.

Start by going to the Color screen and hitting ⌘-R, which is the shortcut for “render.”

By default, Resolve wants to render your whole timeline. That’s good; that’s what we want. But you can also tell it to render just one shot, or a timecode range, if you prefer. That’s helpful for situations in which you rendered out a couple hundred shots but one of ‘em had a problem. It’s easy to just click the one you need to rerender and be done with it.

Here’s how I set up my renders:

[Screenshot: Resolve render settings]

First thing to note is the rendering mode toggle: It’s set to “Source,” not to “Target.” The target mode renders out the whole timeline as one long … whatever. DPX or EXR sequence, usually, or QuickTime movie in this case. But we don’t want that. We want each individual shot on the timeline to be rendered out as its own thing. Hence we set it to “Source.”

Next, file naming. You probably used different parts of the same take on your timeline. You start on this guy’s close-up, cut away to something else, then cut back to the guy’s close-up again. If that particular take is A001_C003 or whatever, by default Resolve will render out the frames from the first use of that take to a QuickTime called A001_C003.mov … then later come back and render out the frames from the second use of that same take to a QuickTime called A001_C003.mov again, overwriting the first one! That’s lame, but there’s an easy workaround: tick the box next to “Render clip with unique filename.”

Next, handles. Handles are extra frames on the head and tail of each shot. They’re purely optional; you don’t require them for this workflow. But I always include them, because it doesn’t cost much (a few seconds here and there in the rendering, a few megabytes here and there on disk), and I feel safer knowing they’re there if I need them.

Make sure your frame rate and output type are right (Resolve defaults to 24 fps, and 10-bit DPXs), then give it a destination and hit “render.”

On my little Mac mini render node, which is just about the slowest possible computer for this, I get between five and ten frames a second. Which isn’t bad, frankly. It could be a lot worse. In my environment, I know that a ten-minute timeline will take a bit less than an hour to render. That’s a known quantity, and I can plan for it. If you have more computer to throw at it, of course it goes faster.

Anyway, once all your shots are rendered out to ProRes, the last step in the conforming process is to export an XML from Resolve that links to the new media files. This is trivially easy: Just go back to the Conform screen, and right next to the “Load” button you used to bring your AAF in in the first place is an “Export” button. Give it a name and path, and choose XML from the list of drop-downs, and you’re done.

Online in After Effects

Now that you have rendered media with handles and an XML, we want to pop over into After Effects. In advance, you should’ve installed the Pro Import AE plugin from Automatic Duck. It’s a (previously $500, now free) plugin that lets After Effects create compositions from AAF or XML timelines. It links to whatever media files the AAF or XML file points to, and gives you a comp that you can start working on to do color or VFX or titles or whatever needs doing. It’s handy as hell.

Before you do anything else, though, you need to set up your color space correctly. If anybody knows how to make After Effects default to these settings, please let me know, cause I’m sick of changing them every damn time. Anyway:

[Screenshot: After Effects color settings]

This should be pretty familiar to anybody who uses After Effects: 32-bit color precision, sRGB working space, and linear-light compensation. Just get used to having those settings on all the time.

Now that that’s done, import the XML file you created out of Resolve. It’ll automatically link to your rendered ProRes files.

Once that’s done, you’ll end up with a timeline that looks something like this (though of course more complex and interesting, because this is just a simple example):

[Screenshot: After Effects timeline]

One layer per shot, and note that each shot has handles on it like we told Resolve to make. That way if an edit needs to be rolled a few frames in the online, you have the flexibility to do that.

Now go be an online artist. Do your color-correction (I like Colorista for this) and VFX (Mocha AE is a damn good planar tracker) and titles and credits (pro tip: lay them out in InDesign [I know!] and then export PDFs; they’re easier to work with by a mile than the After Effects title tool).

Because you’re using ProRes media files, courtesy of Resolve, After Effects is gonna go just about as fast as it can go. It’s not gonna be as real-time-interactive as something like Nuke would be — cause Nuke is practically supernatural — but it’s gonna be quick.

Render out your final

Once you’re done being an online artist, it’s time to pick a format to render out to. What you’re producing now, out of After Effects, are your final, finished frames. (We’ll add the finished audio mixdowns later.) So you basically have two useful choices: a ProRes 4444 QuickTime, which is 12-bit gamma-encoded and lossy-but-not-very, and a half-float Piz-compressed EXR sequence which is lossless. EXR sequences are better in basically every way; they’re lossless but fairly small, at only about 6 MB per frame for HD, and they’re sequences which makes it easy to rerender just individual frames if you need to. ProRes 4444 QuickTimes, however, have their own advantages: They’re much smaller than EXR sequences, and they can be easily brought into Final Cut Pro to lay on final audio mixdowns (see below).

Me? I’m basically messed up in the head. I render EXR sequences of my final timelines … then I bring those EXRs back into After Effects and output them as ProRes 4444 QuickTimes. Why? Because I like having those sequences as my final work product. I know I can always go back to them if I need to without having to reopen After Effects (or God forbid, go all the way back to Avid) and hope the media linked up and my plugins are licensed and all that. But really, we’re getting into matters of personal taste here, so I’ll just say that’s how I do it and leave it at that.

Lay down the audio

So now you have your ProRes 4444 QuickTime (made from your EXR sequence master, if you’re cool like me), and the audio mixdowns your sound guy has graciously sent you. What to do? Well, the best tool for this job is an old one you may or may not still have laying around: Final Cut Pro.

See, Final Cut Pro has this weird ability that no other similar tool has: It can read the frames out of a QuickTime file and then write them back out to disk without ever decoding them. Meaning you can drop your picture-only ProRes 4444 master file on a Final Cut Pro timeline, sync up your audio mixdowns, trim off your slates or two-pop or whatever else you might need to trim off, and then export the result without re-encoding the actual frames. That means the process is very fast, and more importantly, completely lossless.

If you don’t have Final Cut Pro at your disposal … well, I do, so I’ve never bothered to come up with a workaround. I guess you could put your audio in with After Effects when you convert your EXR sequence to ProRes 4444 the first time. That gets into implementation details that are beyond the scope of this little blahg, so I’m just gonna leave figuring that part out in detail as an exercise for the reader.

Anyway, long story short … you’re done. Your final product is whatever deliverable you needed to produce, with an EXR sequence (or ProRes 4444 QuickTime if you’re a loser) as an intermediate master product you can always go back to if you need to in the future. It’s good stuff.

Variations

Now, the whole thing with Resolve seemed kinda needlessly complex, didn’t it? I mean, After Effects reads H.264s natively, right? Can’t we just bring an AAF into After Effects and do our conform there, linking to the original camera H.264 files?

Well, yes and no. You can do a workflow quite like that, but you don’t do your conform in After Effects. You actually do it in Avid, after you lock picture but before you make your AAF. You do it by AMAing all your takes in their H.264 form, then relinking your timeline to those AMA files. Then you can export an AAF which After Effects will read directly, linking automatically to the camera QuickTimes.

This is quirky, though. First of all, for whatever reason my combination of After Effects and Pro Import AE read AAFs slightly wrong; it reads them as having a non-square pixel aspect ratio. That’s easy to fix — you change the PAR in the composition settings for your timeline, then you reinterp one of your camera QuickTimes as square-pixel and copy-and-paste that interpretation onto the other QuickTimes — but it’s a thing you have to do, and I like not doing things more than I like doing things.

But the bigger issue is that After Effects is slow to read H.264s. It’s just sluggish as hell. If you’re trying to be creative — not editing-creative now, but like color and visual effects creative — you don’t want slow. You want fast. So having your media be ProRes is better than not.

If that’s the case, though, why not batch-transcode everything to ProRes right out of the gate? After all, the first tool we used was Grinder, an application designed specifically for that. Can’t we just grind everything to ProRes?

Sure, you can … but that takes time and disk space that we don’t actually have to invest. Doing your conform in Resolve and then rendering out QuickTimes of just your selects (with handles) is a lot faster and more parsimonious than transcoding a zillion frames hardly any of which you’re going to use in your edit.

But the more important benefit of using Resolve this way is the flexibility it gives you. In this little story, I described writing out ProRes 422 LT QuickTimes as my online format. I didn’t have to do that. I could’ve written out DPX or EXR sequences instead. I chose not to specifically because I’m onlining in After Effects here, and After Effects is kind of a little bitch about linking to image sequences. If you wanted to use sequences in After Effects, you’d need to do a second conform there, by hand this time, overcutting your AAF timeline with sequences for each shot. Doable, but tedious in the extreme.

But what if you’re not onlining in After Effects? What if you’re onlining in Smoke? No problem. Just have Resolve spit out 10-bit DPXs instead. Those go straight into Smoke, and you’re laughing. Or what if you are onlining in After Effects, but you need to send like three shots over to a VFX guy who’s going to do some work for you to incorporate later? Easy. Just render everything to ProRes, like we talked about here, but then render just those shots out of Resolve to DPX or EXR sequences and send them to your VFX guy. He works on your frames and gives you back similar sequences as his finished product, which you overcut into your online timeline just like normal.

Now, I’m not saying this workflow’s for everybody. It’s highly idiosyncratic, and tuned to my particular set of needs using my particular set of tools. But it works well for me, and I thought maybe somebody might get something out of reading about how I do stuff. If nothing else, people smarter than I can get frustrated and send me vitriolic emails about how stupid I am, and then I’ll have learned something.


Mar 31

Ghetto brute-force color temperature correction

I published when I meant to save as a draft. There’s not actually supposed to be anything here.


Mar 22

Newton’s third law

This is not advice, and it’s not advocacy. I’m not going to try to convince you of anything. This is just me writing down what’s in my head, talking about a decision I’ve come to on my own for my own reasons.

So no more hate mail, please. Seriously.

A couple days ago I wrote at some length about a very serious problem I ran into with Adobe Premiere. Short version? Premiere incorrectly computes 24p timecode under certain conditions. Certain not-ubiquitous-but-incredibly-common conditions, it’s worth pointing out.

I got some surprising responses to that little blahg.

First, the “community” — hate that word, but I don’t have a better one — of post professionals who use Premiere apparently went kinda apeshit. It is my understanding, through secondhand knowledge, that the timecode bug became a very hot topic on the Premiere CS 6 beta mailing list, with opinions of all sorts being flung about, and an eventual consensus emerging that took the form “For the love of all that’s good and holy in this world, fix this fucking thing.”

I’m paraphrasing, of course.

Second, I got a larger-than-I-would’ve-expected number of emails asking me why anybody should care about an antiquated concept like timecode in the modern era of digital cinematography anyway. I said some things about that yesterday, and I won’t repeat myself here.

The third response I got from my little blahg was this: I got a call from a senior Adobe Premiere developer. We had a nice chat on Tuesday night about the problem. As I sat here on my system, he sat there on his, and we reproduced the problem together. His initial reaction was, in short, “Huh. That’s weird. Lemme call you back.”

Now, to be fair, I shouldn’t have been surprised by this. Adobe has cultivated a reputation over the past year or so for being a company that does stuff like that. Stuff like call up random Internet weirdoes who gripe about their products to chat with them in a friendly and helpful way. That’s apparently a thing Adobe does now, and I’m all for it.

So when that guy — I’m not naming names or being specific about job titles out of respect for people’s privacy — called me, my reaction was less “Woah, wow, this is unprecedented!” and more “Yup, that’s par for the course for these guys, let’s see what happens next.”

Cause see … openness is neat. Being transparent with your customers is neat. Establishing a culture of two-way communication between customer and vendor is neat. But in the end, it’s really just all talk unless that openness and transparency and two-way communication results in products that actually work.

So the big question for me was whether this apparent glitch I’d found, first of all, was actually a glitch and not just me being stupid (that turned out to be a yes), and second just exactly how Adobe would end up fixing it.

The best-case scenario, of course, was that the developers hammered out a super-quick fix they thought would work okay, then threw a build with that fix up on the FTP site for me to download and try. That’s what I … okay, expected is too strong a word there, but that seemed like the natural outcome for me. According to an email I got shortly after our phone call, such a build actually existed internally at Adobe within hours of my posting my rant to the Internet. So like … halfway there, y’know?

Except that didn’t happen. Okay, fine, I’m sure the software guys have these elaborate regression-test rigs they use internally, and there’s a good chance the quick-and-dirty build this guy made to test the fix wouldn’t run outside that environment anyway, for dependency reasons. That tends to happen in these kinds of situations, so that’s okay.

But of course, that just leads naturally to the next question. Clearly the fix for this is going in version 6. That’s just obvious. Version 6 is in beta right now, and this is what the beta-testing process is for: identifying huge, show-stopping flaws and fixing them before the product is released. So … yeah. Duh. So that wasn’t in doubt. What was in doubt was how long it would take Adobe to release a patch for version 5.5 that fixes this critical bug.

And, more selfishly, whether that patch would drop this week so I could take advantage of it on the project that started this whole big mess.

Well, yesterday afternoon I got the answer.

And that answer was “no.”

Again, I will not name names or give direct quotes. But I’ll paraphrase as succinctly as I can: This bug, according to my new friend at Adobe, will not be fixed in version 5.5 or any earlier version. Did you see the big press release Adobe sent out announcing that CS 5.5 and all prior versions had been retroactively end-of-lifed and were now out of support? Yeah, me neither. We both must’ve missed it, because that’s what’s happened: If you are an existing Adobe Premiere user of any stripe, you are currently — whether you’ve had to be aware of it or not — working around at least one absolutely critical show-stopping flaw in the program … and that flaw will not be fixed. Adobe has the fix, they know how to fix it, they’ve even fixed it internally. They’re just not going to release that fix to anybody. Ever.

This, I think we can all agree, sucks. It sucks if you’re a Premiere user who’s had to deal with this bug, and it also sucks (though less empirically and more philosophically) if you’re a Premiere user who hasn’t yet encountered this bug.

But if we’re really unreasonably generous, we can step back and say … okay. With a heavy sigh. Ours is not a perfect world. There are ten software guys who work on Premiere — a number I made up just now for illustration purposes — and all ten of them are working overtime on version 6 which is to be released imminently. Taking them off of 6 and putting them on patch duty for versions 5 and 5.5 would push back the release date of the new version, and since the bug’s fixed in that version anyway, and since market research shows that every last one of their customers is going to upgrade on day one, rushing a patch out for software versions that are only going to be used for another two days doesn’t make any sense, economically speaking. Right?

Except there are some problems with that line of reasoning. First, version 6 is not two days away. I don’t know if anybody knows how far out it is, but it’s not shipping on Saturday, that’s for sure. It’s probably most reasonable to guess that it’s months away. They’ll probably show it off publicly at NAB, then ship it in July or September or something, would be my guess. So that’s anywhere from weeks to months (depending on whether you’re in the beta program) of having to live with this same critical bug.

Next, versions 5 and 5.5 aren’t going to evaporate like morning dew in the swift sunrise of version 6. Will some people upgrade immediately? Doubtless! Will others upgrade later? Of course. And some — many, I think it’s safe to say — won’t upgrade at all until a good reason to arises.

A good reason like a critical bug existing in the version they’re currently using, and the version 6 upgrade being offered at a discount for a limited time, so you better act now, have your credit card ready?

Perish the thought.

Point is, “It’s fixed in the paid upgrade” only applies to people who buy the paid upgrade. As much as I’m sure Adobe wants that to be everybody, it’s not going to be. Hell, one of my housemates still uses CS 3! Why? Because he hasn’t run into any critical, show-stopping bugs.

Yet.

So I’m understandably unsatisfied, I think. Here’s a really bad bug. Here are simple steps to reproduce it. Here’s me on the phone with you walking you through it. Here’s you fixing it within hours. And now here’s you saying you won’t release that fix except as part of a paid upgrade that isn’t available yet.

In other words, Adobe is essentially saying “We’re very sorry you had a horrible experience with our product. Please give us $600 now.”

Not a great sales pitch, guys.

But of course, the story does not end here. Because you see, I held something back. Something I’m probably not supposed to say publicly, but I never signed an NDA, so I’m throwing caution to the wind on this one. I think it’s that important.

The thing I’m holding back is this: They’re not fixing it in version 6 either.

Yeah. That’s right. Not only do they currently have a fix for the current version in house, according to what I was told, which they will never release, but they also won’t apply that fix to version 6 before it ships. Instead, they may — this was not a commitment, but merely a statement about what is possible — include the fix in some future, unspecified, currently unplanned patch release to a version of the software that hasn’t even shipped yet.

Critical bug. In a foundational part of the software. Makes the software completely unusable in many common workflows. And they’re just not gonna bother to fix it now, and they’re just not gonna bother to fix it next time around, but they might choose to fix it by Christmas. If you buy the paid upgrade. Which you already fucking know will still be broken.

I said up front this is not an attempt at advocacy. I’m not trying to convince you of anything. I’m just telling you what I think.

And I think I’d be ashamed.

I’ve never walked a mile in the shoes of any of the Adobe guys, fair to say. But I’ve walked plenty of miles in shoes that look and feel awfully similar to theirs if you squint a little. And I sincerely think I’d be ashamed if I were in their position.

But there’s no sense crying over it. It is what it is, and now it’s up to me to decide what to do about it. And while it’s still all very new, my tentative decision is this:

I’m just not going to use Premiere any more. Period.

I was emailing back and forth with a good friend recently — hi, K., awkward and dorky wave — and she really put it perfectly. She said it’s about trust. Neither of us trusts Premiere. If we drop a piece of footage into Premiere, we cannot trust that Premiere is giving us the right timecode, or the right frames for the timecode. Maybe it is and maybe it isn’t, because we know for a fact that sometimes it doesn’t. And at least for me, that’s as good as never getting it right, because the seed of doubt has been planted.

Yes, Premiere has some neat things going for it. But you know what? We lived without real-time playback of compressed media for decades. We can live without it again. And After Effects integration on the timeline? Sure, it’s kind of neat, kind of a time saver, but it’s not like it’s the best thing ever. Especially considering the overwhelming majority of working editors out there often ship their After Effects work over to another guy so they can focus on editing anyway. It’s handy to be able to bounce a shot over to AE with a couple clicks, but it’s not like that’s the only way to do compositing or whatever.

So the bottom line is that all the neato-whiz-bang features in the world can’t make up for a basic, fundamental lack of trust. I don’t trust Premiere. My friend K. doesn’t trust Premiere.

I don’t think you should trust Premiere.

But like I said, I’m not trying to convince you of anything. I’m just telling you what I think.


Mar 21

But Jeff, why?

The blahg I blahged yesterday was really intended for an audience of one. So I didn’t go out of my way, when writing it, to explain a lot of the rationale behind the various givens that went into it.

Since I posted it, I’ve gotten some entirely reasonable and sensible questions about some of those givens. So lemme share a few thoughts not about the problem I ran into yesterday, but rather the rationale for doing things the way I do them that led me to that problem in the first place.

Let’s start by talking about film. You remember film, right? Probably read about it in books. It used to be this way of recording moving pictures.

Film workflow is fairly standardized, even in this day and age. Unexposed film negative comes on reels which get inserted into a special piece of camera gear called a magazine. This magazine gets attached to the camera, the film runs through it to be exposed, then the exposed (but still undeveloped, and thus light-sensitive) negative gets taken up on another reel inside the same magazine.

A film magazine, therefore, can be thought of as a contiguous length of film negative, including (in most cases) many takes of the same scene. Due to the nature of how film cameras work, each take butts right up against the take before it and the take after it; there’s no “blank space,” as it were, between takes on the same reel.

Thus we have our basic terminology: mags, rolls and reels are all essentially the same thing. They’re contiguous pieces of footage straight out of the camera.

When shooting on videotape — remember videotapes? — the same basic principle applies. You put a blank tape in the camera, roll a few seconds or minutes, stop the camera, reset to one or whatever, then restart the camera and roll a bit more. At the end of the day, you have a tape with multiple takes on it, one right after the other. Thus, you often find that in editorial the terms mags, rolls, reels and tapes are used interchangeably.

But let’s go back to film for a second, because clearly something has to happen between the part where the images are on film and the part where they’re in your computer being edited. Film reels — mags, rolls, whatever — are typically transferred to videotape all in one go. We said earlier that each reel has multiple takes on it; these takes are transferred to tape in one long pass. But here’s the cool part: The timecode on the tapes is linked to the reel number of the film being transferred.

For instance, reel one gets put on a tape that starts at 01:00:00:00. Reel two gets put on a tape that starts at 02:00:00:00. And so on, right on up to 23 (or 24 if you really want; that would be 00:00:00:00, which most people are disinclined to use for a wide variety of reasons).

Why do it that way? Because if you’re dealing with 23 or fewer camera reels, then every single frame of raw, unedited footage you’re working with gets a unique timecode number. All the stuff from reel six has timecode that starts with 06; all the stuff from reel 13 has timecode that starts with 13. No two frames in the entire project have the same timecode.

Okay, but why is that important? Because it eliminates ambiguity entirely. In a setup like that, you’ll never find two frames numbered 03:22:57:06. There’s only gonna be one frame with that number, period, end of paragraph. That means when your EDL (or whatever your end product is) refers to frame 03:22:57:06 there’s no chance of getting that wrong — well, I mean aside from all the obvious technical and human-error problems that can come up, but we’re pretending both we and our tools are perfect for purposes of this discussion.

Of course, in the real world there are plenty of shows that don’t have 23 or fewer reels. In those cases, you need to track things like reel number, tape number and all that separately. NLEs typically use a combination of reel number and timecode to uniquely identify frames; you might have more than one frame with timecode 03:22:57:06, but only one of those frames is gonna be on reel 27. The others are on reels 81, 136 and 2,768 respectively. So in those cases, it’s the combination of reel number and timecode that uniquely identifies a frame … and if you’re a database geek, you’re probably thinking in terms of compound primary keys right now, and that’s good, because under the hood that’s really what we’re talking about.
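
If that database analogy helps, the whole thing boils down to a lookup table keyed on (reel, timecode). This is a toy sketch, with made-up frame descriptions:

    # (reel, timecode) -> which frame we mean; neither part is unique on its own
    frames = {
        ("27",  "03:22:57:06"): "MVI_1042.MOV, frame 217",
        ("81",  "03:22:57:06"): "MVI_3317.MOV, frame 88",
        ("136", "03:22:57:06"): "MVI_0006.MOV, frame 1402",
    }
    print(frames[("27", "03:22:57:06")])   # exactly one answer, no ambiguity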

But once again you ask, why go to all this trouble? What’s the big deal?

The big deal is this: Fundamentally being an editor is about communication. I don’t mean in the corp-speak “synergize our core competencies” sense of the word. I mean it literally: It’s about communicating a list of pictures. At the end of the day, the editor’s job is to say, “Gather round and attend, for here is the list of pictures that make up our show: From the reel of our Lord 37: picture the fifteenth, picture the sixteenth, picture the seventeenth, picture the nineteenth (for we removeth one frame to make the punch land harder), picture the twentieth” … and so on and so on and oh my god kill me now.

That’d be a valid way of communicating a list of pictures. It’d also be a completely stupid way of communicating a list of pictures, because it’d take so long we’d all be dead by the start of the second act. So instead we use shorthand. Shorthand like this:

001  8   AA/V  C        08:21:19:00 08:21:19:09 01:00:00:00 01:00:00:09
002  4   AA/V  C        04:00:15:00 04:00:16:02 01:00:00:09 01:00:01:11
003  7   AA/V  C        07:01:00:13 07:01:01:07 01:00:01:11 01:00:02:05

You may or may not know what you’re looking at there; depends on whether you got into editorial before or after, say, 2005. That there is an EDL: an edit-decision list. It’s a very simple way of communicating a list of frames. It breaks down into columns like this:

  • Event number
  • Source reel
  • Tracks
  • Edit type
  • Source timecode in
  • Source timecode out
  • Record timecode in
  • Record timecode out

Remember how we said the combination of reel and timecode uniquely identifies a frame? That’s why we can say “8 08:21:19:00” instead of “that shot of that girl looking squinty, no, not quite that squinty.” The reel-plus-timecode points to exactly one frame in the entire job, so we know once we’ve found a frame with those identifiers on it, we’re looking at the correct frame.
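
Here’s how little machinery it takes to read one of those events, sketched in Python. It assumes the plain cuts-only layout shown above; real CMX-flavored EDLs have title lines, comments and fancier edit types that this happily ignores:

    def parse_event(line):
        # Eight whitespace-separated columns, in the order listed above.
        event, reel, tracks, edit, src_in, src_out, rec_in, rec_out = line.split()
        return {"event": event, "reel": reel, "tracks": tracks, "edit": edit,
                "src": (src_in, src_out), "rec": (rec_in, rec_out)}

    e = parse_event("001  8   AA/V  C        08:21:19:00 08:21:19:09 01:00:00:00 01:00:00:09")
    print(e["reel"], e["src"][0])  # 8 08:21:19:00 -- the compound key again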

"Okay, fine, whatever, old man. All this talk of film cans and EDLs … it’s like we’re back in the dark ages. What does this have to do with me, a hip young editor with a 7D and a dream?"

It’s just this: Some cameras, the 7D being chief among them, record neither reel numbers nor timecode while they’re running. In that example above, the first event in the EDL refers to reel 8, timecode 08:21:19:00. That identifies a unique frame; no other frame in the whole project has those identifiers on it. But those identifiers are only there because I put them there by hand. When that frame came out of the camera, it was reel 0, timecode 00:00:00:00. And since there are about 400 shots in this show, I had 400 different frames on my computer that were all called reel 0, timecode 00:00:00:00. They were all in different QuickTime movies — MVI_2237 or whatever — but they all had the same timecode and reel numbers. So if you said to me “reel 0, timecode 00:00:00:00,” I’d have to go “Which one?” Cause I had hundreds of frames with those exact identifiers.

Remember: Being an editor is about communicating. And communicating is about being unambiguous. Two frames called “0 00:00:00:00” is bad enough; four hundred of them is absolutely intolerable.

So I grouped the QuickTimes into folders by scene — the scenes and shots fortunately having been coded for me in the shooting script — and then named them by scene, shot and take. So the QuickTime corresponding to scene 8, shot 8C, take 3 on the A camera would end up being in a folder called “8” and named “8C-3-A.”

Why code the takes this way? Because it makes it possible for me to sort them in alphabetical order and stripe them. That’s where QTchange comes in. Point it to a folder of QuickTimes and it’ll put reel numbers on them, first of all, but it also puts timecode on each shot. As I wrote previously, you can use time-of-day (or “free-run”) timecode, but I’m now even more inclined than I was before to use rec-run timecode instead. So you start with a list that looks like this:

Name          Reel    Start          End
8A-1-A.MOV    0       00:00:00:00    00:01:26:15
8A-2-A.MOV    0       00:00:00:00    00:01:59:03
8A-3-A.MOV    0       00:00:00:00    00:01:03:12

And you end up with a list that looks like this:

Name          Reel    Start          End
8A-1-A.MOV    8       08:00:00:00    08:01:26:15
8A-2-A.MOV    8       08:01:26:16    08:03:25:18
8A-3-A.MOV    8       08:03:25:19    08:04:28:13

(Note: I did that timecode math in my head just now, so odds are it’s all wonky. Take it as an illustration rather than a real example, kay?)

The first list should make any editor with any experience under his belt feel like throwing up. The second list, by contrast, looks totally normal: The timecode hour is the reel number, and the timecode is sequential and monotonic. Every frame is uniquely identified by reel number plus timecode for that frame. All is well. We can communicate.
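
For the curious, here’s the counting QTchange does on my behalf, sketched in Python. Writing the actual timecode track into a QuickTime is QTchange’s department, not this script’s; I’m assuming 24 fps non-drop, and the durations come straight from the first table (so where this disagrees with my head-math by a frame, trust the script):

    FPS = 24  # frames per second, non-drop

    def tc(frame):
        # Render an absolute frame count as HH:MM:SS:FF.
        s, f = divmod(frame, FPS)
        h, s = divmod(s, 3600)
        m, s = divmod(s, 60)
        return "%02d:%02d:%02d:%02d" % (h, m, s, f)

    reel = 8
    clips = [("8A-1-A.MOV", 2080),   # clip durations in frames,
             ("8A-2-A.MOV", 2860),   # taken from the first table
             ("8A-3-A.MOV", 1525)]
    frame = reel * 3600 * FPS        # the reel number becomes the timecode hour
    for name, length in clips:       # already sorted alphabetically
        print(name, reel, tc(frame), tc(frame + length - 1))
        frame += length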

Now, given the number of people who’ve asked me what this whole deal is about since yesterday, I can only assume that by doing things this way — a way we could charitably, I think, call anal-retentive — I’m a weirdo. I dunno what to tell you, man. I’m not that experienced as a professional editor, but I’ve been editing in various non-professional and on-the-fringe-of-professional capacities for more than a decade now. In that time, I’ve had maybe three experiences (not counting yesterday) where a timecode or reel-numbering screwup caused me serious headaches. But those headaches were so serious and so painful that I now have a deep-seated Pavlovian response to this kind of thing. Seeing “00:00:00:00” anywhere gives me an immediate tummyache. Not having sensible timecode everywhere makes me feel like throwing up. Because I know if anything does go wrong — and fair point, it might well not — it’s gonna be a fucking catastrophe that probably involves my staying up all night three days in a row to redo a whole lot of work that I could’ve just taken a little extra time to do correctly in the first place.

So … yeah. My take on this stuff is somewhere between the voice of experience and the paranoia of the truly mentally ill. I make no implications about where along that spectrum it falls.


Mar 20

An update regarding counting

After my previous blahg I received, through a variety of channels, a significant number of comments of the form “Dude, QTchange? What the hell is that? Clearly that’s your problem right there, some weirdo third-party utility thing.”

To which I have three things to say: First, you guys aren’t wrong. Whenever a problem arises, the weirdo third-party utilities are the prime fucking suspects. You’re absolutely right to go there first.

Second, however, QTchange isn’t some weirdo bit of freeware nobody’s ever heard of. It’s absolutely standard kit for striping timecode-less QuickTimes — such as those that come out of DSLRs — to make them more useful in NLEs. If you’ve never had to use it, great, that’s awesome for you, really. But it’s not some obscure bit of whatever, either.

And third, I reproduced the problem without using QTchange at all … or in fact using any footage whatsoever. Here’s whatcha do:

  1. Open up Avid Media Composer. I’m using 5.5.3 here, but it shouldn’t matter.

  2. Create a new project: 1080p23.976.

  3. Create a new sequence in your default bin. Do nothing to it. Don’t edit anything into it at all; just leave it completely blank.

  4. Export that sequence — yeah, the one with literally nothing in it — as an AAF.

  5. Open up Premiere. I’m using 5.5.2 here, which I believe is the latest version. At least it better be, since I just ran Adobe’s updatemonster thing on Friday.

  6. Create a new project. Settings shouldn’t matter here, but for the record I went with whatever the defaults are for everything.

  7. You’ll be prompted to create a new sequence. What you pick here doesn’t matter at all, because see next step.

  8. Delete that first sequence Premiere made you create. You should have a completely empty bin.

  9. Import the AAF you exported in step 4 above. It will come in just as it should: A new folder in the bin, and inside that folder a new sequence, and nothing else. Duh, because you didn’t edit anything into that new sequence; it’s just an empty timeline.

  10. Double-click that newly imported sequence.

  11. Observe that the start timecode of the new sequence, rather than being 01:00:00:00 like it should’ve been — like it was in Media Composer — is now 00:59:56:09.

  12. Refer back to my previous blahg in which I made note of the fact that a QuickTime striped with QTchange to have timecode starting at 01:00:00:00 comes in — according to Premiere — starting at 00:59:56:09.

  13. Bemoan the fact that, at least as far as 24p timecode goes, Premiere apparently can’t count.

Somebody who’s less tired than I am should feel free to do the math to figure out exactly why Premiere thinks 01:00:00:00 = 00:59:56:09 and email me the results. I don’t care that much, unless it points to an easy workaround, but I admit to being idly curious nevertheless.
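
Okay, I couldn’t resist poking at it a little after all. Here’s my guess at the arithmetic, in Python so somebody can check me. This is pure speculation, nothing Adobe has confirmed: suppose Premiere reads 01:00:00:00 not as a frame count but as one hour of wall-clock time, converts that hour to frames at the fractional rate (23.976 is really 24000/1001), truncates, and renders the result back as 24-frame non-drop timecode:

    from fractions import Fraction

    def premiere_guess(rate=Fraction(24000, 1001), display_fps=24):
        frame = int(3600 * rate)           # one wall-clock hour at 23.976 = 86313 frames
        s, f = divmod(frame, display_fps)  # render as 24 fps non-drop
        h, s = divmod(s, 3600)
        m, s = divmod(s, 60)
        return "%02d:%02d:%02d:%02d" % (h, m, s, f)

    print(premiere_guess())  # -> 00:59:56:09, Premiere's wrong answer exactly

If that guess is right, Premiere is doing a drop-frame-style reinterpretation of perfectly good non-drop timecode, which at least has the virtue of being consistently wrong.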


How saving a day cost me two days

So this friend of mine promised one of his friends he’d cut her short film for her. Life intervened and he found himself without the free time, so I volunteered to take over, since I don’t have anything else going on right now anyway.

Now, by “short film” here I mean short film. The script is only 17 pages, and it reads long, so it’s just not that big a deal. All shot on 7D, mostly MOS, with just a handful of sync-sound scenes, and the deliverable is just a QuickTime. No biggie, right?

Well, naturally I wanted to cut it in Avid. Because Avid is my girlfriend. But see, the right workflow to do this job in Avid involves AMAing all the takes, then batch-transcoding them to DNx 36 to cut, then relinking back to the AMA media once the edit’s finished. Transcoding takes a long time on my humble laptop, and I didn’t want to waste a whole day just doing that. So I decided to save myself a day by doing the job in Premiere Pro CS 5.5 instead.

Now, this job should be right in Premiere Pro’s wheelhouse. Native DSLR media, not a lot of fancy stuff, no real compelling need to collaborate with anybody else … easy, right?

Yeah, not as much as you might hope.

I started the job like you always start DSLR jobs: By using QTchange to lay timecode tracks onto the media files the camera spits out. See, DSLRs like the 7D don’t record a timecode track at all, which makes editing a bit tricky later on down the line. This is easy to work around, though, because QTchange gives you the option of quickly and easily adding a timecode track to all your media files. You can either use the timestamp the camera recorded to be the timecode of your first frame, or you can ignore that and use incrementing timecode instead. It’s basically analogous to the difference between free-run and rec-run on your camera; you can let the camera’s internal clock set the timecode (free-run) or you can say mag 1 starts at 01:00:00:00 and counts up, mag 2 starts at 02:00:00:00 and counts up, and so on (rec-run). Totally doesn’t matter which you pick (as long as the free-run method doesn’t cross the midnight line), as long as you pick something and stick with it.
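
If that distinction is still fuzzy, here’s a toy sketch of the two options. The file name is hypothetical, the camera clock is faked with the file’s timestamp, and the real work of writing the track into the QuickTime is QTchange’s, not this script’s:

    import os, time

    def free_run_start(path):
        # Free-run: the camera clock (faked here via the file's timestamp)
        # becomes the timecode of the clip's first frame.
        t = time.localtime(os.path.getmtime(path))
        return "%02d:%02d:%02d:00" % (t.tm_hour, t.tm_min, t.tm_sec)

    def rec_run_start(mag):
        # Rec-run: mag 1 starts at 01:00:00:00, mag 2 at 02:00:00:00, and so on.
        return "%02d:00:00:00" % mag

    print(free_run_start("MVI_2237.MOV"))  # e.g. 14:23:07:00, whenever the take was shot
    print(rec_run_start(2))                # 02:00:00:00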

Me, I picked the free-run method, because why not, it’s easy. Couple clicks and it was done. Easy peasy. This turns out to have been an error, for reasons I’ll get to in a sec.

The next step was to import all my takes into Premiere Pro. Now, if you’re an Avid person I might need to clarify this: “Importing” into Premiere is not like “importing” into Avid. It’s more like “importing” into Final Cut Pro, in the sense that you just tell the program “Hey, use these files here,” and it does. It doesn’t actually move any data around or transcode anything. It just refers to the files already on whatever hard drive you’re using. So it’s a quick process, generally speaking.

What I should have done next was double-check that Premiere was reading the timecode from my QuickTimes correctly. That’s what a fastidious and careful editor would’ve done. Cause the relationship between NLE timecode and source-media timecode is the heart and soul of editing; screw that up, and … well, you’ll see shortly.

Anyway, I loaded all my shots into Premiere (in the Premiere sense of “loading”) and started working. It was going okay, I got the first minute or so cut, then hit my first scene with sync sound. For reasons I won’t go into detail on here — involving Premiere Pro 5.5’s completely fucking stupid “Merge Clips” feature and the general impossibility of dealing sanely with sync sound in that program — I decided it was time to rethink my choice and consider cutting the project in Avid instead.

That’s where yesterday ended. Bad news: I had serious doubts about my ability to do the project in Premiere and stay sane at the same time, because of the sync-sound issue. Good news: I had a solid minute on my timeline, and moving that over to Avid should be fairly easy, with either an AAF or an EDL. So I slept well and soundly last night.

This morning, though … goddamn.

First thing: If you use the “Merge Clips” feature in Premiere — comparable-sort-of-but-turns-out-not-really to AutoSync in Avid — you can no longer export an AAF. Period. It’s right there in the manual and everything: No AAF or XML exports with merged clips. Well, shit.

That’s not that big a deal, though, because I’d only cut in about four sync-sound shots in one scene. I could redo those edits easily by hand in Avid. What I did want to bring over was the minute of shots before that scene that I’d spent most of my time on the day before. Surely I can just lop off the offending sync-sound scene and export the rest as an AAF, right?

Turns out … no. Not really. I can export an AAF all right, once I remove the merged clips, but that turns out to be way more complicated than it should be, for a variety of reasons I won’t bother going into here, because you’ll probably just say “Oh, you fix that by doing bonk” for each one and miss the point that by this time I just didn’t care any more. So rather than working the problem, I punted and went back to the stone age. I used an EDL.

God bless EDLs. They’re the most brain-dead, stupid interchange format in existence. Ever looked at one? There’s like no information in there. And that’s awesome because it means they’re so simple they’re damn near impossible to screw up. Right? Right?

You know how this goes. Do major amputations on your timeline: no dissolves, no V2, no none of that. Hell, just to keep life simple, strip off all the audio, because I can just replace that later with a couple quick edits. Next: export an EDL of the timeline. Take that EDL into EDL Manager, sanity-check it, then hit “send sequence.” Pick a bin, and poof, there’s the timeline, along with one 24-hour-long offline clip for each source reel that was referenced in the EDL.

Now for each source reel, use the “Modify” function to change its source to the tape name of your choice; this should match the tape name in the media you want to relink to. Click click, done, now hit “relink” and hope for the best.

Yay! Everything relinked correctly! That’s awesome! Except … erm … no, it’s not. Because it turns out these are the wrong fucking frames.

First frame of first shot in Premiere:

Premiere screenshot

First frame of first shot in Avid:

Avid screenshot

I’m looking at entirely different pieces of my shots, not the pieces I selected in Premiere when I edited this sequence the first time. Totally different frames.

And if you look real close at those two images, I bet you can figure out why. Have you spotted it yet? Give up? I’ll tell you, but brace yourself. Cause this is where it gets good. I mean really good. You ready for this?

Premiere Pro CS 5.5 cannot fucking count.

Uh-huh. You heard me. Look close at those screenshots. See how they both show the same timecode? Like, judging by the timecode alone, you’d expect to be looking at the same frame. Same clip plus same timecode equals same frame, right?

Only nope. Because the timecode you’re seeing in the Avid screenshot is right, and the timecode you’re seeing in the Premiere screenshot is just plain wrong.

Remember up yonder when I said that using free-run timecode, based on the camera clock, was a mistake on this job? Here’s why: If I’d chosen rec-run instead, wherein the first shot on the first mag starts at 01:00:00:00, I’d have spotted the problem immediately. But I didn’t, because I didn’t care what the timecode actually was; I just needed sane timecode. So I threw some random time-of-day on there and forgot about it … which meant I completely missed the fact that Premiere was showing me — the whole time, and on every single shot in the show — the wrong timecode for every last mother-lovin’ frame.

First mag, first take, shot 1A-1-B, first frame:

Premiere screenshot

As you can see here, in big friendly yellow numbers, Premiere thinks the timecode of the first frame is 01:48:51:11. But here’s the first mag, first take, shot 1A-1-B, first frame in Avid:

Avid screenshot

Observe: Same frame. Like literally the same image recorded by the camera. Only Avid thinks the timecode for this frame is 01:48:58:00. Which it is. That’s the timecode QTchange put on that frame (I went back and checked). That’s the timecode QuickTime Player 7 displays (when you select the timecode track, which is not selected by default). As seen here, it’s the timecode Avid reads. It’s the timecode Final Cut Pro 7 reads — yes, I blew the dust off my FCP 7 icon just to check this. That’s the correct timecode. That’s the timecode in the file. It’s right there in big numbers, staring up at you: 01:48:58:00. No ambiguity, no complexity, no confusion. Just 01:48:58:00.

Only Premiere is all “LOL nope, 01:48:51:11.”

To try to make some kind of sense of this, I restriped that same shot with rec-run timecode instead, in QTchange. (Yes, I have a backup of the media files, what do you think I am, some kind of barbarian?) Now the timecode on that shot starts right where you’d think it’d start for mag one, take one: 01:00:00:00.

Premiere: “LOL nope, 00:59:56:09.”

What. I’m sorry, but what.

It’s probably some kind of bizarro fucked-up drop-frame thing. It’s 24-frame non-drop timecode on 23.976 media; totally normal. But Premiere is probably trying to be clever and show me drop-frame timecode instead … or some goddamn thing. If I weren’t so fed-up right now, I’d do some math to try to figure this out … but fuck it. I’m just not going to. I’m putting my foot down, Premiere. I don’t expect a lot of an NLE. I want a lot, I desire a lot, but if wishes were horses none of us would have to walk, so I take great pains to distinguish between what I hope for and what I expect. And you know what? I expect my NLE to be able to count. That’s all, that’s the absolute bare minimum. At the lowest level, editing is nothing more than counting: Count in this many frames, make a cut. Count this many more frames, make another cut. All the rest is just scotch tape and stuff. If you can count, you can edit.

But as best I can tell, Premiere couldn’t fucking count to eleven if it took its pants off.

I’m not a happy person right this minute. If you work for Adobe and you happen to read this — because you know, Internet and shit — then please accept my apology for saying in no uncertain terms that your NLE is a fucking abomination that should be pulled down brick by brick, and that we should salt the Earth beneath it so nothing ever grows there again. Later, when I’ve spent two fucking days starting over and getting back to the point where I thought I was last night, I’ll probably have calmed down enough to express myself more diplomatically. But for right now, no. For right now, I’m just pissed.

I’m pissed particularly because at this point I have a choice to make. I can either throw all the work I did yesterday, plus all the time I spent today trying to figure out where things went wrong yesterday, in the garbage and start completely over, just eating the fact that Avid will make me transcode all these takes to DNx before I can work with them in real time …

… or I can just go back to yesterday’s save of the project file and finish the fucking job in Premiere. Knowing that Premiere can’t fucking count, knowing that I’ll never be able to get any useful machine-readable representation of my timeline out of Premiere, knowing that if my friend or his friend wants to take the timeline I create and do something else with it — like ship it off for a high-quality finish for submission to festivals or whatever — then we’re all just fucked in the ear, because Premiere cannot fucking count.

And I don’t know what I’m gonna do yet. But one way or the other, I’m gonna be in a bad mood for a while.

An update: I wrote more words.


Aug 4

Going from Avid to Premiere: all the quirks that ever quirked

So I have this small project to do, basically as a favor to a friend. I decided to do it in Avid, since I have access to one for the time being and I need the practice badly. It’s really not much; my friend has about six hours of DV footage, shot at a panel-type event. He wants it cut down to highlight reels. Easy.

Except he also wants a few artsy-fartsy things done to it. Lower thirds, titles, that kind of thing. Nothing big, but enough to make me need a miniature workflow pipeline to do the job.

My plan was simplicity itself. I’d import the footage into the Avid, make subclips based on my buddy’s paper notes, edit them together, output an AAF for each timeline, take the AAFs into Premiere, relink to the original QuickTime media, then use the Premiere-After Effects integration features to put my titles and supers on. Then render, save to a series of QuickTimes for delivery, and donezo.

So yes, it’s basically an offline-online workflow, as traditional as buttermilk biscuits. Should be no problem, right? After all, this kind of thing has been the norm since God was a boy.

Except yeah no. Quirks abound.

I did a little test today, just to make sure my workflow would turn out the way I planned. What I learned is that either the version of Avid I’m using — more on this later — has some bugs related to AAF export, or Premiere Pro CS5.5 has some bugs related to AAF import.

As you can see in yon pretty picture, I’ve got just about the simplest timeline you can imagine. NTSC, non-drop, starts at 01:00:00:00, got a half a dozen subclips strung out all in a neat little row.

Avid screenshot

This should be easy, right?

Well, the first thing you’ll notice, in the screenshot below, is that Premiere can’t count. Seriously. Non-drop timeline starts at 01:00:00:00, right? Premiere thinks it should be 00:59:56:12, which I’m sure we all recognize is the non-drop timecode equivalent of 01;00;00;00 in drop-frame.

Premiere screenshot
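
And for what it’s worth, the snark up there is numerically accurate, assuming (and this is only my guess) that Premiere is treating non-drop timecode as wall-clock time: one hour at 29.97 fps (30000/1001) is 107892 frames after truncation, and 107892 frames rendered as 30-frame non-drop is 00:59:56:12 on the nose.

    frames = 3600 * 30000 // 1001  # one wall-clock hour at 29.97 fps -> 107892 frames
    print(divmod(frames, 30))      # -> (3596, 12): 3596 seconds is 59:56, plus 12 frames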

Okay, I mean, that’s stupid, but it’s fixable. It can be fixed. You can spend five minutes digging through the manual, give up, google it, then learn that there’s a hidden menu in the timeline pane of the Premiere interface that lets you set the start timecode of your timeline. Change it to 01:00:00:00. Fine. Whatever.

But then, and I refer you here back to that same yonder screenshot above, notice the weird thing. I have six subclips on my Avid timeline:

  • NAFA Tape 2 of 6.mov.Sub.02
  • NAFA Tape 2 of 6.mov.Sub.01
  • NAFA Tape 1 of 6.mov.Sub.01
  • NAFA Tape 3 of 6.mov.Sub.03
  • NAFA Tape 3 of 6.mov.Sub.02
  • NAFA Tape 3 of 6.mov.Sub.01

What I’d hope to see in Premiere — and this isn’t an unfounded hope; keep reading — is three clips in my bin:

  • NAFA Tape 1 of 6.mov
  • NAFA Tape 2 of 6.mov
  • NAFA Tape 3 of 6.mov

Link those to the original QuickTimes, and I’m golden. What I’d tolerate seeing in Premiere is this:

  • NAFA Tape 1 of 6.mov.Sub.01
  • NAFA Tape 2 of 6.mov.Sub.01
  • NAFA Tape 2 of 6.mov.Sub.02
  • NAFA Tape 3 of 6.mov.Sub.01
  • NAFA Tape 3 of 6.mov.Sub.02
  • NAFA Tape 3 of 6.mov.Sub.03

That is, one clip in Premiere for each subclip in my Avid timeline. I mean, that’s kind of stupid, really, seeing as how half of those refer to the same media as the other half. But, like, I can work with that, you know?

But no, instead I see both. There’s a clip for each master clip that isn’t even on my Avid timeline, only referred to by subclips, and there’s a clip for each subclip!

It’s not at all obvious how to relink those.

But wait. I’m not actually done yet. Something you can’t see in that screenshot — and I can’t really figure out how to take a picture illustrating this — is what’s actually on my timeline.

Just to review, this is how my Avid timeline looks:

  • NAFA Tape 2 of 6.mov.Sub.02
  • NAFA Tape 2 of 6.mov.Sub.01
  • NAFA Tape 1 of 6.mov.Sub.01
  • NAFA Tape 3 of 6.mov.Sub.03
  • NAFA Tape 3 of 6.mov.Sub.02
  • NAFA Tape 3 of 6.mov.Sub.01

Those subclips, in that order. This is how my Premiere timeline looks:

  • NAFA Tape 2 of 6.mov.Sub.02
  • NAFA Tape 2 of 6.mov.Sub.02
  • NAFA Tape 1 of 6.mov.Sub.01
  • NAFA Tape 3 of 6.mov.Sub.03
  • NAFA Tape 3 of 6.mov.Sub.03
  • NAFA Tape 3 of 6.mov.Sub.03

It’s like Premiere almost does the right thing. It almost says “Hey, these six different clips really only refer to three different media files. So I’ll consolidate them and my operator will love me and buy me upgrades.” But it goes wrong. Because instead of naming the clips on the timeline the same as the source files, it names them the same as the first subclip from that source file it runs across. I guess. Or something.

Oh, and yes, it still throws all those other clips, which aren’t even technically on the timeline, into the bin.

Okay, well, you’d think that’d be annoying but fairly easy to work around, right? Just identify the clips in the bin that aren’t in the timeline, delete them, then relink the others to their appropriate media files. By hand. Because you hate yourself. Right?

Yeahno.

Because when I do just that, and believe me I tried, Premiere doesn’t seem to get the message. It still thinks the clips are offline. So I have linked master clips in my bin, and unlinked timeline clips in my timeline. Which really shouldn’t be possible in any modern NLE, but there we are.

Now, this is the part where I wish I could tell you I had a workaround all figured out, and it’s actually very easy, and you should try it. But I can’t. It’s after five o’clock, and I haven’t figured out a damn thing yet. All I’ve done for the past eight hours is identify the problems.

But I can tell you that whereas all day Premiere has been acting like a slightly brain-damaged puppy — enthusiastic but embarrassingly incompetent — Smoke has been all “Fuck yeah.” To wit:

Smoke screenshot

You see that right there? That’s what my screen looks like after I point to my AAF — the exact same AAF I fed to Premiere and watched it choke on — and say “import please.” Smoke looks at the AAF, figures out what the subclips on the timeline actually refer to, goes out and finds the media for me, and loads both the timeline itself and the source media. Boom, baby.

And lookit:

Smoke screenshot

Not only can Smoke count — if you look in the bottom left you can see the timeline starts at 01:00:00:00, non-drop — but check this out:

Smoke screenshot

It also kept my clip names correct, while still relinking them to the correct media files. That may not be entirely obvious if you’re not familiar with Smoke’s information-overload interface mode, but you can see the clip names — “NAFA Tape 1 of 6.mov.Sub.01” and such — at the top of each clip on the timeline, the clip name of the actual media file it’s linked to right below that, and down at the bottom the path on my drive where that source file lives. (It’s a soft-imported file, which is a bit like Avid’s AMA, if you know what that is.) Oh, and those “H” and “T” numbers on the top left and bottom right of each clip on the timeline? Those are how many frames of heads and tails I have. “Many thousands,” in both cases, because these clips are a whole camera reel’s worth, and they’re nearly an hour long each.

So my timeline came over correctly, all my clips came over correctly, and they kept their subclip names so I can keep everything straight. But the names are all they kept: the actual media underneath is the original files.

That’s how it’s supposed to work.

Now, I grant you fully that I could just be doing something totally stupid here, and that’s why Premiere is being a little bitch to me today. I only started using Premiere a few weeks ago, and Avid just days ago, so what the hell do I know. But the fact that I can take the exact same AAF file — not the same timeline, but literally the same file — and send it to Smoke and have everything do what it should makes me think this isn’t my screwup.

So I ask you, faithful readers. What am I doing wrong here? Is there a way to send an AAF to Premiere and not screw up the timeline timecode, first of all, and second not have it go bananas when faced with the terrifying prospect of a subclip? Or am I just boned here? Drop me an email or find me on twitter or something, wouldya? I’m tired and frustrated.


Jul 12

How to be a cool company, and also something remarkable

Autodesk Smoke costs seventeen thousand dollars.

Let me say that again, because I suspect it might not have sunk in the first time: Autodesk Smoke costs seventeen thousand dollars.

And that’s just for the software. That’s not counting the beefy-ass Mac Pro with the superfast RAIDs and the Kona board and the broadcast monitor and all that. Once you figure in the hardware, a Smoke system can easily top forty grand.

So it better be shit hot, is what I’m saying here.

And you know what? It is. It’s shit hot.

I first started using Smoke — and let me be very clear here: I am strictly an amateur with it — back when it was called Fire and sold by a company called Discreet Logic. It ran on a Silicon Graphics Onyx, a five-foot-tall solid black supercomputer that required its own special power feed. The hard drives were SCSI, and there were dozens of them, packed into aluminum enclosures far too heavy for a strong man to lift and bristling with unfinished edges just waiting to maim the absent-minded and the incautious. The whole system, hardware and software together, cost upwards of half a million bucks, and it only did SD, and by God, we liked it.

Today it runs on your laptop. Not well, certainly not well enough to do any production work on. But it will in fact run on your laptop. How do I know? Cause Autodesk was generous enough to send me a non-revenue copy. For nothin’. Just ’cause I asked ’em if they would.

That’s a cool company, right there. They sell a product for north of fifteen thousand bucks, and just give copies of it away for free to people who express an interest. Because they think they’ve got a good product on their hands, I guess, and they want people to know more about it.

Which is a bit of an uphill struggle, you know? Smoke is, I think I can fairly say, a niche product. It’s not general-purpose. It’s not all-things-to-all-people. It’s not least-common-denominator. It’s a very precisely targeted solution for a very precisely defined problem: creative finishing and visual effects for broadcast. It’s not the kind of thing you’d buy to put star wipes over your wedding videos with.

So why do I care? Simple: An educated person is an employable person. I haven’t had any serious stick time on a Smoke since the 1990s, and I’ve never done a paying job on one to date. But they are out there, in surprisingly large numbers, and the fresher and more up-to-speed I am on the system, the better my chance of landing a good gig on one when the opportunity comes along.

So I’m gonna be playing with Smoke for a while. On my laptop. Because it can run on laptops now. Which is one of those things I guess you’d have to have seen running on an Onyx to truly appreciate, but trust me, it’s kind of a big deal.


Jun 29

The FCP X FAQ, that petition, and what should have happened instead

So as everyone knows, Apple put some “Answers to your Final Cut Pro X questions” on their site this morning. And, well, to be honest it’s kind of insulting.

I don’t want to do a point-by-point here. It’s unnecessary and tiresome. Instead, I want to focus on just one thing. This is taken verbatim from Apple’s page:

Does Final Cut Pro X support external monitors?

Yes. If you have a second computer monitor connected to your Mac, Final Cut Pro X gives you options to display the interface across multiple monitors. For example, you can place a single window — such as the Viewer or the Event Browser — on the second monitor, while leaving the other windows on your primary monitor. Like previous versions, Final Cut Pro X relies on third-party devices to support external video monitoring. We’ve been working with third-party developers in our beta program to create drivers for Final Cut Pro X, and AJA has already posted beta drivers for its popular Kona card: http://www.aja.com/support/konaNEW/kona-3g.php.

Clicking that link takes you to a now-infamous document Aja provided last week talking about how Final Cut Pro X “supports” real-time monitoring. I put the word “supports” in incredibly sarcastic air-quotes for good reason.

The long story made short is that if you have a Kona board installed in your Mac Pro, and you have just a single graphics monitor attached, you can configure your Kona to act like a second graphics monitor. Then you can tell Final Cut Pro X to display an enlarged viewer on your second graphics monitor — which is actually displayed, thanks to Kona trickery, on your broadcast monitor: your BVM, your Cine-tal, your FSI, whatever you have.

Except what you’re seeing, when you do this little exercise, is not your actual frames. It’s an eight-bit, progressive-scan facsimile of your frames. As Aja say themselves in their how-to document, “the quality of the output produced during editorial should be considered preview quality.” Frankly, even calling it “preview quality” is a bridge too far. Working with interlaced material with the wrong field order? Tough. You’ll never see it. And that ugly-ass posterization you see in all your graphics? Yeah, it’s not really there. It’s an artifact of the eight-bit downconversion. And does your footage have the correct gamma curve applied to it? I don’t know. We’re talking about Aja here, so probably; those guys know what they’re doing. But we’re also talking about Apple, and those guys have conclusively demonstrated they don’t. So your guess is as good as mine.

Now, maybe that’s okay for you. Maybe you don’t deal with interlaced material (if you’re very, very lucky). Maybe you don’t work in more than eight bits per channel (if you’re stupid). If both of those are true, then you don’t really need more than what FCP X can give you right now. If they aren’t, then you’ll simply have to wait and hope for the best.

But there are two larger points to be considered here. The first is that Apple didn’t ship the software with support for broadcast monitoring. The product isn’t ready if you can’t see your pictures, guys. It’s incomplete. It’s not yet done. It’s a potential product at best. Apple doesn’t seem to get that. They seem to think monitoring is something most people don’t care about. And maybe they’re right. But the product has the word “Pro” in the name, which implies a level of commitment beyond “Eh, only the really high-end guys need that feature, it can wait indefinitely.”

Which brings us to that petition that’s been going around. I’ve been contacted by three of its authors — or at least three people claiming to have been authors of it, who really knows or cares what the truth is on that point — and I’ll repeat here what I said to each of them privately: It goes too far. The language is too whiny, too butthurt. “It’s not professional,” the petition says. “It’s prosumer-grade.” And then the petition demands that Final Cut Pro X should be “considered part of the iMovie family or labeled a ‘prosumer’ product.”

Guys, due respect, but it’s none of your fucking business what Apple calls their products. They can call it pumpkin pie if they want, and you can’t do anything about it. Branding and positioning are not open for public scrutiny, and making a lot of noise about how the program is or isn’t advertised just reinforces the idea that a bunch of insecure self-appointed “professionals” are feeling threatened by democratization and the lowering of barriers to entry.

Nobody gives a shit whether you consider yourself to be “professional” or not, or whether you think anybody else is “professional.” All anybody cares about is your craft. If you’ve got talent as a storyteller, great, awesome for you, shut up and edit. If you don’t and you’re just a computer nerd, then kindly fuck off back to your basements to play with your expensive toys.

If the industry as a whole had wanted to make a demand of Apple, they — we, I suppose — should’ve done so in a reasonable and fair way. If it’d been up to me, you know what I would’ve done? Just this: Apple, announce that Final Cut Pro 7 is end-of-lifed. Put it back on sale. Commit publicly to continue selling it, and providing direct support and compatibility and bug-fix updates for it, until July 1, 2013. That’s two years, rounded off a bit. Two years is an entirely reasonable time to continue supporting an end-of-lifed product that your customers depend on in their businesses. Come right out and say, bluntly, “We will not add any new features at all to Final Cut Pro. We’re giving it the minimum necessary attention for the minimum reasonable time.” We know how to cope with that. It’s been done a million times in this industry. It’s fine. And it means that anybody who uses FCP 7 now can continue to do so with confidence while they transition off of it. You can even turn it into a marketing opportunity: “We think Final Cut Pro X is the way to go. But we understand it’s not there yet. So consider it as your next-generation NLE of choice, as you make your transition off Final Cut Pro.”

And if you really wanted to be cool, Apple? I mean really revolutionary in the industry? You’d give away a copy of Final Cut Pro X to everybody who has a valid Final Cut Pro serial number. “We value the customers who’ve chosen Final Cut Pro. We want a chance to convince you that Final Cut Pro X should be on your radar. Here’s a code to download a copy for free on the App Store. Please let us know what you think.”

That’s how Apple could have turned this from a PR calamity into an industry-shaking triumph: By saying FCP X is a finished product and ready to go for some customers, and a preview of what’s to come for the rest. They’d have gotten good will in spades, not to mention recruited thousands of existing high-end Final Cut Pro users to be the world’s largest focus group.

If somebody, last Tuesday afternoon, had put that into a petition, I’d have signed it in a heartbeat.

And yes, I’m acutely aware of the fact that I didn’t do it myself. I know, I know, I’m an acknowledged genius with a once-in-a-generation mind, but even I’m not Superman.


Jun 28

A story as yet untold

I’m going to let you guys in on a little secret.

On Sunday, after spending a week on-and-off with Final Cut Pro X and pretty deep in a funk about it, I made a little thing. I did it because I was inspired in a small way, but I also did it because I had a copy of Adobe Premiere Pro CS 5.5 here, and I wanted to try it out.

The next day, yesterday, I wrote up some unstructured first impressions of the program. My overall conclusion: Meh. It was fine. Nothing exciting, but a capable alternative to FCP 7.

None of that is the secret yet. Here’s the secret:

Yesterday I was contacted by no fewer than three Adobe employees, including Steve Forde, Adobe’s senior product manager for After Effects. He sent me a particularly pleasant email, and asked if there was a convenient time when we could chat on the phone for a bit.

Flash-forward to lunchtime today, when my phone rang. It was Steve, and also Al Mooney, Adobe’s product manager for Premiere who blaaaaaghs.

I want to pause here for a moment.

I’m nobody. I’m not a famous personality, I don’t run a large post house, I’m not even a potential Adobe customer, because I already own their software. But just one day after I wrote about my ten-hours-old impressions on a random blog that nobody reads, I was on the phone with the people responsible for both of the products I wrote about.

That’s remarkable. In this day and age, when post production has been increasingly commoditized and, yes, democratized, to get a call from the people who make the product you’ve been trying out is astonishing.

Story isn’t over yet, though.

I talked with Steve and Al — who turned out to be just remarkably nice, funny guys — for just shy of an hour. We talked about some of the technical details of their products, including how Premiere handles rendered timelines and real-time I/O and how the Premiere-After Effects workflow is evolving. They asked me my opinions and, after being reassured that they really did want me to be candid and direct, I offered them.

At one point, I used the phrase “apocalyptically bad.” At another point I told them I’d rather chew off my own foot than use their software for a certain task.

And through it all, they laughed and joked and returned my candidness in kind. It was truly an astonishing conversation.

This afternoon I emailed them both and asked if it would be okay if I spoke publicly about … well, everything I just finished telling you. The way they reached out to me and talked with me, I said, was truly surprising, and I thought it deserved to be acknowledged publicly. I thought, to be blunt, they deserved some good PR for their efforts.

But that’s not really the whole reason why I wanted to talk about this. Don’t get me wrong; it’s true. I appreciated their time — and the time they’ve both committed to spending with me in the near future — and thought they should get something out of it. But really, what I want to talk about here is contrasts.

A few hours ago, visual effects legend Ron Brinkmann went on the record with some thoughts about the whole Apple-Final Cut Pro X debacle of the past week. In his blog post, he told a story about a meeting between Apple execs and Hollywood industry professionals about the recently acquired Shake. Ron described Steve Jobs, who was at the meeting, as telling the pros, “the relationship between them and Apple wasn’t going to be something where they’d be driving product direction anymore.”

That’s how Apple does business. The company does what the company wants, and damned be anyone who tries to tell them different.

And you know what? That’s fine. That’s great, even! We need that kind of singular vision in the world. And after a decade of great products and jaw-dropping commercial success, Apple’s certainly earned the right to tell everybody to get fucked.

But then again, sometimes it’s good to have a vendor who treats you like a partner, rather than a customer. Sometimes it’s good to work with somebody who listens more than they speak, somebody who gets where you’re coming from and wants to help you get where you want to go, rather than telling you where you should go.

Sometimes you need a leader in the industry to set the tone and show the way. And sometimes you really just kinda need a comrade in arms.

Apple is a visionary company, and they’re in the privileged position of being right nearly all the time. Adobe is, for all its girth in other verticals, hungry in the post industry. They’re not content to fight for third in the NLE market. They’ve got their eyes on first place. And they’re acutely aware that they don’t have what it takes to get there yet.

When Apple wants to do better, they lock a bunch of brilliant people in a room for two years.

Meanwhile, Adobe’s calling up random editors to talk to them one-on-one for an hour about how their products can be better.

It’s just an interesting contrast, is all I’m saying. It’s one worth paying a little attention to, if you ask me.

