
FCP X and the Future of Editing

Editing is older than motion pictures. The ordering and pacing of dialogues, scenes, entrances and exits to build conflict and resolution have long defined Western theater, from Aeschylus’s Oresteia to Wagner’s The Ring of the Nibelung [Der Ring des Nibelungen]. It was the insertion of first-person thoughts into dialogue and plot that modernized 18th- and 19th-century novels, and the clever sequencing of mechanically animated magic lantern glass slides that thrilled Victorian audiences with popular epics like Ben-Hur.

Nevertheless, as Walter Murch likes to point out, film editing was invented 14 years after motion pictures. Uncut reels of onrushing locomotives, sneezing and kissing were gripping, profitable entertainments, such were the frisson and novelty of realistic moving images. (Vitascope was the YouTube of its day in this regard.)

It would take Georges Méliès’s camera jamming and restarting on a Parisian street in 1896, and a similar mishap at a 1901 horse race in Bristol, England, to reveal, respectively, the trick of editing within shots (effects) and between shots (cutting). Disruption of time, place and point of view through film editing would soon yield a new dramatic art: Cinema. It could as easily have been called Cubism.

As film syntax matured, technique formed and a profession emerged. A century on celluloid carried us from scissors and glue to guillotine tape splicers and upright Moviolas, then flatbeds. Video brought “big iron” linear editing systems for online, then finally PC-based non-linear editing systems (NLEs) — all mostly operated by professionals, at least through the late 1990s when hardware-based Avids still cost tens of thousands of dollars.

What cracked the door to editing for “the rest of us” was the introduction of software-based Final Cut Pro 1.0 at NAB in 1999, coincident with the arrival of FireWire-enabled DV camcorders. All dismissed as amateur, naturally.

We’ve seen how that turned out. Digital democratization spread inexorably. It’s not hard to draw straight lines from FCP + DV to HDV and DVCPRO HD, to the rise of small camcorders, Internet streaming, cheap SD cards to record on, RED usurping film and cheap HDSLRs usurping RED. (With a regrettable drop in pay rates along the way.)

As a consequence, there are exponentially more people, professional and nonprofessional alike, of all ages, in all countries, now creating, editing and distributing digital movies. Everyone with a point-and-shoot or smartphone in their pocket is a potential HD source. (Today’s equivalent of a Kodak Brownie: “You push the button, we do the rest.” Insert irony here.)

And yet, though we’ve embraced digital code as the motion picture medium of our time, the technology of nonlinear editing remains very much a work in progress.

For one thing, picture, sound, music and effects continue to invite the forces of invention. For instance, we now need torrents of “metadata” — data about data — simply to keep track of everything. Not only during editing, but to manage future access and archiving.

For another, nonprofessionals — whatever this distinction signifies in an information age rife with underemployment — have vaulted forward in technical savvy and technique, thanks to the explosion of shared knowledge on the Internet, plus the extensive capabilities of their low-cost digital tools.

As an NLE designer today, where would you draw the line between professional and nonprofessional? Which features would you include or deny? Wouldn’t you wish to meet the high-end needs of the workplace, yet attract that vast center of the bell curve of potential users? If only to carve out the largest possible market share?

As it happens, this past year ushered in a crop of powerful, affordable, newly 64-bit professional NLEs, including Final Cut Pro X, Avid Media Composer 6.5, Adobe Premiere Pro CS6, Sony Vegas Pro 12, Grass Valley Edius Pro 6.5 and the resurrected Lightworks. (Media 100 Suite remains 32-bit. Another dozen Windows-based NLEs exist for under $100.)

64-bit architecture introduces dramatically faster importing, transcoding, rendering and output. It demolishes the old 32-bit performance barrier of 4GB RAM, replacing it with a theoretical 17 billion GB, just in time to meet the coming decade’s demand for rock-solid stability, instant timeline loading and flawless playback of real-time effects in H.265 (twice as efficient as H.264), 3D, 4K and beyond.
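For the curious, the arithmetic behind those two figures is simply the size of the address space:

$$2^{32}\ \text{bytes} = 4\ \text{GB}, \qquad 2^{64}\ \text{bytes} = 2^{34}\ \text{GB} \approx 17\ \text{billion GB}.$$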

Details vary, but on the whole these NLEs offer the latest camera codecs; codecs for proxy editing and finishing; timelines that accept mixed codecs, resolutions and frame rates; motion effects; image stabilization; primary and secondary color correction; audio mixing; effects plugins; sophisticated titling; support for third-party hardware (Matrox, AJA, Blackmagic, MOTU, Bluefish); support for multicamera editing; support for stereoscopic 3D editing; extensive metadata tagging of clips; media management across myriad drives and sources; output compressions; and project/timeline interchange with other apps, NLEs and audio editing programs.

All you need is a credit card, not a guild card. What’s not to like?

What these NLEs don’t aspire to, with the exception of FCP X, is evolution. Despite constant churn in the technology of creation and consumption of digital moving images — viewing now entails phones, tablets, laptops, TVs, cinemas — editing hardware remains tied to a mouse-driven desktop environment conceived decades ago.

On the software side, why perpetuate dual source/record windows from 1970s tape editing, or interface metaphors adapted in the 1980s from film editing (Avid Media Composer), or 1990s timeline design (Final Cut Pro 7)? Why not exploit this 64-bit great leap forward in speed and processing to rethink, perhaps even reinvent, editing for the coming file-based century?

In introducing its mobile operating system, iOS, five years ago, Apple seized the opportunity to innovate: new file systems (hidden), control interfaces (touchscreen), gestures (multitouch), screen displays (full), app switching (fast) with saved states (flash memory), Internet upgrades (App Store), and voice commands (Siri). And let’s not forget erasing pixels (Retina display).

iOS is an offshoot of OS X: still 32-bit, but, like OS X, written in Objective-C. Both operating systems possess a layer-cake architecture with a dedicated media layer that contains graphics, audio and video “frameworks” such as Core Animation (fluid icons, controls that fade), Core Audio and Core Media. Frameworks are collections of functions that can be shared by different apps in a modular fashion, without having to be rewritten each time. With iOS 4 (2010) and OS X 10.7 Lion (2011), the media layer of each OS gained a new framework, AV Foundation — the engine of FCP X.
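For readers who also write code, here is a minimal sketch of what calling such a framework looks like. AVFoundation and the classes used below are the real framework and API; the file path is a hypothetical placeholder, and the snippet is written in present-day Swift rather than the Objective-C of the era.

```swift
import AVFoundation

// Open a movie file through AV Foundation, the same media-layer framework
// FCP X is built on. The path is a made-up placeholder.
let url = URL(fileURLWithPath: "/Movies/interview_take3.mov")
let asset = AVURLAsset(url: url)

// Durations and timestamps are rational CMTime values, part of what gives
// AV Foundation its subframe timing accuracy.
print("Duration: \(CMTimeGetSeconds(asset.duration)) seconds")

// Every video or audio track in the file is exposed as an AVAssetTrack.
for track in asset.tracks(withMediaType: .video) {
    print("Video: \(track.naturalSize.width)x\(track.naturalSize.height), \(track.nominalFrameRate) fps")
}
```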

A big advantage of conjoined operating systems is that user-interface breakthroughs on mobile devices such as the iPad can readily migrate to Mac apps like FCP X — for instance, use of animation, multitouch, auto-saving, full screen display, Retina display, integration with flash architecture — all of which in turn optimize FCP X for use on portable MacBook Pros with trackpads. On the latest MacBook Pro with Retina display, for example, you can view full 1080p in FCP X’s small Viewer window.

Of particular significance: the 64-bit AV Foundation found in OS X supplants the now-legacy 32-bit QuickTime framework (video files will continue to sport QuickTime extensions). AV Foundation brings, at last, multi-core and GPU-assisted speed to Final Cut Pro rendering tasks (using OS X’s Grand Central Dispatch and OpenCL), as well as full color management from input to output and finer time accuracy for subframe events.
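A taste of what Grand Central Dispatch contributes: the single call below fans a render loop out across every available CPU core. The renderFrame function and frame count here are hypothetical stand-ins for illustration, not FCP X’s actual render path.

```swift
import Foundation

// Hypothetical per-frame work; a real renderer would decode, composite
// and encode a frame here.
func renderFrame(_ index: Int) {
    // ...
}

let frameCount = 240
// Grand Central Dispatch runs the iterations in parallel across all cores.
DispatchQueue.concurrentPerform(iterations: frameCount) { frameIndex in
    renderFrame(frameIndex)
}
```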

Of course the broad gibe against FCP X at its introduction in June 2011 was that it represented nothing more than a pro version of iMovie, which, not surprisingly, also relies on AV Foundation.

Apple senior vice president of Industrial Design Jony Ive is a devotee of German designer Dieter Rams’s “Less, but better” philosophy, evident in all Apple products. FCP X chief architect Randy Ubillos was the creator of the first three versions of Adobe Premiere in the early 1990s, while senior product manager Steve Bayes, a working editor for years, was once Avid’s principal product designer for Media Composer, Symphony and DS Nitris. He also wrote the essential The Avid Handbook. As futurists endeavoring to envision the shape of tomorrow’s pro editing, they’re not exactly chopped liver.

So why the virulent public protest?

In addition to incorporating OS innovations and building out extensive control of metadata and media management, the FCP X team sought to directly address several prominent trends in production: Digital cameras generate far more footage than film cameras ever did, all of which must be readily reviewable and searchable. Multiple cameras are now common and often wild (no sync). Democratization encourages many to edit regardless of experience; at the same time, audiences expect perfectly finished quality regardless of budget.

FCP X’s solutions, in order: fast Skimming with pitch-corrected audio, Keywords & Smart Collections, Multicam (introduced in January in FCP X’s third upgrade in a year), and a friendlier, less cluttered interface for those with less experience, with deep controls located just below the surface for experienced editors.

The uncluttered interface is key to understanding how radically innovative FCP X truly is. Conventional timelines resemble orchestral scores, with dozens of staffs representing myriad instruments and sections, each charted across time. In a conventional NLE timeline, video and audio tracks can similarly number in the dozens, overflowing even the largest display. In many cases, these tracks are mostly empty, containing only a handful of clips. The result, arguably, is a massive waste of precious screen real estate.

FCP X has no tracks. It adopts a different metaphor, one that Aeschylus would recognize. Instead of a timeline with tracks above and below, FCP X provides a single “primary storyline” that serves as a narrative spine, with a beginning, middle and end. Individual clips are “connected” at points along the storyline, floating just above it (video) or just below it (audio). A complex stack or sequence of clips can be collapsed and nested into a simple “compound clip” that can be edited like a single clip or temporarily reopened into its own storyline for internal editing.

Sync relationships are preserved by a “Magnetic Timeline.” Since clips and compound clips are attached to points along the storyline, it’s impossible to knock them out of sync in the course of inserting or deleting other clips. If two clips happen to collide in the course of an edit, one slips above or below the other (literally, using animation), preserving the relationship of both clips to the storyline.
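One way to picture the connected-clip idea is with a tiny data model. This is a hypothetical illustration of the concept only, not Apple’s actual implementation; the type and field names are invented.

```swift
import CoreMedia

// Hypothetical sketch of connected clips. Each clip is anchored to a time
// on the primary storyline, so inserting or rippling storyline clips moves
// the anchors with them and nothing can fall out of sync.
struct ConnectedClip {
    let name: String
    let anchorTime: CMTime   // where it attaches along the primary storyline
    let lane: Int            // positive lanes float above (video), negative below (audio)
}

struct PrimaryStoryline {
    var storyClips: [String]            // the narrative spine, in order
    var connectedClips: [ConnectedClip] // everything floating above or below
}
```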

The editor, free from worry about accidentally knocking clips or complex sequences out of sync, can playfully shuffle clips and sequences, focusing entirely on story structure.

Dispensing with the clutter of conventional tracks also favors use of FCP X on mobile devices and compact laptops with smaller screens — a clear nod to the future.

When 64-bit FCP X displaced 32-bit FCP 7, which was summarily discontinued, there was no question that pro editors whose livelihoods depended on FCP 7 were deeply shaken. But many experienced editors groused because, I believe, FCP X was strangely unfamiliar territory. Others faulted FCP X for features that were, at first, missing, instead of lauding innovations like the Magnetic Timeline and the fact that, with OS X 10.7 Lion’s autosaving and Resume, a power loss or unlikely crash no longer means loss of work. FCP X projects reopen exactly where they left off, like magic.

Rome wasn’t built in a day and neither was what became FCP 7 (10 years). In the course of FCP X’s first year, five free upgrades have arrived via the App Store (no need to change out of your bathrobe), including support for Multicam, XML, media relinking and broadcast output. Features to arrive later this year include multichannel audio editing tools, dual viewers and support for MXF plug-ins and RED. And then there are useful OS features like voice dictation, which arrived in July with OS X 10.8 Mountain Lion. Press the Fn key twice and you never have to type an event label or clip description in FCP X again. Just speak.

Concision, after all, is the soul of editing, and the film-editing project begun a century ago with scissors and glue may yet reclaim its own simplicity. What we really want, if we’re unashamedly honest, is facility of editing with the ease of dreaming. To say, “wouldn’t it be cool if…” and see instant results on the screen. Maybe someday we can tell Siri where to make that cut and how long to extend that dissolve. “Change the tint, add vignetting, a little more saturation…”

With FCP X, we’re taking the first baby steps in that direction.

Final note: I cut a 17-minute documentary one evening last summer (footage I shot) using FCP X on a 17-inch MacBook Pro with an internal 500GB SSD and fast G-Tech 8TB RAID with Thunderbolt. I loaded files, reviewed footage, cut picture, sound, music, added titles and credits, and finished in 11 hours straight. There was no initial rough cut, then fine cut. I edited carefully along the way, with utmost precision. The finished results were projected before an audience the following morning. I couldn’t have pulled this off using pokey old FCP 7. In other words, if this is the future of editing, I’m loving it.

Editor’s note: On October 23, after the publication of this article in our Fall 2012 issue, Apple updated Final Cut Pro X with version 10.0.6. The following features were among those added:

* Multichannel audio editing tools
* Dual viewers
* MXF plug-in support
* RED camera support
* Chapter markers in the timeline
* Multiple range selections
* Flexible Clip Connections
* Freeze frame
* Drop shadow effects
* XML 1.2

For a complete list of new features, visit Apple.
