Tribeca 2016: Making Guy Maddin’s Seances Interactive
Watching Guy Maddin and Evan Johnson’s latest work Seances feels both familiar and utterly strange. Born from the knowledge that over 80% of silent movies have been lost, Maddin and his collaborators at the NFB wanted to resurrect as many titles — both real and invented — as possible: first in 2012 in production sessions at the Centre Pompidou in Paris and the Phi Centre in Montreal that were open to the public, then last year in the feature film The Forbidden Room, and now in an interactive version called Seances that premiered at the Tribeca Film Festival’s Storyscapes event and is currently available online. The original footage — reportedly 4,000 hours of rushes pared down to 1,000 hours of finished product — evoked silent era adventure films in its original production (check out Scott Macaulay’s interview with DP Ben Kasulke), and, particularly in the digital decay effects that were added onto it afterwards, felt like a Bill Morrison project concocted from the forgotten vaults of defunct cinematheques.
The individual scenarios the team produced are short-film length, and The Forbidden Room is structured as an intricate Russian doll, with layers and layers of vignettes embedded within each other to tell, more or less, a coherent story, although the feeling of getting a glimpse into resurrected films is much more important than any narrative clarity. With Seances, the finished film is generally between fifteen and twenty minutes long and the images are privileged over story to a much greater degree: viewers are, in essence, summoning the spirits of films past, and while we get glimpses of partial narratives blending into each other there’s no pretense of comprehensibility. Seances is meant entirely to evoke a mood rather than tell a story.
At Storyscapes about twelve viewers entered a small black theater and gathered around a large horizontal touchscreen monitor — a digital cross between a ouija board and a crystal ball — where thumbnail images from different scenarios swirled around in a circle. By swiping images away viewers could remove clips they didn’t want to watch, and by touching and holding them they could select clips to include. When enough clips had been selected to comprise a finished film, viewers ended their digital scrying, took their seats, and watched what would be conjured up. My own screening was titled The Story of the Turgid Spades, and featured scenes — some of which I recalled from The Forbidden Room — such as a squid thief fleeing from a volcanic island, a submarine crew eating pancakes, and a female inmate of an apparent mental institution telling her life story to a young girl. The clips aren’t just played sequentially; rather, complex software from Nickel Media combines them in unique ways with real-time rendering. The result is that every film created is utterly unique and, in the spirit of a seance, is neither recorded nor ever created again: once it has finished it is once again lost, and over 2,500 such short films have already come and gone. If the real-time compositing weren’t impressive enough, Nickel Media has also created a way for this to be done in the cloud, allowing an online version of Seances to potentially create thousands of new films simultaneously for individual viewers around the globe.
To find out more about the planning and the technology that went into Seances’ interactivity, I spoke individually with the project’s co-director Evan Johnson and with Nickel Media founder Jason Nickel, who told me about their collaboration and the workflow that went into turning a series of linear film clips into an indefatigable fountain of new work.
Seances will next show at the Phi Centre in Montreal, and the online version is available here.
Filmmaker: The idea to resurrect these old lost films in an interactive format was baked into the project’s DNA from the beginning, but why was it conceived in that way? What did it add beyond a linear version, like we saw in The Forbidden Room?
Johnson: From the beginning, our interest was in learning what kinds of surprising effects — narrative, visual, aural, and any other — could be produced by the accidental juxtapositions of a partially “randomly” assembled film. This was done with a naive faith that whatever ludicrous or unpleasant or fortuitous collisions occurred in the resulting film, a narrative would remain intact, provided there was some kind of audience there to receive it; and this faith is probably a faith in the basic logic of film montage and the way it almost automatically produces meaning through the succession and accumulation of shots in a viewer’s brain.
Filmmaker: Once you’d shot the footage, the first step, as I understand it, was for you, Galen Johnson, and editor John Gurdebeke to manipulate (decay) it visually in a few dozen interesting ways. What was that process like?
Johnson: The manipulation of all the raw footage was, I suppose, at first just a result of the panic and self-loathing that comes from viewing one’s underwhelming footage after a disorganized shoot. But it quickly grew into something more interesting, in that it became another way to produce accidental, possibly surprising effects, this time in the form of visual “errors” both digital and analogue. Because Galen and I “decayed” ALL of the raw footage, not just the material that made it into fine cuts, the end result was a kind of palimpsest of accidents — accidents (boom mics, flubbed lines) in the footage as we shot it, accidents in the degrading of it (light leaks, dirt, stains, burns), and accidents in the random assembling of it. I should add that though editor John Gurdebeke loves to use accidents and throwaway shots in his edits, he is an almost unbelievably precise and detailed editor. Similarly, Galen and I are fairly obsessive about colour and texture, and Guy is a secret control freak when it comes to lighting.
Filmmaker: So once the footage was all prepared, what was your workflow and creative relationship with Nickel Media like as you introduced the interactivity?
Johnson: Well, first of all, the interactive concepts and the metaphors that guided the way they would be implemented were developed by us with the National Film Board [of Canada], particularly with our producers Dana Dansereau and Alicia Smith. Once it actually came time to build the mechanism that would do all the work [assembling our footage into randomly generated films], it was a matter of getting the content to Nickel Media, and the whole thing, from our end, was almost absurdly convenient, in that basically everything we wanted they were able to find a way to do — without fuss, I might add. Given that we sometimes asked some pretty strange things of them, it’s amazing how quickly and effectively they were able to invent solutions to problems that had never existed until we’d contrived them for this project.
Filmmaker: The interactivity — and the novelty of each film it produces — lies in the ordering, juxtaposition, and transitions between segments. How did you go about ironing this out so that each complete film flowed organically, or did that part of it come together relatively easily?
Johnson: In some ways this was a matter of all the filmed material simply “feeling” like it comes from the same world — a murky, silly, melancholy, sexually anxious, self-destructive world full of malfunctioning characters, bad ventilation, and disintegrating plots — so that when it comes time to transition from one story to another, there isn’t a huge disruption in the atmosphere. Of course, not every randomly generated experience will flow equally well: some will feel clunky and confused, some will seem to start somewhere and end up nowhere, and others will feel entirely like a bathos-producing accumulation of endings. Having said all that wishy-washy stuff, getting it to work was a matter of trial and error — good old-fashioned hard work!
Filmmaker: Can you describe Imposium? I’m interested in how it functions in the cloud using the Amazon AWS architecture, as well as how the compositing software itself works. The real-time rendering is quite remarkable. What projects have you used it on previously?
Nickel: Imposium is a dynamic video-rendering system that lets you code a story algorithmically. It pulls data from different sources to create a compelling story. For Seances we treated all the short films they shot as the primary data source and plugged that content into a dynamic formula to figure out which sequences to put together. Imposium can fuse video content with content coming from any data source — social media, user inputs, or product catalogs — to compile videos that are more personal for the viewer.
For Seances, users could select clips and watch those choices feed into the changing film title on screen; their selections then shaped which film was delivered back to them. In this project, an indirect set of logic built each of the films.
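That kind of “indirect” logic can be sketched in a few lines. The Python below is a hypothetical illustration, not Imposium’s actual code: the clip names, moods, durations, and weighting scheme are all invented. The idea is that user picks don’t map one-to-one into the output; they bias a weighted random draw over the whole library, so the same selections can still yield a different film each time.

```python
import random

# Hypothetical sketch of indirect clip selection; clip names, moods,
# and the triple-weight rule are invented for illustration.
CLIP_LIBRARY = {
    "squid_thief":   {"mood": "adventure",  "duration": 95},
    "submarine":     {"mood": "adventure",  "duration": 120},
    "asylum_story":  {"mood": "melancholy", "duration": 140},
    "volcano":       {"mood": "adventure",  "duration": 80},
    "seance_parlor": {"mood": "melancholy", "duration": 60},
}

def assemble_film(selected, target_seconds=300, seed=None):
    """Build a clip sequence; the user's picks bias, but don't dictate, the draw."""
    rng = random.Random(seed)
    favored = {CLIP_LIBRARY[c]["mood"] for c in selected}
    pool = list(CLIP_LIBRARY)
    sequence, total = [], 0
    while pool and total < target_seconds:
        # Clips sharing a mood with the user's picks get triple weight,
        # so selections influence the film without determining it.
        weights = [3 if CLIP_LIBRARY[c]["mood"] in favored else 1 for c in pool]
        choice = rng.choices(pool, weights=weights, k=1)[0]
        pool.remove(choice)
        sequence.append(choice)
        total += CLIP_LIBRARY[choice]["duration"]
    return sequence
```

Run without a fixed seed, two screenings with identical selections will almost never produce the same sequence, echoing the one-time, unrepeatable films described above.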
The idea to build server-side video rendering technology came from our success with a dynamic video called Take This Lollipop. It was the first personalized horror video and it ended up winning an Emmy for its innovation. Since then, we have been evolving this method of personalized immersion and built a robust system that gives us the power to pull any data and render any video we can imagine. It’s still a fairly new technology, but in the short time we’ve used it we’ve built ad campaigns for Activision, Google, and most recently LetGo, where we let people create Hollywood-style commercials for any item they are selling online.
Filmmaker: So in what way was Seances unique? What was the development process — including any unexpected challenges or triumphs — like?
Nickel: Our team was approached by the NFB about a year and a half ago. The original direction was to deliver the video sequencing in the browser, but that posed technical challenges, especially on iPhones, where browser control over video playback is very limited. We had been working on the Imposium platform at the time, so we knew it wouldn’t be a problem to handle this, as our system renders real videos which are playable on all devices at full 1080p quality.
With Imposium already built, we were able to focus more time on the creative rather than worry about technical limitations. This allowed us to play with different rendering effects like datamoshing, where YouTube videos would appear and move the film’s pixels across the screen in an unexpected, surreal way. By having the power of these effects at our fingertips, the creatives were able to come up with technical concepts that were in line with their vision of conjuring videos and the fleeting nature of film preservation.
The biggest challenge we had was managing all of this amazing content. With all of the short films Guy, Evan, and Galen provided, we had to have a system that could keep things organized and preview the output so we could tweak the storytelling formula based on what results we were seeing.
Filmmaker: Can you explain a bit more about how the cloud hosting process works? How do you go about harnessing over 500 virtual machines, and how much video could they actually render at any given time?
Nickel: The Imposium video rendering system was built as a fully scalable solution in the cloud, so whatever number of videos is being rendered at one time, we can assign the proper number of servers to accommodate that load. The largest number of videos we have rendered in one day was one million, which we did for Activision’s game Destiny, so we know Imposium has been load-tested to handle as many renders as we can throw at it. We did a project for the Super Bowl and had 5,000 servers running at once to make sure we could handle the demand. We’re continuously making optimizations, rendering faster and requiring fewer machines than ever.
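The arithmetic behind that kind of fleet sizing is simple to sketch. The figures and the one-render-at-a-time-per-server assumption below are illustrative, not Nickel Media’s actual numbers:

```python
import math

# Back-of-the-envelope capacity planning for a render fleet.
# All numbers here are illustrative assumptions.
def servers_needed(renders_per_day, seconds_per_render, jobs_per_server=1):
    """Estimate how many parallel servers cover a day's render demand."""
    seconds_in_day = 24 * 60 * 60
    # How many renders one server can complete in a day.
    capacity_per_server = (seconds_in_day / seconds_per_render) * jobs_per_server
    return math.ceil(renders_per_day / capacity_per_server)

# One million renders a day at 60 seconds per render:
print(servers_needed(1_000_000, 60))  # -> 695
```

In a real cloud deployment the fleet would be scaled up and down against the live queue rather than provisioned from a daily total, but the estimate shows why a peak event like a Super Bowl campaign can demand thousands of machines at once.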
Filmmaker: Since you’ve created this software for video/audio compositing and compilation, I’m curious if you have any advice or other thoughts for filmmakers about how they can best design interactive projects and particularly work well with coders.
Nickel: Imposium was designed to give filmmakers the ability to think beyond linear storytelling, but in a very powerful way. We’re discovering a new way of reaching viewers by creating something specifically for them, or their interests. So when a user watches your film, it can now be a unique experience hyper-targeted to that user’s interests, and they can just sit back and enjoy it without having to overthink the engagement. The story they get can now be programmed with an algorithm that picks out the most relevant content for them. With the vast content and open-ended storytelling method used in Seances, we were able to put together a film that was conjured for each individual person.
Filmmakers working with Imposium could, for example, shoot mini-stories that work together based on story, content, or style. From there, the footage would be tagged with data so we would know what to pull in when writing the algorithm, which decides what clips to put together based on a unique set of rules that drives the narrative. The system is a little like a programmatic version of video editing software, but with an open-ended sense of direction unlike anything else out there.
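One way to picture tagged footage driven by a rule set is to treat the rules as allowed tag transitions between consecutive clips. The sketch below is hypothetical: the tag vocabulary, the (from_tag, to_tag) rule format, and the clip names are invented for illustration and are not Imposium’s actual schema.

```python
from itertools import permutations

# Hypothetical tagged clip library; ids and tags are invented.
CLIPS = [
    {"id": "intro_a",   "tags": {"opening"}},
    {"id": "chase",     "tags": {"middle", "adventure"}},
    {"id": "flashback", "tags": {"middle", "melancholy"}},
    {"id": "finale",    "tags": {"ending"}},
]

# Each rule says: a clip carrying the first tag may be followed
# by a clip carrying the second tag.
RULES = {("opening", "middle"), ("middle", "middle"), ("middle", "ending")}

def valid_orderings(clips, rules):
    """Enumerate clip orders whose consecutive tag pairs all satisfy a rule."""
    results = []
    for perm in permutations(clips):
        ok = all(
            any((a, b) in rules
                for a in perm[i]["tags"]
                for b in perm[i + 1]["tags"])
            for i in range(len(perm) - 1)
        )
        if ok:
            results.append([c["id"] for c in perm])
    return results
```

With this toy library, only orderings that open on "intro_a" and close on "finale" survive the rules; a production system would sample one valid ordering per viewer rather than enumerate them all.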
Expanding on that, we will continue working with filmmakers to write stories that will be hyper-targeted for audiences based on melding content with formulaic algorithms to deliver compelling stories. To put it another way, we make cool stuff.