What can interactive livestreaming mean for the future of live theatre?
Originally published at Arts Professional in September 2017.
Serendipity, Story Universes & Theatrical Reality
In October of 2016, I was invited to join Freedom Studios’ stable of associate artists as their first technologist-in-residence. I’d been a part of a team helping Bradford bid to host 2018’s Great Exhibition of the North, along with Freedom’s co-artistic director, Alex Chisholm. Alex and I shared an interest in science fiction and the impact of digital technology on art and culture; a little serendipity led us to a deeper collaboration. Our hope was to use my residency to explore the impact of various emerging technologies — virtual reality, augmented reality, storytelling platforms, open data — on theatre and on live performance.
Freedom’s then-upcoming production of Tajinder Singh Haver’s North Country — a post-apocalyptic tale set in a future Bradford recovering from a plague — offered a rich, timely and provocative story universe upon which we could assemble a few digital experiments.
North Country is essentially a series of direct addresses from three central characters to the audience, taking place across several decades and in various real locations. This story seemed readily re-mixable by space, by place and by perspective, offering ways to extend the theatre-going experience with new narrative technologies.
Our first experiment involved restaging a scene using a 360º camera from the point-of-view of an unseen, off-stage character. Providing this novel perspective on the story outside the live performance itself provoked some interesting questions, some of which we explored in a piece for Medium on Theatrical Realities.
Storycasting = Livestreaming + Interaction
As the company began to consider options for touring North Country, the possibility of restaging the play for a digital audience became increasingly compelling. Alex and filmmaker Zsolt Sandor began to speculate about capturing and streaming a live performance, using a service like Periscope.
One possibility included orchestrating a flash-mob, providing multiple perspectives and views to audiences watching on their smartphones; another involved a performance where each character’s perspective would be available as a discrete stream, with viewers able to follow and switch between characters at will.
Eventually, we settled on the latter, but with Alex and Zsolt, rather than the audience, retaining control of the narrative flow. This meant we could proceed without altering the actors’ performances or the script too much, and could retain the play’s narrative pace.
In contrast to NT Live’s streaming of theatrical performances, we would target the mobile audience of Facebook Live, building on Freedom’s own social media following. Actors would film and light themselves using smartphones; a director would switch between those phones from another device; we’d stream live via 4G or wifi to Facebook and our audience would join from their own phones or computers.
And so, our notion of Storycasting was born: architecting and directing a story from multiple livestreams, locations and time periods. In essence, the components of a story could be livestreamed, with others assembling them into a narrative — in this case the play’s directors, but possibly viewers in the future.
Location Scouting & Live Editing
North Country was originally staged as an immersive “theatre-in-the-round”, designed by Uzma Kazi in a recently abandoned Marks & Spencer. For our storycast to be similarly immersive, we chose a derelict building in the city’s Little Germany quarter; though fire-damaged, full of dead pigeons and lacking power, broadband and heat, we found it offered a range of ready-made dystopian looks!
With the lead time for a fixed-line broadband connection running to several weeks and no locally accessible wifi, we were reliant on mobile data for live streaming. Fortunately, local 3G and 4G signal coverage was strong and a quick test livestream to Facebook Live from various points in the building worked well.
Zsolt later sourced Switcher Studio, an app that turns iPhones and iPads into a live editing studio. Using a trio of iPhones as remote cameras, we fed their footage into an iPad, from where we could switch camera angles at will, livestreaming the combined footage to Facebook via wifi hotspot provided by a 4G phone.
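Under the hood, services like Facebook Live accept a single encoded video stream over RTMP(S); Switcher Studio handled the compositing and encoding for us, but the equivalent pipeline can be sketched with a tool like ffmpeg. This is an illustrative sketch only — the capture device syntax is macOS-specific, and the stream key is a placeholder you would obtain from Facebook’s Live Producer:

```shell
# Capture the default camera and microphone (macOS AVFoundation syntax),
# encode video as H.264 and audio as AAC, and push the result to
# Facebook Live's RTMPS ingest endpoint. STREAM_KEY is a placeholder.
ffmpeg -f avfoundation -framerate 30 -i "0:0" \
  -c:v libx264 -preset veryfast -b:v 2500k -g 60 \
  -c:a aac -b:a 128k -ar 44100 \
  -f flv "rtmps://live-api-s.facebook.com:443/rtmp/STREAM_KEY"
```

In our setup, the three iPhones fed the iPad over the local wifi hotspot, and only the iPad performed this final encode-and-push step over 4G — which is why the upstream bandwidth of a single phone’s connection was enough.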
Each character would deliver their dialogue straight to the front-facing camera of their phone. Not only could we switch between actors, we could also select either the front or rear camera of each phone, giving us the possibility of showing a character’s point-of-view.
Complications & Choreography
We planned to move the actors up through the building’s five storeys as the story progressed, but our 4G wifi hotspot would only work if all other devices were on the same floor, so our crew would have to follow the actors throughout the location, being careful to stay quiet and out of shot of all three cameras.
By default, Switcher Studio’s remote iPhone camera feature could only capture footage in landscape mode and not portrait, making the phones more awkward for actors to hold for long periods in one hand. With each phone running continuously for over an hour, they also needed a portable USB battery attached, further weighing down the phones.
Our “switcher” iPad seemed to be the only device recording sound — the remote iPhone cameras weren’t sending their audio to the switcher to be mixed. The app’s developers told us this capability was in development; in the meantime, we would need to mic each actor and mix the output through a separate audio interface (an iRig PRE) feeding into our “switcher” device. As such, we placed radio mics on each actor and had Zsolt monitor sound levels as he followed the actors and crew through the building.
At several points in the story, when all three actors were required in the same shot, an additional crew member operated the camera, with some very clever and rapid hand-signalling from each actor to tell the camera operator when they were about to speak! Natalie Davies’ character also had two very quick costume changes to make in between shots of the other two characters.
Though our early challenges were largely technical, the larger challenge was one of choreography: smooth camera transitions between characters, keeping the crew out of shot, moving through the location — and doing everything live! Fortunately, several complex rehearsals, along with a brave cast and crew, meant the final performance went almost without a hitch (one crew member was briefly glimpsed in the final shot!).
Story Components & Streaming
Though “North Country” was uniquely suited to this form of delivery, the central question that North Country Live raised was whether, by streaming all the component elements of a performance (characters, places, time periods), we could allow viewers to reassemble them on their own terms, in a kind of participatory livestreaming.
We’re already starting to imagine a future iteration of North Country, where viewers can select viewpoints, following characters, locations and timelines of their choosing, perhaps even contributing and curating their own story elements. Indeed, we’re considering the possibility of a ‘storycasting’ kit for performers to adapt existing works or perform new pieces.
We believe this kind of storycasting — livestreaming the architecture and universe of a story’s content — offers some compelling paths for innovative storytelling in theatre, beyond simple streaming, and perhaps towards a more open-source remixing of live theatre.