Notes by Humberto Rodriguez, Technical Director at Signal Space Lab
When we began creating cinematographic experiences for virtual reality, we saw a fundamental flaw in the solutions available on the market for introducing action and motion to room-sized scenes. Compared to outdoor VR footage, most VR content shot in room-sized spaces sets the camera statically in the middle of the room. A static setup helps production teams avoid stitching challenges in post-production, but it also limits the content to a single camera placement. In the experiences that do feature camera motion in room-sized environments, for example Gone and The People’s House, the movement relies on dollies, which constrains the camera’s range: it cannot fly over objects anchored to the floor, such as tables, chairs, and almost every other piece of furniture in a home.
With Afterlife, our goal was to overcome this limitation and achieve a more dynamic sense of movement. With this in mind, we set out to find a solution.
We knew from the beginning that our best bet would be to adapt technology already in development to our advantage. Our research turned up inspiring projects, such as the Hangprinter, that we drew on to build our own automated camera-motion system specialized for VR filmmaking: the Crane.
A new solution for camera motion on a VR filmmaking set
Signal Space Lab’s Crane is controlled by three basic systems, each with a specific task, that together allow six-degrees-of-freedom (6DoF) camera movement: a motion system that moves the camera, a positional system that retrieves the camera’s position in space in real time, and a logging system that records the desired path of movement from a separate remote controller.
Throughout the development of the Crane, we wanted to give the camera the ability to cover most of the room with precise, reliable movement. To achieve this, we added a positional system to track the camera and mounted four anchors to the walls of the room, with strings holding the camera in mid-air. The strings are fed through the anchors according to the predetermined camera path. The anchors are controlled by the motion system, which is in constant communication with the positional system: the positional system feeds the motion system the camera’s position coordinates, and the motion system compares the retrieved data with the planned path and corrects the camera’s course accordingly. Because the camera path is pre-defined, we now have a tool on a VR set that makes it easy to record multiple takes of a continuous performance.
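The correction loop described above can be sketched roughly as follows. This is a minimal illustration, not the Crane's actual control code: the names (`Vec3`, `correction`, the gain value) are hypothetical, and a real string-driven rig would also translate the corrected position into per-anchor string lengths.

```python
# Illustrative sketch of the Crane's feedback loop: the positional system
# reports a measured camera position, the motion system compares it with the
# planned point on the pre-defined path and nudges the camera toward it.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __sub__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def scaled(self, k: float) -> "Vec3":
        return Vec3(self.x * k, self.y * k, self.z * k)

def correction(planned: Vec3, measured: Vec3, gain: float = 0.5) -> Vec3:
    """Proportional correction: close a fraction of the position error
    each control cycle, so the camera converges on the planned path
    without overshooting."""
    error = planned - measured
    return error.scaled(gain)

# One control cycle: the camera has sagged 0.2 m below its planned point,
# so the motion system commands a small upward move.
planned = Vec3(1.0, 2.0, 1.5)
measured = Vec3(1.0, 1.8, 1.5)
delta = correction(planned, measured)  # Vec3(0.0, 0.1, 0.0)
```

A simple proportional gain like this trades speed of correction for smoothness, which matters on a filming rig where a visible jerk would ruin the take.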
After exploring how these systems work together, we found that the optimal interface for our director, Luisa Valencia, was to let her physically move the remote controller rather than transcribe coordinates into the crane system. With the remote controller, the director can determine both the acceleration and the path of the desired camera movement. In a way, we are giving directors a tool to draw their vision in 360 live-action.
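Logging a path from a hand-moved controller amounts to sampling timestamped positions; because the samples carry timing, the director's pacing (acceleration as well as path) is captured and can be replayed identically for every take. The sketch below shows this idea with hypothetical data; it is not the Crane's actual logging format.

```python
# Timestamped samples from the remote controller: (time_s, (x, y, z)) pairs.
# The timing gaps between samples encode how fast the director moved the
# controller, so replaying the same samples reproduces her pacing.
import math

def segment_speeds(samples):
    """Speed (units/s) along each segment between consecutive samples."""
    speeds = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        distance = math.dist(p0, p1)
        speeds.append(distance / (t1 - t0))
    return speeds

# A short example path: the second segment covers the same distance in
# half the time, i.e. the director accelerated.
path = [
    (0.0, (0.0, 1.0, 0.0)),
    (1.0, (0.5, 1.0, 0.0)),
    (1.5, (0.5, 1.5, 0.0)),
]
speeds = segment_speeds(path)  # [0.5, 1.0]
```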
Designed specifically for Afterlife’s production demands, the Crane is the first version of a scalable production tool for filmmakers in the cinematographic VR space.