The Leviathan Project was a VR/AR research project I worked on as the Senior Research Associate at the USC World Building Media Lab, alongside students and Lab Director Alex McDowell. The research was inspired by Scott Westerfeld’s steampunk novel Leviathan (Simon and Schuster, 2009), a fantastic WWI adventure aboard a flying whale in a world of genetically fabricated beasts. The following video gives a brief intro to the research as it stood in early 2014:
Our main sponsor Intel was interested in exploring the future of storytelling within the emerging mediums of VR and AR: what kinds of stories did artists desire to tell, and what kind of technology was required to realize these storyworlds? We utilized World Building as the primary design methodology for creating stories. World Building designates a narrative practice in which the design of a world precedes the telling of a story; the richly detailed world becomes a container for narrative, producing stories that emerge logically and organically from its well-designed core.
During the spring and summer of 2013 I helped a team of students experiment with creating interactive narratives inside VR where users could walk around the Leviathan ship and interact with virtual crew members playing out a multi-threaded narrative. To achieve this we used the Unity game engine and integrated a Phasespace motion capture system and head mounted displays created at the USC MxR Lab.
(Custom head mounted display made at the USC MxR Lab and tracked by Phasespace.)
During the summer of 2013 the team experimented with integrating an Augmented Reality experience with our VR experiments. We wanted to create an experience where the Leviathan whale could feel as if it was flying around the real world and serve as an introduction to the VR narrative. To view the flying whale we utilized Intel Ultrabooks (laptop/tablet hybrids) and software from Intel’s Perceptual Computing Group, now known as RealSense, to handle the motion tracking of the Ultrabooks. The Ultrabooks worked well for individual viewers, but we wanted to add another POV for the rest of the audience, so we used a video camera on a tripod and real-time compositing software to display the whale flying over the audience on a large projection screen.
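At the core of this kind of AR compositing is projecting a virtual object’s world position into the tracked camera’s image so the render can be overlaid on the live feed. The following is a minimal sketch of that pinhole projection, assuming a camera tracked by position and yaw only; the function name and parameters are illustrative, not taken from the actual Leviathan pipeline.

```python
import math

def project_point(point_world, cam_pos, cam_yaw_deg, fx, fy, cx, cy):
    """Project a world-space point into pixel coordinates for a camera
    at cam_pos, rotated cam_yaw_deg about the vertical axis.
    Hypothetical helper -- a sketch, not the production code."""
    # Translate the point into camera-centered coordinates
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    # Rotate by the camera yaw so the camera looks down +z
    yaw = math.radians(cam_yaw_deg)
    xc = math.cos(yaw) * x - math.sin(yaw) * z
    yc = y
    zc = math.sin(yaw) * x + math.cos(yaw) * z
    if zc <= 0:
        return None  # point is behind the camera; nothing to composite
    # Standard pinhole projection: focal lengths fx/fy, principal point cx/cy
    u = fx * (xc / zc) + cx
    v = fy * (yc / zc) + cy
    return (u, v)
```

With the camera pose updated every frame from tracking data, the whale’s rendered frame can be drawn at the projected pixel location over the live video. A full pipeline would use a complete rotation (pitch and roll as well) and lens distortion, but the projection step is the same idea.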
These experiments were well received, and in the fall of 2013 Intel elected to showcase the work at CES 2014. We took our findings from the World Building Media Lab and assembled a team of professionals at 5D Global Studio to help polish and prepare the content for the CES showcase. The end product involved three AR experiences in total. The first experience, named “Spotlight”, ran as part of the Intel booth in the exhibition hall. Spotlight involved 15 Ultrabooks networked together that attendees could walk up to and use to watch the Leviathan whale fly off the screen and into the booth area, as well as use the touchscreens to interact with flying jellyfish creatures named Huxleys that flew in tandem with the Leviathan.
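Keeping 15 networked devices showing the same whale requires each client to receive a shared, up-to-date copy of its state. A minimal way to do that is to broadcast a small fixed-size state packet each frame and have every client unpack it locally. The wire format below is hypothetical, a sketch of the general approach rather than the format we actually used.

```python
import struct

# Hypothetical wire format: position (x, y, z) plus heading in radians,
# packed as four little-endian 32-bit floats (16 bytes per update).
WHALE_STATE = struct.Struct("<ffff")

def pack_state(x, y, z, heading):
    """Serialize one frame of whale state for broadcast to clients."""
    return WHALE_STATE.pack(x, y, z, heading)

def unpack_state(payload):
    """Deserialize a received state packet back into (x, y, z, heading)."""
    return WHALE_STATE.unpack(payload)
```

Each client would then interpolate between the last two received states to keep the whale’s motion smooth even if a packet arrives late or is dropped.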
The following video shows off some of this work:
The third AR experience was part of the Intel Keynote and again involved flying the Leviathan whale over the audience. Instead of compositing just one audience camera POV, we mapped out five different cameras set up around the Keynote floor. This event is shown in the following video:
In terms of the artwork for CES, I helped directly with modeling and texturing the Leviathan, and creating shaders for the Huxleys.
The summer of 2014 involved a return to the World Building Media Lab with a team of students to again explore interactive narratives in VR. We recruited a highly diverse group of students to engage in a world build for the first half of the summer, followed by developing a multi-threaded script, theatrical rehearsal, virtual prototyping, final production, and user testing. We integrated motion-tracked physical props to allow viewers to reach out and touch the virtual environment. We also experimented with shifting a VR viewer’s point of view between a first-person human POV, third-person virtual camera POVs where viewers could see themselves from above, and a first-person disembodied viewpoint where the viewer assumes the point of view of a flying Huxley.
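Mechanically, shifting POV amounts to re-parenting the VR camera to a different pose source each time the mode changes. A minimal sketch of that dispatch, with hypothetical names not drawn from our actual build:

```python
from enum import Enum

class POVMode(Enum):
    FIRST_PERSON = "first_person"  # camera follows the viewer's tracked head
    THIRD_PERSON = "third_person"  # fixed virtual camera looking down at the viewer
    HUXLEY = "huxley"              # disembodied view from a flying Huxley

def camera_pose(mode, head_pose, overhead_pose, huxley_pose):
    """Return the pose the VR camera should adopt for the given POV mode.
    Poses are opaque (position, rotation) values; names are illustrative."""
    if mode is POVMode.FIRST_PERSON:
        return head_pose
    if mode is POVMode.THIRD_PERSON:
        return overhead_pose
    if mode is POVMode.HUXLEY:
        return huxley_pose
    raise ValueError(f"unknown POV mode: {mode}")
```

In practice each transition would also be eased over a few frames, since an instant camera jump in VR is disorienting.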
This video shows how users could interact with these props, including a physical chair recreated virtually using photogrammetry.
Again, the innovative work of the student team was well received, and Intel and Unity Technologies sponsored a team at 5D Global Studio to take the research and polish it for the 2016 Sundance New Frontier exhibition. For this phase I handled the Lead Design duties to craft a 7-minute experience where viewers with no prior knowledge could enter the world, be led through a story by virtual characters, and achieve the goal of crafting their own genetically fabricated Huxley jellyfish.
The following video is a sizzle reel of this work:
This video is a real time playthrough of the complete VR experience where viewers were able to interact with motion tracked physical props within the interactive narrative world:
Here is a behind the scenes look at the creation of the experience: