For those in a hurry, my resume is here.
I've always been fascinated by illusions, immersion, and atmospheric environments. I studied Fine Art, Animation, and VR at Iowa State University from 1998-2002. Towards the end of undergrad I shifted my focus from animation toward creating full experiences, and began experimenting with interactive installations and VR. After experiencing a CAVE projection room at VRAC, I volunteered there and spent time tinkering with visualizations of the Taj Mahal and animated humans.
During this time I was doing a lot of animation work and studying under the mentorship of Steven Herrnstadt. From 2001-2003 I worked with Steve and others at Micoy, helping shoot 360-3D video and composite animated content in 360. We were all excited by the potential of the technology, and worked passionately to make it as realistic as possible. We were a bit too early for the market; in those days the available VR headsets were expensive, with low-resolution displays and narrow fields of view. I realized that for the market to emerge, the technology needed to catch up. So off I went to graduate school.
From 2003-2006 I studied VR and Game Design at the University of Southern California's Interactive Media Division. There I studied under the mentorship of many great professors, including Scott Fisher, Mark Bolas, Perry Hoberman, and Tracy Fullerton. This was an incredibly inspiring time; many of us could feel a watershed moment approaching in the field of interactive media as phones became smart and VR headsets became cheaper and better. The central questions of the time were: how would phones change the way we live and interact with one another, and would VR enjoy a second renaissance after its false start in the 1990s?
During grad school I began working at the Institute for Creative Technologies. Described by Skip Rizzo as the "unholy alliance of Academia, Hollywood, and the Military," the ICT was not focused on military precision or efficiency, but rather on training service members to make better decisions and helping them adjust to life back home upon their return from deployments.
From 2003-2012 I worked on VR and Mixed Reality research at the ICT. My time was split between working with Skip Rizzo on VR therapy for post-traumatic stress, and working with Mark Bolas at the MxR Lab on mixed reality displays, headsets, and content. During this time I got my hands dirty in the trenches of production, creating interactive systems that had to be deployed to clinics across the country, and tinkering with large-scale motion capture environments where people could walk around an entire warehouse in a Fakespace 150-degree-FOV HMD. I was like a kid in a candy store, and gained invaluable insights into the human experience of XR (VR, Mixed Reality, and AR) by working alongside Skip and Mark.
Around 2011 Mark started an initiative to create a low-cost but high-fidelity HMD. Thanks to the economies of scale of cell phone components, LCD screens were becoming cheap and IMUs were becoming much better; this meant more resolution and less motion sickness. Wide-field-of-view optics were still difficult to make cheaply, but a cheap lens could handle 90 degrees of FOV. That fall, Mark, Thai Phan, Evan Suma, Palmer Luckey, and I put together a headset combining a cheap LCD, LEEP wide-field-of-view optics, and Phasespace motion tracking. When we put it on, we realized that a watershed moment for affordable consumer VR was approaching. It was just a matter of how fast.
We showed off the HMD at Sundance's New Frontier exhibition in 2012 with Nonny de la Peña's immersive journalism work. Attendees were amazed by what they saw. After the Oculus Rift Kickstarter, things accelerated even faster than any of us thought they would.
At this point I shifted my focus from technology to content. I jumped back over to the USC School of Cinematic Arts and the World Building Media Lab. From 2012-2015 I worked with Alex McDowell, who had shifted from his innovative work on films like Minority Report into VR and AR. At the WBML I helped lead research into world building for immersive interactive narratives. Our most in-depth project was Leviathan, which went through several incarnations spanning VR, AR, and Mixed Reality. We took it on the road twice, to CES and Sundance, each time pushing the boundaries of the technology and exploring new kinds of immersive audience experiences.
In 2015 I founded RareFaction Interactive with Adam Saslow, where we worked on an underwater badminton game, an unreleased VR music game, scientific visualization of the brain, and architectural visualization.
Since then I have been working as a freelance designer and tech artist, and have worked on a VR game about gardening called Fujii.