Homebound: The Interactive Immigrant Experience
-Funded Thesis Research Project-
Spring 2020
Homebound is an interactive virtual reality experience developed in Unreal Engine 4. Designed as a real-time simulator of the immigrant experience, Homebound is a visual and interactive narrative that puts viewers through the harsh realities of being an illegal immigrant. Homebound was funded by the Walter & Lalita Janke Emerging Technologies Fund seed grant.
2018-2019
Early research for this project was conducted by MFA students Alberto Alvarez and Brandon Martinez.
Mitchelville AR Tour 
-Research Assistant-
Professor Christopher Maraffi received a National Endowment for the Humanities (NEH) Digital Projects for the Public “Discovery” grant and a Walter & Lalita Janke Emerging Technologies Fund seed grant to begin design work on an augmented reality tour of Mitchelville, a historic site on Hilton Head Island, SC. Mitchelville was the first Freedmen’s town in the US during the Civil War and is a Gullah-Geechee heritage site today. We will be creating a 360° experience telling the story of America’s first efforts at civil rights for African Americans during the Reconstruction period, as it relates to the Port Royal Experiment, with life-sized historical figures such as Harriet Tubman, experienced on site through Magic Leap headsets and mobile phones.
Fall 2019
As a first proof of concept for developing content across various platforms, funded MFA students Alberto Alvarez, Brandon Martinez, and Ledis Molina, along with other graduate and undergraduate students of the School of Communication and Multimedia Studies, integrated various technologies to apply the emerging field of Virtual Production to a humanities project. The video below gives an introduction to Virtual Production and depicts scenes from a work-in-progress animation.
Summer 2019
Early testing of object recognition. This demonstration was used as a pitch for an AR app that recognizes museum pieces at MODS and, eventually, the Mitchelville site. Using ARKit 2, creators can store reference images and scanned objects for image and object recognition, then superimpose content for users to interact with.

Initial Object Scan for Unity.

A 3D object is instantiated based on recognition of the physical space.

The MODS diorama was previously scanned; the object spawns based on volume recognition.
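The ARKit 2 workflow described above (scan a physical object, then spawn content when it is recognized) can be sketched in native Swift roughly as follows. This is an illustrative sketch only: the resource group name "MuseumObjects" and the class name are hypothetical, not taken from the project, and the actual prototype used Unity rather than native ARKit.

```swift
// Hypothetical sketch of ARKit 2 object recognition: load previously scanned
// reference objects and superimpose content when one is detected.
import ARKit
import SceneKit

class MuseumARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // Load scanned reference objects (e.g., a museum diorama) from an
        // AR Resource Group in the asset catalog ("MuseumObjects" is illustrative).
        configuration.detectionObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "MuseumObjects", bundle: nil) ?? []
        sceneView.session.run(configuration)
    }

    // Called when a scanned object is recognized in the physical space;
    // attach virtual content to the anchor ARKit places on it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }
        let label = SCNText(string: objectAnchor.referenceObject.name, extrusionDepth: 1)
        let labelNode = SCNNode(geometry: label)
        labelNode.scale = SCNVector3(0.002, 0.002, 0.002)
        node.addChildNode(labelNode)
    }
}
```

In practice the reference objects are produced ahead of time with Apple's object-scanning session (`ARObjectScanningConfiguration`), which matches the "previously scanned" step noted in the captions above.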

Autonomous Car Study
-Research Assistant-

Professor Topher Maraffi received a Dorothy F. Schmidt College of Arts and Letters seed grant to begin working on an interdisciplinary project to develop a driving game simulation using the Magic Leap One headset that will be used to inform autonomous vehicle design (in collaboration with Computer Engineering faculty Dani Raviv, Hari Kalva, and Aleks Stevanovic). Their game simulation will track what highly ranked drivers do with their eyes and bodies to successfully maneuver a vehicle in an urban environment. 
Summer 2019
MTEn MFA students Alberto Alvarez and Brandon Martinez will be developing this project throughout the 2019-2020 academic year. The following video is a proof of concept demonstrating a use case for VR, and the image below depicts a Magic Leap prototype that identifies points of interest by tracking the user's eye movements while driving in a car simulation game.

Video Captured by Topher Maraffi

Magic Leap One On-Board Capture
