SIMTAINER: An Immersive VR Futures Narrative (at the United Nations)

Simtainer is a VR experience I built about the importance of adaptable infrastructure, specifically underutilized shipping containers. When users put on the headset, they are taken on a journey through shipping containers repurposed as housing, medical clinics, and farms to meet our ever-changing infrastructure needs. Simtainer will be showcased at the United Nations General Assembly in October.

 

Video: SIMTAINER@TYF17 (Gray Area Fest Edit) from Toshi Hoo on Vimeo.

Virturaunt: Oculus Rift Restaurant Simulator

The Virturaunt (virtual restaurant) project explored how food can be incorporated into VR in order to better understand the cognitive processes behind eating.

I built an immersive restaurant environment in Unity and populated it with sample food items, giving players/participants the opportunity to select the virtual food items they would like to ‘eat’.  Upon the press of a button, an animation played that showed the selected food item flying slowly toward, and then under, the player camera, mimicking what happens when we put food in our mouths.  Simultaneously, a volunteer administrator fed the player real food.
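The Unity scripts themselves aren’t posted here, but the core of the eating animation is just an interpolation of the selected item toward and then below the camera. Here’s a minimal, engine-agnostic sketch in Python; the positions, timing, and easing are illustrative assumptions, not the actual project code:

```python
def eat_animation_positions(item_start, camera_pos, duration_s=1.5, fps=60):
    """Yield the virtual food item's position each frame as it drifts slowly
    toward the camera and then dips just below it, mimicking the path food
    takes toward the mouth. Positions are (x, y, z) tuples in metres."""
    frames = int(duration_s * fps)
    end = (camera_pos[0], camera_pos[1] - 0.3, camera_pos[2])  # finish just under the view
    for f in range(frames + 1):
        t = (f / frames) ** 2  # ease in: the item accelerates gently toward the player
        yield tuple(s + (e - s) * t for s, e in zip(item_start, end))

# Example: an orange on the table drifts toward a head-height camera.
for pos in eat_animation_positions(item_start=(0.0, 0.9, 1.2), camera_pos=(0.0, 1.6, 0.0)):
    pass  # in the real build each position would be applied to the item's transform each frame

print("final position:", pos)
```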

 

Screenshot from the Unity build

This created the opportunity for managed cognitive dissonance.  A player selects the virtual 3D model of an orange, watches that orange fly under the player camera, and is then fed a piece of lime or grapefruit, leaving them to confront the mismatch between what they saw and what they tasted.

RESULTS: Across 20 tests, 13 players failed to realize they were being fed grapefruit instead of orange, 11 failed to realize they were being fed lime instead of lemon, and 10 failed to realize they were being fed apple instead of pear.  It was far easier to fool players on taste than on texture.

The point of this exercise was to explore how reality and virtuality can be integrated around their respective strengths: imagine a real restaurant where you wear an Oculus Rift during the meal to focus your attention on the food, allowing you to savor it more completely, or virtual environments customized to individual preferences.  The same technology could aid in portion control and food perception, and could be applied to taste-testing or to studying, statistically, the impact of ambience on the restaurant experience.

Cool stuff!

If you’re interested in a demo or to download my Unity build, contact me on Twitter (at)asaulgoldman.

Pirate Island: A Language-Learning Game (Published in IDC)

During his final semester at CMU, Alex conducted extensive research into how children and adults successfully learn first and second languages.  The result of months of research was the testable hypothesis that spatial memory could be successfully leveraged as a platform to expedite the process of learning vocabulary.  In other words, labeling things in a virtual space helps people learn the names of those things quickly.  The other major insight from the research was that children learn languages by ‘guessing’. Young children are constantly hearing new words, guessing what they mean, and then playing around with those guesses to ‘test them out’.  In other words, being told whether you are right or wrong when learning a language runs counter to how our brains learn one the first time around.

To test these hypotheses, Alex built a role-playing game that integrates real videos into a click-to-explore virtual world.  Players encountered multiple parallel quests that could be resolved in a variety of orders.  To solve the quests, players had to guess the meanings of the words used by NPCs (non-player characters), with the help of visual subtitles.
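To make the ‘guess and test’ loop concrete, here is a tiny hypothetical sketch of how a quest can confirm or contradict a guess without ever telling the player they are right or wrong. The quests, items, and NPC lines below are invented for illustration, not taken from the game:

```python
# Illustrative sketch of the guess-and-test vocabulary loop.

QUESTS = {
    "fetch_the_mango": {"needed_item": "mango", "npc_line": "Tráeme un mango, por favor."},
    "find_the_key":    {"needed_item": "llave", "npc_line": "Necesito mi llave."},
}

def attempt_quest(quest_id, item_player_brought):
    """The player hears an NPC line, guesses what the unknown word means,
    and tests that guess by bringing an item. The game never says 'right'
    or 'wrong'; the quest simply resolves, or the NPC repeats the request."""
    quest = QUESTS[quest_id]
    if item_player_brought == quest["needed_item"]:
        return f"Quest '{quest_id}' resolved; the guess held up in play."
    return f"NPC repeats, with a visual cue: {quest['npc_line']}"

print(attempt_quest("fetch_the_mango", "mango"))   # correct guess: the quest resolves
print(attempt_quest("find_the_key", "mango"))      # wrong guess: another chance to revise it
```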

His work was featured at the 2013 Interaction Design and Children (IDC) conference.

 


Umbra: Beyond Shadows (CHI finalist)

In a game design course in 2013, Alex and his classmates decided to push the boundaries of how we think about the Microsoft Kinect.  The end result: an entirely new interface. Umbra: Beyond Shadows uses 2 Kinects, 2 projectors, and a semi-transparent screen to create a physical game environment akin to a volleyball court: 2 teams on 2 sides of a flat surface.  In Umbra, the rear-mounted projectors cast intentional shadows onto the screen, and those shadows stood in for player avatars.  The game leveraged extensive research into the relationship between avatar realism and how players treat one another, and it succeeded in making players uncomfortable expressing outwardly antagonistic behavior toward their highly realistic shadow-avatar opponents.
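The paper has the details of the actual setup; purely as a hypothetical illustration of how a depth camera can produce a shadow-like silhouette (which may or may not be how Umbra’s Kinects and projectors divide the work), a player standing within a depth band can be reduced to a binary mask. The thresholds and toy depth frame below are assumptions:

```python
def depth_frame_to_shadow_mask(depth_frame_mm, near_mm=800, far_mm=2500):
    """Convert a 2D grid of depth readings (millimetres) into a binary
    silhouette mask: 1 where a player's body is, 0 for background."""
    return [
        [1 if near_mm <= d <= far_mm else 0 for d in row]
        for row in depth_frame_mm
    ]

# Tiny fake 4x6 depth frame: one player standing roughly 1.5 m from the sensor.
frame = [
    [4000, 4000, 1500, 1500, 4000, 4000],
    [4000, 1500, 1500, 1500, 1500, 4000],
    [4000, 4000, 1500, 1500, 4000, 4000],
    [4000, 4000, 1500, 1500, 4000, 4000],
]
for row in depth_frame_to_shadow_mask(frame):
    print("".join("█" if v else "·" for v in row))
```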

Umbra: Beyond Shadows was a finalist in the CHI 2013 Student Game Design Competition under the category of Innovative Interface. Read the 2-page paper here.

 

SkyGods of Magmarock

SkyGods of Magmarock was a 2-week project built across 2 networked Jam-O-Drums.  Two teams of 4 compete in a straightforward capture-the-flag adaptation with 2 minor twists: first, the game uses a fog-of-war perspective, so each team can only see within a narrow radius around its own units; second, all players can slow down opponents with a fire-breathing attack.  This simple combination enables powerful cooperation without complicated rules; anyone who’s played capture the flag understands it instinctively.
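In engine-agnostic terms, the two twists boil down to a visibility-radius check and a speed multiplier. A toy sketch, with made-up radii and speeds rather than the game’s actual tuning:

```python
import math

VISION_RADIUS = 3.0   # each team sees only this far around its own units
SLOW_FACTOR = 0.5     # the fire-breath attack halves an opponent's speed

def visible_to_team(team_unit_positions, point):
    """Fog of war: a point is visible only if some friendly unit is close enough."""
    return any(math.dist(unit, point) <= VISION_RADIUS for unit in team_unit_positions)

def apply_fire_breath(target_unit):
    """Slow an opponent hit by the fire-breathing attack."""
    target_unit["speed"] *= SLOW_FACTOR
    return target_unit

team_a = [(0.0, 0.0), (2.0, 1.0)]
print(visible_to_team(team_a, (4.5, 1.0)))   # just inside one unit's vision radius
runner = {"name": "flag carrier", "speed": 2.0}
print(apply_fire_breath(runner))             # slowed to 1.0
```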

Alex played the role of texture artist and interaction designer on this project.

Power UP! Madeira

Power UP! Madeira was a semester-long project involving extensive research.  The goal was to build an interactive multiuser exhibit for Casa da Luz, Madeira’s museum about science, electricity, and power generation.  Our solution was an RTS in which players manage their own resources (money and electricity) in order to build the island’s shared network of power stations.  The major challenge lies in balancing renewable energy, which is clean and cost-effective after construction but expensive to build and unreliable, against conventional natural gas power, which is cheap to build and reliable but generates pollution and costs money to supply.

Our resulting four-player game, built on a handmade touchscreen, is a resource management simulation in which money, public satisfaction, pollution, and electricity must be balanced to meet the demands of a populace with diverse needs.
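A toy per-turn update can capture the trade-off at the heart of the game: renewables cost a lot up front but are clean and nearly free to run, while gas is cheap to build but pollutes and burns money every turn. All of the numbers below are illustrative assumptions, not the exhibit’s actual tuning:

```python
PLANTS = {
    "wind": {"build_cost": 120, "fuel_cost": 0,  "pollution": 0, "output": 40, "reliability": 0.6},
    "gas":  {"build_cost": 40,  "fuel_cost": 25, "pollution": 8, "output": 60, "reliability": 0.95},
}

def run_turn(state, built_plants, demand):
    """Advance one turn: pay fuel, generate reliability-weighted electricity,
    accumulate pollution, and adjust public satisfaction against demand."""
    produced = 0.0
    for kind in built_plants:
        plant = PLANTS[kind]
        state["money"] -= plant["fuel_cost"]
        state["pollution"] += plant["pollution"]
        produced += plant["output"] * plant["reliability"]   # expected output this turn
    state["electricity"] = produced
    shortfall = max(0.0, demand - produced)
    state["satisfaction"] -= shortfall * 0.5 + state["pollution"] * 0.1
    return state

state = {"money": 300, "electricity": 0.0, "pollution": 0, "satisfaction": 100.0}
for kind in ["wind", "gas"]:
    state["money"] -= PLANTS[kind]["build_cost"]   # pay construction costs up front
print(run_turn(state, ["wind", "gas"], demand=90))
```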

Project V.E.S.S.E.L.

Project V.E.S.S.E.L. was an incredibly ambitious 3-week project involving the creation of an entirely immersive virtual and physical world.  Our solution combined 2 PSMoves, 2 Wiimotes, and a Jam-O-Drum to simulate the deck of a ship shrunk down to fight viruses within the human body.  The pilot steers using a helm and accelerator, and 2 gunners use their PSMoves to fire on malicious infections.  A navigator, meanwhile, has a top-down view of the world and directs the pilot through the complex maze of vessels to find and destroy the enemy; the navigator’s contribution to the team lies entirely in sharing information and making navigational recommendations.
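Conceptually, the roles all feed into one shared ship state, each touching a different slice of it. A minimal sketch of that routing; the device handling, field names, and values are assumptions for illustration, not the project’s actual code:

```python
ship = {"heading": 0.0, "throttle": 0.0, "viruses_destroyed": 0}

def handle_input(role, event, ship_state):
    """Each role affects a different slice of the shared ship state."""
    if role == "pilot":                      # helm + accelerator
        ship_state["heading"] += event.get("turn", 0.0)
        ship_state["throttle"] = event.get("throttle", ship_state["throttle"])
    elif role == "gunner":                   # PSMove aim-and-fire
        if event.get("hit"):
            ship_state["viruses_destroyed"] += 1
    elif role == "navigator":                # top-down map: advice only, no direct control
        print("Navigator:", event.get("callout", ""))
    return ship_state

handle_input("pilot", {"turn": 15.0, "throttle": 0.8}, ship)
handle_input("gunner", {"hit": True}, ship)
handle_input("navigator", {"callout": "Take the left branch at the next junction."}, ship)
print(ship)
```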

Alex served as producer, and designed and built the set.

Zen Canyon Paragliding (Adobe Design semi-finalist)

Zen Canyon Paragliding was a 2-week collaboration during the course Building Virtual Worlds.  Alex played the roles of texture artist, producer, and interaction designer.  The game hinges on the player’s use of a paragliding apparatus as both an immersion-building component and a control tool: the game uses the Kinect for Windows, and the player pulls down on the paragliding steering cords in order to turn.  The apparatus builds on the player’s natural control language, so even though most players have never been paragliding, a very simple set of instructions is enough to explain how to navigate the world.
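The steering mapping itself can be boiled down to comparing the heights of the two hands and turning toward whichever cord is pulled lower. A small sketch, with made-up thresholds and gains rather than the game’s real tuning:

```python
def steering_from_hands(left_hand_y, right_hand_y, deadzone=0.05, gain=45.0):
    """Return a turn rate in degrees/second from the vertical difference
    between the two hands (positive = turn right)."""
    diff = left_hand_y - right_hand_y   # pulling the right cord lower -> positive diff
    if abs(diff) < deadzone:
        return 0.0                      # hands level: fly straight
    return max(-1.0, min(1.0, diff)) * gain

print(steering_from_hands(1.20, 1.20))  # level hands, no turn
print(steering_from_hands(1.30, 1.05))  # right cord pulled down, turn right
```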

The game world is also designed to aid novice gamers.  Our task for this assignment was to build a game that naive users, many of whom have no experience with virtual worlds whatsoever, could play and win the first time.  In Zen Canyon, a constant wind pulls players in the right direction through a series of branching and then converging paths.

For this piece, our team was named a semi-finalist in the Adobe Design Achievement Awards in the installation category.

Fish & Ships

The Fish & Ships game was produced over the course of a single week by a team including Alex, 2 programmers, and a 3D modeler.  Alex filled the roles of interaction designer, producer, and texture artist.

The core experiment in Fish & Ships is a new interaction style: how can natural user interfaces (in this case, the PSMove) be integrated into a larger, multiuser environment, and what novel opportunities exist for multiuser interactions built around teamwork and collaboration?  Previous BVW (Building Virtual Worlds) projects have experimented with crowd control techniques and technologies, and Fish & Ships takes a similar approach up to a point.  Where it diverges is in treating the audience as a single superorganism, in which individuals have more or less control and impact during different periods of gameplay.  This is achieved by having players pass the PSMove between each other in order to keep their school of fish alive.  Fitting with the theme of the game, teamwork is necessary for survival.
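One way to picture the superorganism mechanic is as a single control token that circulates through the audience: whoever holds the PSMove steers for a short window and then passes it on. The sketch below is a guess at that structure, with invented names and timings, not the project’s actual implementation:

```python
import time
from collections import deque

class ControlToken:
    """Tracks who currently holds the single controller and when it should move on."""
    def __init__(self, audience, hold_seconds=10):
        self.queue = deque(audience)
        self.hold_seconds = hold_seconds
        self.holder = self.queue[0]
        self.handed_at = time.monotonic()

    def maybe_pass(self):
        """Once the hold window expires, prompt a pass to the next audience member."""
        if time.monotonic() - self.handed_at >= self.hold_seconds:
            self.queue.rotate(-1)
            self.holder = self.queue[0]
            self.handed_at = time.monotonic()
            print(f"Pass the PSMove to {self.holder}!")
        return self.holder

token = ControlToken(["seat 1", "seat 2", "seat 3"], hold_seconds=0)
token.maybe_pass()   # with a zero-second hold, this immediately prompts a pass
```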

The fascinating finding from playtesting was that players didn’t actually need to hold a controller in order to feel agency within the game; simply by encouraging the audience members who held the PSMove to pass it elsewhere, players found enough reward to want to “play again”, even though some never touched the controller at all during the initial period of play.