For this research project, I coordinated an expert workshop in NYC, designed the agenda, and worked with a team to analyze and synthesize the output into the following report on what IFTF calls ‘Ambient Communication’.
You can read the full report HERE.
UX Research Manager
For this public report, I worked with our forecasting and research teams to distill the most compelling and important ideas from a series of expert workshops into a map and report on the new affordances and capacities for automation, and how they are likely to impact everyday life in the coming decade.
You can read the full report HERE.
Simtainer is a VR experience I built about the importance of adaptable infrastructure, specifically underutilized shipping containers. When users put on the headset, they are taken on a journey through shipping containers repurposed as housing, medical clinics, and farms to meet our ever-changing infrastructure needs. Simtainer will be showcased at the United Nations General Assembly in October.
SIMTAINER@TYF17 (Gray Area Fest Edit) from Toshi Hoo on Vimeo.
The Virturaunt (Virtual restaurant) project explored how food can be incorporated into VR to better understand the cognitive processes behind eating.
I built an immersive restaurant environment in Unity and populated it with sample food items; players could then select the virtual food items they would like to ‘eat’. At the press of a button, an animation showed the selected food item flying slowly toward, and then under, the player camera, mimicking what happens when we put food in our mouths. Simultaneously, a volunteer administrator fed the player real food.
This created the opportunity for managed cognitive dissonance: a player could select the virtual 3D model of an orange, watch that orange fly under the player camera, and then be fed a piece of lime or grapefruit, confronting them with the challenge of recognizing the mismatch.
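The original Unity build isn’t included here, but the fly-to-mouth animation boils down to interpolating the selected item’s position toward a point just under the camera. A minimal Python sketch of that path logic (the function names, step count, and coordinates are all hypothetical, not the actual build):

```python
def lerp(a, b, t):
    """Linearly interpolate each coordinate of point a toward point b by fraction t."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def eat_animation_path(item_pos, camera_pos, steps=30):
    """Yield positions moving a food item toward, then just under, the camera.

    The item ends slightly below the camera so it slides out of view,
    mimicking what happens when we put food in our mouths.
    """
    under_camera = (camera_pos[0], camera_pos[1] - 0.3, camera_pos[2])
    for i in range(1, steps + 1):
        yield lerp(item_pos, under_camera, i / steps)

# Example: an orange on the table flies to a point just below the player's eyes.
path = list(eat_animation_path(item_pos=(0.0, 1.0, 2.0), camera_pos=(0.0, 1.6, 0.0)))
print(path[-1])  # final position sits 0.3 units below the camera
```

In the Unity build this same idea would be a per-frame position update rather than a generator, but the interpolation is the whole trick.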
RESULTS: Across 20 tests, 13 players failed to realize they were being fed grapefruit instead of orange, 11 failed to realize they were being fed lime instead of lemon, and 10 failed to realize they were being fed apple instead of pear. It was far easier to fool players on taste than on texture.
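Reading those results as 13, 11, and 10 fooled out of 20 tests each, the success rates of the swaps work out as follows:

```python
# Confusion rates from the Virturaunt tests, read as fooled-out-of-20.
results = {
    "grapefruit for orange": 13,
    "lime for lemon": 11,
    "apple for pear": 10,
}
for swap, fooled in results.items():
    print(f"{swap}: {fooled}/20 players fooled ({fooled / 20:.0%})")
```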
The point of this exercise was to explore how reality and virtuality can be integrated around their respective strengths: imagine a real restaurant where you wear an Oculus Rift during the meal to focus your attention on the food, allowing you to savor it more completely. Imagine if these virtual environments could be customized to individual preferences. The same technology could aid portion control and food perception, and could be applied to taste-testing and to measuring, statistically, the impact of ambience on the restaurant experience.
Cool stuff!
If you’re interested in a demo or to download my Unity build, contact me on Twitter (at)asaulgoldman.
In my 9-5 at IFTF (Institute for the Future), I design workshops, exercises, and games to help some of the world’s largest companies think productively about the future. I also research and lead projects on the Future of Work and skills training for foundations and governments. In addition to extensive proprietary research, I publish public work and organize public events. Here is an (updated) short list of public posts and public research I’ve worked on.
3 Invaluable Work Skills for 2018
Public Talk at Burning Man Global Leadership Convention
Co-running a Future of Museums Hackathon at the Metropolitan Museum
Running an event to explore the forefront and future of AR/VR
Launching our online interactive forecasting map (interaction designer)
If you’re curious about any of these projects or my other work, reach out to me (at)asaulgoldman.
During his final semester at CMU, Alex conducted extensive research into how children and adults successfully learn first and second languages. Months of research produced a testable hypothesis: spatial memory can be leveraged as a platform to expedite the process of learning vocabulary. In other words, if you label things in a virtual space, it helps people learn the names of those things quickly. The other major insight from the research was that children learn languages by ‘guessing’. Young children constantly hear new words, guess what they mean, and then play around with those guesses to ‘test them out’. In other words, being told whether you’re right or wrong when learning a language runs counter to how our brains learn one the first time around.
To test these hypotheses, Alex built a role-playing game that integrates real videos into a Click-to-Explore virtual world. Players encountered multiple parallel quests that could be resolved in a variety of orders. To solve the quests, players had to guess the meanings of the words used by NPCs (non-player characters), with the help of visual subtitles.
His work was featured in the 2013 Interaction Design for Children conference.
In a game design course in 2013, Alex and his classmates decided to push the boundaries of how we think about the Microsoft Kinect. The end result: an entirely new interface. Umbra: Beyond Shadows uses 2 Kinects, 2 projectors, and a semi-transparent screen to create a physical game environment akin to a volleyball court: 2 teams on 2 sides of a flat surface. In Umbra, the rear-mounted projectors cast intentional shadows on the screen, which stood in for player avatars. The game leveraged extensive research into the relationship between avatar realism and how players treat one another. Umbra succeeded in making players feel uncomfortable expressing outwardly antagonistic behavior toward their highly realistic avatar opponents.
Umbra: Beyond Shadows was a finalist in the CHI 2013 Student Game Design Competition under the category of Innovative Interface. Read the 2 page paper here.
As part of a semester-long project, Alex produced and created the 2D art for “Know Weather, No Problem”, a client weather app designed for the microclimates of Madeira, Portugal. The app used pop-up-book playfulness to bring gameful interactions to the unspeakably dry climate of weather apps.
SkyGods of Magmarock was a two-week project built across 2 networked Jam-O-Drums. 2 teams of 4 compete in a straightforward capture-the-flag adaptation with 2 minor twists: first, the game uses a fog-of-war perspective, so each team can only see within a narrow radius around its units; second, all players can slow down opponents with a fire-breathing attack. This simple combination enables powerful cooperative play without complicated rules; anyone who’s played capture the flag instinctively understands it.
Alex played the role of texture artist and interaction designer on this project.
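As a side note on the fog-of-war mechanic: each team only sees what lies within a fixed radius of its own units. A minimal Python sketch of that visibility test (the radius, names, and coordinates are hypothetical, not the game’s actual values):

```python
import math

def visible(tile, team_units, radius=3.0):
    """A tile is revealed if any of the team's units is within the vision radius."""
    return any(math.dist(tile, unit) <= radius for unit in team_units)

units = [(2, 2), (10, 5)]
print(visible((3, 3), units))  # near the unit at (2, 2), so revealed
print(visible((6, 9), units))  # outside every unit's radius, so hidden
```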
Power UP! Madeira was a semester-long project involving extensive research. The goal of the project was to build an interactive multiuser exhibit for Casa da Luz, Madeira’s museum about science, electricity, and power generation. Our solution was an RTS in which players manage their own resources (Money and Electricity) in order to build the island’s shared network of power stations. The major challenge is in balancing renewable energy, which is clean and cost-effective after construction but expensive to build and unreliable, with conventional natural gas power, which is cheap to build and reliable but generates pollution and costs money to supply.
Our resulting four-player game, built on a handmade touchscreen, is a resource management simulation in which money, public satisfaction, pollution, and electricity must be balanced to meet the needs of a demanding populace.
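The core economic tension can be illustrated with a toy cost model (the numbers below are hypothetical, not the exhibit’s actual balancing values): gas is the cheaper choice over a short horizon, while the renewable plant’s zero fuel cost wins out over a long game.

```python
def total_cost(build_cost, fuel_cost_per_turn, turns):
    """Cumulative cost of building and running a plant for a number of turns."""
    return build_cost + fuel_cost_per_turn * turns

# Hypothetical plants: wind is expensive up front but free to run;
# gas is cheap to build but burns money (and fuel) every turn.
WIND = {"build_cost": 100, "fuel_cost_per_turn": 0}
GAS = {"build_cost": 30, "fuel_cost_per_turn": 5}

for turns in (5, 20, 40):
    wind = total_cost(turns=turns, **WIND)
    gas = total_cost(turns=turns, **GAS)
    print(f"after {turns} turns: wind={wind}, gas={gas}")
```

In the exhibit itself this tradeoff is further complicated by reliability and pollution, but the crossover in cumulative cost is what players have to discover.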