Embodied Interaction Model
Storyboards | Journey Maps | Prototypes | Script Writing
An interaction model for a hypothetical virtual reality application that helps users confront their fear of public speaking, leveraging room-scale UI patterns to prepare users physically and mentally for that confrontation.
"The right to breathe, the right to be physically unashamed, to fully vocalise, to need, choose and make contact with a word, to release a word into space - the right to speak."
-Patsy Rodenburg, "The Right to Speak: Working with the Voice"
The right to breathe...
Text Neck GUIs Collapse User's Chests/Lungs
I started by researching existing UI patterns in the VR public-speaking landscape. I immediately noticed that graphical user interfaces anchored to the controllers had a text-neck-like effect on users, who were collapsing their chests and lungs. That posture works against users confronting their fear, and it also creates deficient speaking habits. Mid-tier head-mounted displays are heavy (roughly a pound of added weight on the head) and literally pull the head down throughout the session.
My recommendation is to move the interface off the controller and up to a point in the sky. To give you a sense of what this might feel like, I've been referring to this pattern as "Sky Writing". It mechanically asks users to stand up and look out toward the horizon, a visualization used by voice coaches to empower students in self-assessment of their posture and breathing.
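As a rough sketch of how a "Sky Writing" panel could be positioned, here is a small geometric helper. Everything in it is an illustrative assumption, not from the project: the function name, the ~10-degree elevation, and the 8-metre distance are all hypothetical values chosen to pull the user's gaze up and out toward the horizon.

```python
import math

def sky_writing_anchor(head_pos, yaw_deg, elevation_deg=10.0, distance=8.0):
    """Place a UI anchor ahead of the user, slightly above eye level.

    head_pos: (x, y, z) of the headset in metres (y is up).
    yaw_deg: the direction the user is facing, in degrees.
    elevation_deg: how far above the horizon the panel sits; a modest
        angle (~10 degrees, an assumed value) encourages an upright,
        open-chested posture without straining the neck.
    distance: metres from the user to the panel (also assumed).
    """
    yaw = math.radians(yaw_deg)
    elev = math.radians(elevation_deg)
    # Offset along the facing direction, tilted up by the elevation angle.
    dx = math.cos(elev) * math.sin(yaw) * distance
    dz = math.cos(elev) * math.cos(yaw) * distance
    dy = math.sin(elev) * distance
    return (head_pos[0] + dx, head_pos[1] + dy, head_pos[2] + dz)
```

Because the anchor is computed from the head pose rather than the controller, the interface stays "in the sky" wherever the user turns, rather than dragging their gaze down into their chest.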
The right to be physically unashamed...
All the users I interviewed mentioned this moment of deep anxiety surrounding "just standing" on stage, waiting their turn to speak.
They described becoming acutely aware of their bodies: their sweatiness, their hands, their minds running over their own speech. Performance anxiety is not only about speaking; it also includes performing tasks, or simply standing, in front of others.
To fully vocalize...
Get the user talking!
"There's a friend of mine I think you'd like to meet. They're just over the hill there. When you're ready to meet them, call out, 'Hey!'"
The headphones make it difficult to hear yourself talk.
This experience provides users with an early diagnostic of their own volume level, demonstrating how far their voice has traveled through space. It also mechanically requires that users begin confronting their own voice and projection. And at the end of the experience, the user has conjured up a friend with their voice.
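One way the volume diagnostic could work is to map the microphone's signal level to a visualized distance in the scene, so louder calls visibly carry farther. A minimal sketch, with the caveat that the function name, reference level, and 40 dB scaling are hypothetical values for illustration:

```python
import math

def voice_travel_distance(rms, reference_rms=0.01, max_distance=50.0):
    """Map a microphone RMS level to a visualized travel distance.

    Louder input maps logarithmically (like decibels) to a farther
    point in the scene, so users can see how far their "Hey!" carried.
    reference_rms: the quietest level that registers at all (assumed).
    max_distance: metres to the friend over the hill (assumed).
    """
    if rms <= reference_rms:
        return 0.0
    db_above_ref = 20.0 * math.log10(rms / reference_rms)
    # Assume ~40 dB above the reference is enough to reach the friend.
    return min(max_distance, max_distance * db_above_ref / 40.0)
```

The logarithmic mapping matters: perceived loudness is roughly logarithmic, so a linear amplitude-to-distance mapping would make quiet and moderate voices look nearly identical.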
To need, choose and make contact with a word...
"I want to go home"
I recommended a safe zone that doubles as the landing screen: a place where the horizon literally breathes with you. Users can return to this space at any time during a session by saying, "I want to go home." This not only builds trust and a sense of safety within the virtual space, it also shows users the power of their own voice as they summon this self-teleportation to safety.
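The escape phrase behaves like a global interrupt: no matter what scene the user is in, recognizing it returns them to the safe zone. A minimal state-machine sketch of that idea, assuming a speech recognizer elsewhere supplies the transcript (class and scene names are hypothetical):

```python
class Session:
    """Minimal session state machine: any recognized escape phrase
    returns the user to the safe zone, from anywhere in the experience."""

    ESCAPE_PHRASES = {"i want to go home"}

    def __init__(self):
        # The landing screen doubles as the safe zone.
        self.scene = "safe_zone"

    def enter(self, scene):
        self.scene = scene

    def on_speech(self, transcript):
        """Called with each recognized utterance; returns True if the
        user was teleported back to safety."""
        if transcript.strip().lower() in self.ESCAPE_PHRASES:
            self.scene = "safe_zone"
            return True
        return False
```

Keeping the check in one place, rather than per-scene, is what makes the promise trustworthy: the phrase works everywhere, always.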
Welcome to the Safe Zone
Existing applications vary, but users often start immediately in a social space, sometimes one already populated with embodied avatars. Creating space for users to assess their emotions and beliefs is a critical moment for behavior change. Without it, where are our most vulnerable users left after these intentionally anxiety-inducing environments?
To release a word into space - the right to speak.
- Patsy Rodenburg, "The Right to Speak: Working with the Voice"
Prototyping for VR
Storyboarding & Interaction Testing
What they see, hear, think, feel, say.
I tracked the user's flow in storyboard moments, which I then constructed into scenes, environments, and available interactions.
I wanted to design this interaction model prioritizing a user whose social phobia (fear of public speaking) prevents them from overcoming the barriers to getting help.
Even when people do get help, they sometimes struggle with the parts of therapy that require imagination. VR gives us the potential to empower users by visualizing something their fear won't even let them imagine, like a successful social interaction.
2D Journey Maps
I first used these 2D journey maps to gain insight into the current user experience within existing applications. This helped me identify missed opportunities and places where this particular technology could be leveraged.
In this graphic, you'll notice one of the user journeys ends at a high point, which represents a high level of anxiety. Current applications abandon their most vulnerable users in a state of anxiety after the immersive experience.
I used these 2D experience maps to keep direction throughout the project, continually updating and supplementing them to keep them alive. To ensure the maps didn't go "flat" for this 3D experience, I also tracked sensory inputs and the spatial storyboard (which you can see in the final map, "Session 0").
Do it live!
If a user is present (has the feeling of being there) in the VR environment, their actions tend to mimic their real-life responses. These Wizard-of-Oz-style live tests were conducted using headphone audio through a phone call, and were used to evaluate users' willingness, comfort, and ease in performing embodied interactions: "enter the room", "take a bow", "wave", "say thank you and exit". One of my favorite insights from this research: if you put a baseball-sized object close to someone's face, they are going to try to grab it.
Just ask them.
Tacked onto the end of the session was a brief cool-down for users, during which I asked them to,
"Close your eyes."
Despite being out in a public space, all the users closed their eyes. I had spent the majority of my time exploring ways of guiding users, and this verified one of the simpler solutions: just ask them. Given the nature of therapeutic applications, users are intentionally there to better themselves. This willingness to participate, afforded to the application, shouldn't be ignored.
Building in the Medium
One of the challenges of designing for VR is the consideration of scale: how large spaces, objects, interfaces, and characters are. With these prototypes I focused on those questions of scale.
Designing for Real? Virtual? Space?
Mapping the experience in real space revealed a multitude of dimensions in the design. Namely, scaling for the physical play space in relation to the virtual space.
While the virtual space is potentially limitless, the physical play space is not. Mapping out these experiences forced the inevitable confrontation with locomotion: how are users moving between these virtual spaces? How do we best leverage the whole play space?
I mapped out user paths & interactions in space: discovering under-used areas, unintentionally repetitive interactions, and where I needed to reconsider full-body interactions.
For example, I initially had an experience where users came in and took a bow. I had placed this interaction too near the edge of the play space, which, when tested, I realized could result in someone ramming their head (and expensive headset) face-first into a wall.
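The bow-near-the-wall insight generalizes into a simple clearance check for placing full-body interactions in the play space. A sketch under assumed conventions (the play space is a centred rectangle, and ~1 metre of clearance covers a lean or bow; both numbers and the function name are illustrative):

```python
def safe_for_full_body(point, play_halfwidth, play_halfdepth, clearance=1.0):
    """Check whether a full-body interaction (e.g., taking a bow) placed
    at `point` leaves enough clearance from the play-space walls.

    point: (x, z) in metres, with the play space centred on the origin.
    play_halfwidth / play_halfdepth: half-extents of the play space.
    clearance: minimum distance to any wall (an assumed ~1 m margin).
    """
    x, z = point
    return (abs(x) <= play_halfwidth - clearance and
            abs(z) <= play_halfdepth - clearance)
```

Running every storyboarded interaction point through a check like this is a cheap way to catch head-into-wall placements before a user does.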
I presented the findings and recommendations by modelling the interactions to demonstrate their functionality.
Going forward, I am working to move my VR storyboard into an interactive prototype guided with audio.
References
Calm Technology: Principles and Patterns for Non-Intrusive Design. Amber Case, 2015.
An Experimental Study on Fear of Public Speaking Using a Virtual Environment.
Virtual Reality Exposure in the Treatment of Social Phobia. G. Riva, C. Botella, P. Légeron and G. Optale (Eds.), 2006.
Body Centred Interaction in Immersive Virtual Environments. Mel Slater and Martin Usoh, 2004.
Use of VR as therapeutic tool for behavioral exposure in the ambit of social anxiety disorder treatment. VR Lab Denmark, 2006.
Interpersonal Distance in Immersive Virtual Environments. University of California, Santa Barbara, 2003.