
Consent and Vulnerability in Mixed Reality: Exploration of ethics in emerging technology

fatmonky edited this page Sep 3, 2019 · 4 revisions

This article describes a week-long project exploring how psychological vulnerability plays a role in immersive experiences, and how consent acts as a switch that can either enhance or diminish that vulnerability. We briefly describe the project below, with some key pictures.

The project team consisted of Hiroki Sato, Peijing (“PJ”) Teh, Suniti Thapa, and Zuyi (Joey) Huang, during the Immersive Experiences course at Copenhagen Institute of Interaction Design (CIID)’s Interaction Design Programme (IDP) 2019, facilitated by instructors James Tichenor and Joshua Walton.

Most of this writeup is replicated (with nice pics) in this Medium article here. The detailed Unity Tutorial is also here in Medium.

Concept origin

In the first week’s exploration in our Immersive Experience class, a common denominator across a few projects (including the Knife Game, and World on My Shoulder) was how the participants felt a sense of vulnerability. Further, the participants shared that this sense of vulnerability immersed them even more into the experience. This motivated our team to explore designing for vulnerability in the second (final) week of the class.

From some initial research and reflection, we got a few insights:

  • In the Knife Game project, users felt insecure and vulnerable when asked to put their hands inside the wooden box containing the solenoids. One user was so frightened by the prospect of the virtual experience that she kept pulling her hand out before the climax of the experience; she only completed the whole experience on the third try! For some participants, not being able to see their hand, coupled with the virtual animation, made the sting feel sharper when the “knife” “stabbed” their finger.

A few key insights were also derived from this article:

  • With the removal of the Fourth Wall, users in immersive experiences find it harder to separate reality from virtual worlds.
  • Consent thus becomes very important, and should be asked at each step of the experience.
  • Vulnerability is created by a combination of different factors: physical factors (e.g. location of one’s limbs, what is visible or audible), social factors (e.g. whether your experience is determined by yourself, by the computer or by others) but also playing around with sequence, timing and phrasing of consent.

On the second day of the project, we bodystormed an existing storyboard, and tried to use consent. This initial story involved taking the participant’s hand, and turning it into a jellyfish which then explores a different universe.

Through bodystorming, we realised a few things:

  • Consent is a double-edged sword. It can create safety (“Do you want to proceed?”), but it can also create vulnerability (“Would you allow the rest of the audience to control your jellyfish/hand?”). We decided to focus our project on exploring consent as a means of creating both safety and vulnerability.
  • The transformation of the hand into a jellyfish didn’t work: it disrupted the immersion and caused a sense of “disownership” of the hand.

Project summary

We thus decided to frame the entire project as an experiment into how consent switches psychological vulnerability on and off. We also decided to go beyond a purely negative experience and add a more neutral/positive one, in the form of a “tickling” feather. To add a Milgram-esque twist to consent (after Stanley Milgram’s obedience experiments) that might increase vulnerability, we also added an element of “crowd control” to the immersive experience: at the end, if the participant gave their consent, the “stabbing” or “tickling” of their hand would be controlled by a member of the audience.

Project Construction

To construct the experience, we built the virtual/visual elements and the physical components, then connected them together using Spacebrew; everything was housed in a rough cardboard/foam enclosure. The detailed documentation can be found on GitHub here.

On the third day of the week, we built a miniature version of the whole setup, closing the loop from Unity -> Spacebrew -> Raspberry Pi, for one of the solenoids.

(PICTURE OF CLOSED LOOP)

Visual Elements

We used Unity 3D to create the visuals for the immersive experience. After drawing a rough concept of our key animation, we decided on the assets (a hand, a knife and a feather) and downloaded them from Google Poly. We created the hand-cutting and hand-tickling scenes, each with its own animations, in the Animation window (a tutorial on Unity animations can be found here). We then used Unity’s UI components to build the consent screens, a critical element of our concept for testing the impact of consent on the experience of vulnerability.

The detailed Unity tutorial with accompanying pictures is here.
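The consent screens effectively form a small state machine: each screen’s Yes/No answer routes the participant to the next scene. A minimal sketch of that routing logic in Python (the scene names here are invented for illustration and do not match the actual Unity scene names):

```python
# Illustrative model of a consent-gated scene flow. Scene names are
# hypothetical; in the project this routing happens via Unity buttons
# loading scenes, not via a Python table.

def next_scene(current, consented):
    """Return the next scene given the participant's Yes/No answer."""
    flow = {
        # (current scene, consent answer) -> next scene
        ("ConsentKnife", True): "KnifeScene",
        ("ConsentKnife", False): "ConsentFeather",
        ("ConsentFeather", True): "FeatherScene",
        ("ConsentFeather", False): "EndScene",
        ("ConsentCrowd", True): "CrowdControlScene",
        ("ConsentCrowd", False): "EndScene",
    }
    # Anything unrecognised falls through to the end of the experience
    return flow.get((current, consented), "EndScene")

if __name__ == "__main__":
    print(next_scene("ConsentKnife", True))    # KnifeScene
    print(next_scene("ConsentCrowd", False))   # EndScene
```

Modelling the flow as an explicit table made it easier for us to reason about where a “No” should lead: declining one experience can route to a gentler one rather than ending the session.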

<Consent Scene (Button Transition)>

-Create UI > Button
-Right-click in the Assets folder > Create > C# Script, and name it menuStart.cs:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.SceneManagement;

public class menuStart : MonoBehaviour {
    // Called from the Button's On Click() event with the target scene name.
    // (Application.LoadLevel is deprecated; SceneManager.LoadScene replaces it.)
    public void ChangeScene(string sceneName) {
        SceneManager.LoadScene(sceneName);
    }
}
```

-Add menuStart.cs to the Main Camera in the Inspector
-In the Inspector for the Button, drag the Main Camera from the Hierarchy to On Click(), select menuStart > ChangeScene, and type the name of the Scene to transition to

-For the Scene to transition to: File > Build Settings > Add Open Scenes
-Go back to the Scene to transition from and hit Run

<Knife Scene (Count Transition)>

-Import SpacebrewClient.unitypackage
-Drag SpacebrewObject from Assets to the Hierarchy
-Update SpacebrewEvents.cs to the one below:

```csharp
using UnityEngine;
using System.Collections;

public class SpacebrewEvents : MonoBehaviour {

    SpacebrewClient sbClient;
    public GameObject animationObject;

    // Use this for initialization
    void Start () {
        GameObject go = GameObject.Find ("SpacebrewObject"); // the name of your client object
        sbClient = go.GetComponent <SpacebrewClient> ();

        // register an event with the client and a callback function here.
        // COMMON GOTCHA: THIS MUST MATCH THE NAME VALUE YOU TYPED IN THE EDITOR!!
        sbClient.addEventListener (this.gameObject, "mystring");
        sbClient.addEventListener (this.gameObject, "HandInteraction");
    }

    public void SendMSGRight() {
        sbClient.sendMessage("RightCollision", "boolean", "true");
    }

    public void SendMSGLeft() {
        sbClient.sendMessage("LeftCollision", "boolean", "true");
    }

    public void SendMSGCenter() {
        sbClient.sendMessage("CenterCollision", "boolean", "true");
    }

    // Update is called once per frame
    void Update () {
        if (Input.GetKeyDown ("space")) {
            print ("Sending Spacebrew Message");
            // name, type, value
            // COMMON GOTCHA: THIS MUST MATCH THE NAME VALUE YOU TYPED IN THE EDITOR!!
            sbClient.sendMessage("mybool", "boolean", "true");
        }
    }

    public void OnSpacebrewEvent(SpacebrewClient.SpacebrewMessage _msg) {
        print ("Received Spacebrew Message");
        print (_msg.value);
    }
}
```

-Set the Publishers as below, and the Server Address
-Drag Knife to AnimObject in SpacebrewEvents

-Import Knife_01.obj, hand.obj, Plume.3ds (Assets > Import New Assets)
-Select the objects and drag them to the Hierarchy
-Position and scale them as you like via Transform in the Inspector
-Select the knife object in the Hierarchy, then Window > Animation > Animation
-Create > name it Knife Animation > Add Property > Transform > Position (press +)
-Select frames in the timeline and set keyframes to change Position.y
-In the Inspector for the Knife object, Add Component > name it “KnifeMsg” > replace the script with the one below:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class KnifeMsg : MonoBehaviour {

    // Spacebrew
    SpacebrewEvents sbEvents;

    public GameObject cloneScript;
    public GameObject BloodAnimation;
    public GameObject SpacebrewScriptRightCollision;

    // Use this for initialization
    void Start() {
        GameObject go = GameObject.Find("SpacebrewObject");
        sbEvents = go.GetComponent<SpacebrewEvents>();
    }

    // Called from animation events to stop/start the blood particle system
    void StopBloodAnimation() {
        ParticleSystem _CachedSystem = BloodAnimation.GetComponent<ParticleSystem>();
        _CachedSystem.Stop();
        Debug.Log("Blood stopped");
    }

    void StartBloodAnimation() {
        ParticleSystem _CachedSystem = BloodAnimation.GetComponent<ParticleSystem>();
        _CachedSystem.Play();
        Debug.Log("Blood started");
    }

    // Called from animation events to fire the Spacebrew signals
    void SpacebrewSendCollisionCenter() {
        SpacebrewScriptRightCollision.GetComponent<SpacebrewEvents>().SendMSGCenter();
        Debug.Log("CENTER Activated");
    }

    void SpacebrewSendCollisionLeft() {
        SpacebrewScriptRightCollision.GetComponent<SpacebrewEvents>().SendMSGLeft();
        Debug.Log("LEFT Activated");
    }

    void SpacebrewSendCollisionRight() {
        SpacebrewScriptRightCollision.GetComponent<SpacebrewEvents>().SendMSGRight();
        Debug.Log("RIGHT Activated");
    }

    void OnCollisionEnter(Collision collision) {
        // Debug.Log("collision started");
    }

    void OnCollisionStay(Collision collision) {
        // Debug.Log("Stay occurring...");
    }

    void OnCollisionExit(Collision collision) {
        // Debug.Log("Exit called CENTER...");
        // if (collision.gameObject == exitCollisionObject) {
        //     cloneScript.GetComponent().makeObject();
        //     SpacebrewScriptRightCollision.GetComponent().SendMSGCenter();
        // }
    }

    // Update is called once per frame
    void Update() {
    }
}
```

-Drag SpacebrewObject from Hierarchy to SpacebrewScriptRight

-Assets > Import Custom Package > Sphere.unitypackage > drag Sphere into the Hierarchy
-In the Knife’s Animation view, move to the keyframe that should fire the Spacebrew signal > Add Event (image below) > in the Inspector, change Function to one of the Collision functions

-At frame 0, Add Event > Function: StopBloodAnimation(); at the frame where the blood animation should start, add another Event and set Function to StartBloodAnimation()

-In the Hierarchy, right-click > Create > Create Empty > rename it Scene Manager
-Add Component and replace the script with the one below, with the file name “SceneTransitions2.cs”:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneTransitions2 : MonoBehaviour {

    public Animator transitionAnim;
    public string sceneName;

    bool loading = false;

    void Update() {
        // Start the timed transition once; the guard prevents spawning a
        // new coroutine every frame. Swap the guard for the commented
        // KeyDown check to trigger the transition manually instead.
        // if (Input.GetKeyDown(KeyCode.Space))
        if (!loading) {
            loading = true;
            StartCoroutine(LoadScene());
        }
    }

    IEnumerator LoadScene() {
        // transitionAnim.SetTrigger("knifeGoDown");
        yield return new WaitForSeconds(10f);
        SceneManager.LoadScene(sceneName);
    }
}
```

-Drag Knife from Hierarchy to TransitionAnim

-Type the name of the Scene to transition to at Scene Name

Physical Components

We used 3 Adafruit solenoids as the key components for the physical movements in our experience. They were connected to a combined Stepper & DC Motor HAT on a Raspberry Pi. The 3 solenoids were used for the stabbing and tickling motions, plus sound and haptic effects. We added a furry pipe cleaner to the tickling solenoid to make its movement feel more like a tickling feather.
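A solenoid on the DC Motor HAT can be fired by briefly driving its channel like a DC motor, then releasing it. The sketch below separates the pulse logic from the hardware so it can be run without a HAT; the `setSpeed()`/`run()` call style and the FORWARD/RELEASE constant values (1 and 4) are assumptions based on the legacy Adafruit_MotorHAT Python library, and the timing will need tuning for your solenoid and wiring.

```python
import time

def pulse(motor, duration=0.05, speed=255, forward=1, release=4):
    """Energize a solenoid channel briefly, then release it.

    `motor` is any object exposing setSpeed()/run() in the style of the
    legacy Adafruit_MotorHAT DC motor API; `forward`/`release` default to
    that library's FORWARD/RELEASE constant values (an assumption).
    """
    motor.setSpeed(speed)
    motor.run(forward)       # energize the coil (solenoid extends)
    time.sleep(duration)
    motor.run(release)       # de-energize (solenoid retracts)

class FakeMotor:
    """Stand-in for the motor object, so the logic is testable off-Pi."""
    def __init__(self):
        self.calls = []
    def setSpeed(self, s):
        self.calls.append(("speed", s))
    def run(self, mode):
        self.calls.append(("run", mode))

if __name__ == "__main__":
    m = FakeMotor()
    pulse(m, duration=0.0)
    print(m.calls)  # [('speed', 255), ('run', 1), ('run', 4)]
```

On the actual Pi you would pass in a real motor channel from the Adafruit library instead of `FakeMotor`; keeping the pulse duration short matters, since a solenoid coil left energized heats up quickly.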

Connecting Visual & Physical

We used Spacebrew to connect the Unity animation with the Raspberry Pi controlling the HAT and solenoids. Spacebrew allowed them to send signals to each other, completing the loop below:

Unity files ---> Spacebrew ---> Raspberry Pi

Our Spacebrew was run locally from the laptop running Unity; the Raspberry Pi was then connected to Spacebrew as a publisher and subscriber. We further used two other laptops for the “crowd control” part of the experience.
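On the Pi side, the incoming boolean messages have to be mapped to solenoid actions. A minimal sketch of that dispatch logic, kept separate from any Spacebrew client so it is testable; the channel names match the publishers in SpacebrewEvents.cs, but the callbacks here are placeholders, and in the real setup this handler would be registered with a Spacebrew Python client rather than called directly:

```python
# Sketch of Pi-side dispatch: route incoming Spacebrew boolean messages
# to solenoid actions. The channel names mirror the Unity publishers
# (RightCollision, LeftCollision, CenterCollision); the action callbacks
# are placeholders for the actual solenoid pulses.

def make_dispatcher(actions):
    """Return a handler(channel, value) that fires the matching action
    whenever a true boolean arrives on a known channel."""
    def handler(channel, value):
        # Spacebrew delivers booleans as strings ("true"/"false")
        if value in (True, "true") and channel in actions:
            actions[channel]()
            return channel
        return None
    return handler

if __name__ == "__main__":
    fired = []
    handler = make_dispatcher({
        "RightCollision":  lambda: fired.append("right"),
        "LeftCollision":   lambda: fired.append("left"),
        "CenterCollision": lambda: fired.append("center"),
    })
    handler("CenterCollision", "true")
    handler("RightCollision", "false")
    print(fired)  # ['center']
```

Keeping the dispatch table in one place made it easy to re-point a channel at a different solenoid (e.g. swapping the knife and feather) without touching the Unity side.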

Challenges

The physical wiring on the Pi, the virtual elements in Unity, and the Spacebrew connections were relatively straightforward. Combining all the files (especially the different Unity animations) from different laptops was very challenging and time-consuming: we recommend budgeting more time and starting the merge early.

Tips:

  • Don’t copy assets between Unity projects; always use Export/Import Package if you have to work across laptops. Ideally, use just one laptop from the start.
  • Start testing with the physical setup early: it lets you tune a more accurate hand position for a better experience.
  • Unplug the Pi, and only connect it just before your experiment starts. If it stays connected to Spacebrew for a long time, the Pi sometimes goes offline unpredictably.

Finally, we completed the whole setup on Thursday and Friday of the week.

Key Learnings

  • Consent can be used to reduce vulnerability, but it can also increase it
  • Vulnerability deepens a participant’s immersion in an experience
  • Physical environmental factors (e.g. the black “wall”, a hand position that forces the body into an awkward posture, the knife sound and haptics) play an important part in creating an immersive experience
  • Negative feelings (e.g. being “stabbed” or controlled by the crowd) stood out more for participants than neutral/positive feelings (e.g. being “tickled”)