Virtual Reality, Real Virtues, and Augmented Norms and Laws


Welcome, everybody, and thank you for joining us for this event, which is part of Santa Clara University's IT Ethics and Law Lecture Series, co-sponsored by the Markkula Center for Applied Ethics and the High Tech Law Institute. My name is Irina Raicu and I'm the director of the Internet Ethics program at the Markkula Center for Applied Ethics. Before introducing our speakers, I just want to remind you that you can add your questions at any point in the Q&A section and we will try to get to as many of them as possible. We are recording this event and plan to make it available soon.

I also want to take a minute to acknowledge that we are holding this conversation while massive battles and humanitarian disasters are taking place in the real, unaugmented world. The invasion of Ukraine might prompt us to consider the ways in which VR and AR will, or already do, play a role in warfare and military training, as well as in efforts to help people move beyond traumas or to help them understand the experiences of others living lives very different from their own.

Our speakers today are philosopher Erick Ramirez and attorney Brittan Heller. Brittan Heller is an attorney who specializes in advising companies on issues such as privacy, freedom of expression, content moderation, civic engagement, cyber hate and hate speech, and online extremism. She was the founding director of the Center on Technology and Society at the Anti-Defamation League and has collaborated with major online platforms and gaming companies to combat cyber hate. She has also produced and launched new technology for good in mediums including AR, VR, and XR. Heller previously worked for the International Criminal Court and the U.S. Department of Justice's Criminal Division, prosecuting grave human rights violations.

Erick Ramirez is an associate professor in Santa Clara University's philosophy department and the author of the book "The Ethics of Virtual and Augmented Reality: Building Worlds," published in 2021. He is interested in all aspects of moral psychology, and for the past several years his research has centered on interdisciplinary issues involving the ethics of developing and using virtual reality technologies. He is especially interested in the ethics of using VR for experiments, empathy enhancement, and behavioral modification, and has developed virtual reality modules of classic thought experiments. I'm going to ask Erick to take it away for us.

Thanks for that introduction, Irina. I'll share my screen with some slides I'll use for structure, and I'll stop once I'm done. What I'm going to do today is start with a really brief introduction to what extended reality is, then focus most of what I have to say on what I think are pressing, near-term issues we need to deal with as this technology is being developed and commercialized. As Meta is spending hundreds of millions, if not billions, of dollars to lay down the infrastructure for the future metaverse, there are some things we need to talk about now. Then I'll end with what I think are longer-term questions about the effects, and maybe the social questions, that these technologies are going to force us to answer.

To begin with: what is extended reality? For me, extended reality is a large family of technologies, most of them current but not all, and what they all share in common is that they are different ways of overlaying immersive content onto your experience. It's the immersive part that makes these technologies different from the screen you're using to watch this webinar, which is also overlaying content onto your experience. Going as far back as the late '50s, there were things like this mining tool called the MASCOT. The way it worked, a camera on the machine fed visuals to the user, and the user controlled the robot with two hand devices. It was things like the MASCOT that first got psychologists interested in the concept we now call presence: this weird experience users would have of being somewhere they're not. People controlling the MASCOT would feel like they were wherever the MASCOT was, as opposed to sitting in a room. As these kinds of technologies got developed, they started taking advantage of some really interesting opportunities, like combining modalities — the Sensorama combined a visual stimulus with an olfactory one. By the time we get to the '80s, we get something that looks quite contemporary in terms of head-mounted displays and haptic gloves as control interfaces; NASA used these specifically for training astronauts. So when we talk about virtual reality, or extended reality, there are lots of ways of describing it, and for me the only real difference is how much of our experience is being replaced with something simulated. CAVE-style VR systems, where you enter a room and everything is a digital projection, and traditional head-mounted-display VR have in common that they replace almost everything with a simulated experience, whereas something like Pokémon GO, which is an augmented reality game, replaces only a small part of your visual field with simulated content. The Microsoft HoloLens can do a lot more than that. So really, the questions I'm asking are about the issues these kinds of technologies are forcing us to deal with, both now and into the future.

A lot of my research has focused on the psychology of simulated experience, and I think what we learn from it is that, under the right conditions, simulated experiences can feel like real experiences, and that adds an obvious ethical dimension to the development and use of these simulations. What I'm showing you on the left is from the psychologist Stanley Milgram, who did some really important but infamous experiments in the 1960s about authority and obedience that involved asking people to shock subjects. If you have heard of these, you know you can't replicate Milgram's experiments today, because they were found to be unethical: the risks and subjective trauma they imposed on subjects were seen as not being outweighed by any benefits subjects got out of participating. In fact, it was experiments like these that led to the creation of a whole set of protections and guidelines on human subjects research — what we call institutional review boards — that now have to approve all such research. What I find interesting is that when Milgram's experiment was replicated in virtual reality, we saw a very similar kind of subjective experience on the part of subjects: they were experiencing anxiety and subjective trauma in ways that actually surprised the experimenters; Mel Slater himself did not expect that result. So one thing we learn from the psychology of virtual experience is that we need much stronger protections around how subjects can respond to simulated content. I do think the immersive nature of virtual reality makes it very different from the exact same content experienced on a flat screen, and we've already had examples of consumer corporations accidentally producing traumatizing content in VR because they weren't familiar with these design questions. We need much stronger protections at the level of institutional review boards: this experiment went forward — somebody approved it — because simulated experiences weren't understood as being harmful or risky enough to prevent the study, and I think that was a mistake.

I also think we need to get better at acknowledging the limitations of virtual and extended reality experiences. Everything I'm showing you right now is a different simulation, all of them aiming to give you a kind of experience: the experience of being a cow, the experience of being pregnant, the experience of anti-Black racism — this one puts you into the body of a Black man to experience racism — and this is a simulation created by director Alejandro Iñárritu that is meant to give you the experience of migrating into the U.S. without documents. I think we have really good reasons for thinking that virtual reality can't actually do this — it can't give you these kinds of experiences — and that if we use this technology to make people think they are having these experiences, we're doing them a disservice. It's a form of, I think, unethical manipulation. So we need to get better at acknowledging what the technology is really good for, but also what it can't do.

The other thing I want to mention briefly — because, Brittan, I know you're going to talk more about data privacy and issues around biometric data — is what these devices are and what kinds of data they can collect. If you look at an HMD like this, you'll notice immediately that it has a lot of cameras on it. These cameras can record not just the room you're in and who might be in it, but also other kinds of intimate information about you: not just where you happen to be sitting or how tall you are, but also where you're looking. Some headsets are equipped with eye-tracking technology that can be used to figure out what you're looking at within a simulation. The HMDs at our own lab at Santa Clara can measure not only where you're looking but also the diameter of your pupils, to guess at how much attention you're giving to the thing you're looking at — all things you might think are extremely valuable in an attention economy, but also really intimate forms of user biometric data. Ostensibly these are used for things like new interfaces — I can track your hands in VR with the external cameras so you don't need clunky handsets — but again, that's an intimate kind of data we need to get better at protecting. Meta, then Facebook, was already experimenting with using these cameras to read and track facial expressions, to carry the expression over onto digital avatars. Here too — and Brittan will say much more about this — we need to be not only more knowledgeable about what kinds of biometric data these devices can collect, but also better at protecting user privacy while trying to make use of the things they are genuinely good for.

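(As a concrete aside on the kind of inference Erick describes here, below is a minimal sketch of how gaze and pupil-diameter telemetry could be turned into a rough per-object "attention" score. The field names, sampling rate, and baseline are hypothetical illustrations, not any headset vendor's actual SDK or method.)

```python
from collections import defaultdict
from statistics import mean

# Hypothetical eye-tracking samples an HMD might emit many times per second.
# Field names and values are illustrative only, not a real SDK's schema.
samples = [
    {"t": 0.00, "gaze_target": "red_mclaren", "pupil_mm": 3.1},
    {"t": 0.02, "gaze_target": "red_mclaren", "pupil_mm": 3.4},
    {"t": 0.04, "gaze_target": "red_mclaren", "pupil_mm": 3.5},
    {"t": 0.06, "gaze_target": "billboard_ad", "pupil_mm": 2.9},
    {"t": 0.08, "gaze_target": "billboard_ad", "pupil_mm": 2.8},
]

def attention_scores(samples, baseline_pupil_mm=3.0):
    """Crude per-object attention proxy: dwell time weighted by average pupil
    dilation relative to a resting baseline. A real system would be far more
    sophisticated, but the raw ingredients are the ones described above."""
    dwell = defaultdict(float)   # seconds spent gazing at each object
    pupils = defaultdict(list)   # pupil diameters recorded while gazing at it
    for prev, cur in zip(samples, samples[1:]):
        dwell[prev["gaze_target"]] += cur["t"] - prev["t"]
        pupils[prev["gaze_target"]].append(prev["pupil_mm"])
    return {obj: dwell[obj] * (mean(pupils[obj]) / baseline_pupil_mm)
            for obj in dwell}

print(attention_scores(samples))
# -> roughly {'red_mclaren': 0.066..., 'billboard_ad': 0.019...}
```
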
Speaking very quickly about long-term challenges: the long-term challenge we're going to face more than any other is thinking about the self, and in particular what I see as a conflict between the way we normally think about the self now — as embodied in a physical way — and how we might be embodied in an augmented reality or metaverse environment. We're already dealing with this problem in a low-level way when we look at how people respond to AR filters on things like Snapchat and TikTok, which are a form of augmented reality embodiment, and we're already seeing specific forms of body dysmorphic disorder arising from them, sometimes called Snapchat dysmorphia: people experiencing a mismatch between their augmented reality self and their physical self. And when we look at the options for embodiment that the metaverse is going to allow — this is just an image from Meta's Horizon metaverse space, and all of these are people in the metaverse — look at the vast array of embodiment options they have. There are going to be some interesting conflicts between augmented and physical embodiment. This is from a study done five years ago on avatar customization, and this user makes exactly the point I think we need to be more interested in: in real life you're stuck with what you're born with, but in VR you can be what you truly feel like you are inside. Notice the privileging of augmented reality embodiment over physical embodiment. I think this conflict is going to be really important and will lead to lots of questions about who should have control over how people think about their embodiment in AR, and how we keep track of people in AR given the forms of embodiment that are possible. Questions about AR overlays with respect to property rights — who can overlay information over my house, or over public spaces — are things we don't have much of a framework for right now, and we're going to need to address them as these things become less and less strange. When the metaverse becomes less of an odd thing and more like what we mean when we talk about the internet, we need to have answers to these questions already in place. I look forward to engaging with all of you. That's the end of what I wanted to say; I just want to acknowledge all the people who have worked on this — thanks to the Markkula Center, Irina, Brittan, everybody — and I'll stop sharing.

Before we jump to Brittan's portion, I just want one point of clarification: can you talk for a couple of seconds about what you mean when you say "keep track of people"?

Yeah — social embodiment. The reason I put that as a long-term question is that I'm imagining the metaverse becomes, in a way, like the internet, meaning it's a space we all have access to at any given point, either through a specific device that links us to it or, if it's an augmented reality overlay, through something we're wearing basically at all times, because all the interesting things about social and political life are there — in the way that it's hard now to have social and political life without being on the internet. There's a lot of cool evolutionary psychology about how, when we booted up this meeting, I recognized that it was Brittan I was looking at, or how I recognize that it's you, Irina, and all of that presumes a really tight link between the self and the body. When we talk about augmented reality embodiment, all of those old psychological heuristics we use to track people are gone, because you can look and sound any way you want. I think we need some way of keeping track of users across spaces, ideally using a system that doesn't require a kind of corporate or state-based surveillance. Those are longer-term questions we don't have great frameworks for right now. We're exploring — and Irina, I think you've heard me talk about this before — all sorts of weird answers, like turning identity into an NFT, or turning identity into some kind of correlated pattern of anonymous activity we can track across spaces, but it's tough to think about good ways of doing that that don't require something like a database of NFT keys assigned to each person.

So that brings us very much to questions of privacy law and biometrics law and a lot of things that I know Brittan has been writing and talking about for a long time now. Brittan, please go ahead.

Thanks, Erick. I just wanted to say that in VR I'm a flying toaster — so shout out to all you Windows 95 fans out there. And shout out to the gamers and the NASCAR fans: let's imagine that you and I are playing a racing game in VR. I see this red McLaren and I get really, really excited. My heart rate speeds up, my skin gets a little bit moist, and my pupils dilate — I really, really like this car. Later on, while I'm still in VR, I start seeing red cars that remind me of the race car, driven by someone who looks a little bit like me. I start receiving ads for auto insurance in my social media feed; I get targeted ads about why now is a great time to get an auto loan. The type of information my body gave off when I experienced pleasure in looking at that car is traceable by the current types of sensors we have in AR and VR, and while it used to be available only in a lab, it's quickly becoming available commercially. Someone once described a head-mounted device to me as a polygraph with six cameras. The information you emit when you're in virtual reality becomes a kind of digital exhaust, and that information, gathered by sensors and combined with biological and anatomical unique identifiers, gave that video game company access to potentially intimate information. Through pupillometry you can actually tell things about someone like who they're sexually attracted to, whether they're telling the truth, and whether they're likely to develop medical ailments like schizophrenia, Parkinson's, autism, or ADHD — and it picks up pre-clinical signs, things people may not even be aware they have a proclivity to develop. While this sounds like science fiction, it's closer to present reality than virtual reality. In 2021, Facebook's Oculus announced that it would start putting advertisements in VR; within five days the developer of the pilot game, Blaston, pulled out of the initiative. That move was seen as a turning point for the industry, bringing one of social media's most controversial features into a new medium that inspires both idealism and alarm. Today I'm going to make some points directed at lawyers, companies, and legislators about what I think we need to know at this tipping point in the technology.

Number one: this is not social media. There has been nothing like this before. I'm a human rights lawyer who focuses on technology, and I'm very worried about the present inability of law and regulation to grapple with these hardware- and software-based challenges. This is because, as Erick alluded to, what happens in an immersive environment feels real — it's actually processed in your hippocampus in the same way that memories are processed. So don't think about virtual reality the way you think about scrolling through a Facebook or Twitter feed; think about it as inviting someone into your home, having them sit next to you on the couch, and engaging with them one-on-one instead of just reading their words across a screen. Because of this, I argue that we should have a higher duty of care for this technology, with greater awareness of issues related to consent, privacy, and human rights. Some people have begun to call this "neurorights" or mental privacy.

Two: VR is different hardware, because of the way immersive sensors work. As widespread adoption of AR and VR becomes more and more imminent, so does the potential for massive data collection at scale. Even more so than your smartphone, VR captures a wealth of information about you. Think about what Erick described — what is needed to orient you in a digital space. The sensors capture your precise head and hand motions; they take pictures of your surroundings through tracking cameras; microphone audio is picked up through voice command systems; and eye tracking determines what you're focusing on and how intently. Jeremy Bailenson of Stanford recently produced research showing that with 30 minutes of VR data you could uniquely identify an individual. Conceptions of personally identifiable information in VR and AR look completely different from what legislators have previously thought about: you can uniquely identify somebody by the tilt of their head and the way they point — so no disco dancing in VR.

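(As an illustrative aside: the kind of finding Heller cites — that motion traces can pick a user out — can be sketched in a few lines. This is a toy nearest-match classifier over made-up head-height and hand-speed features, not Bailenson's group's actual method or data.)

```python
import math

def features(session):
    """Summarize a session of (head_height_m, hand_speed_m_s) samples
    into a small feature vector. Purely illustrative."""
    heights = [h for h, _ in session]
    speeds = [s for _, s in session]
    n = len(session)
    return (sum(heights) / n, max(heights), sum(speeds) / n, max(speeds))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(new_session, enrolled):
    """enrolled: dict of user name -> feature vector from an earlier session.
    Returns the enrolled user whose motion profile is closest."""
    feat = features(new_session)
    return min(enrolled, key=lambda user: distance(enrolled[user], feat))

# Hypothetical enrollment data (entirely made up).
enrolled = {
    "alice": features([(1.55, 0.4), (1.56, 0.9), (1.54, 0.6)]),
    "bob":   features([(1.80, 0.2), (1.79, 0.3), (1.81, 0.5)]),
}
print(identify([(1.55, 0.7), (1.56, 0.5)], enrolled))  # -> "alice"
```
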
Future headsets are going to offer more intimate details, like eye tracking, which will provide incredibly precise metrics about what captures your attention in a VR space — and you're already seeing this start to be monetized in web-based applications. MoviePass recently relaunched and announced that users could watch ads in exchange for micro-credits, but the ads would pause if the eye-scanning algorithm determined that a user wasn't paying attention to them. Immersive technology is unique in that it not only tracks your reaction to stimuli — something these sensors need to do in order to function — but also creates a record of the stimuli themselves, and that is very valuable information to advertisers and third parties, especially when there's not a clear route to monetization for the industry. Which leads me to number three: existing biometrics law won't protect us.

Many people, even fellow lawyers, are surprised to learn that biometrics law may not cover these kinds of risks. Recent lawsuits, like one filed by the state of Texas, claim that Meta violated user privacy by using a facial recognition algorithm. That makes sense because biometrics law is centered around the concept of your identity — but many VR users log into their (now Meta) Quest with their Facebook social media accounts. The company has gone back and forth three times at this point about whether you can use your Facebook ID to log into your Oculus, and there are different terms of service governing each of these regimes: the legacy users, the people who sign up for a new account, and the people who can basically use either. They're trying to standardize this now, but regardless of which regime you use, you have to have a verifiable billing address to download immersive content. This is like the early days of the internet: your identity is not necessarily what's at issue here. To me, it's your thoughts and your preferences — it's your privacy. XR devices take biometric data and make it about personal data collection, and that takes things to a different level by combining existing data streams on people's demonstrated preferences, likes, and dislikes with anatomical data on an ongoing basis. While PII and biometric data, and the risks they entail, are often discussed, the way this could create deeper user profiles is not. Biometric data may in fact be the window into a user's most private thoughts and their involuntary reactions and feelings. To accommodate for this, I've proposed a concept called biometric psychography. It captures the level of intimate knowledge that companies will be able to collect on individuals using a combination of their biometric data and their psychographic data — a term from advertising meaning your likes, your dislikes, and your preferences. Biometric psychography is the behavioral and anatomical information used to identify or measure a person's reaction to stimuli over time, which provides insights into a person's mental, physical, and emotional state, as well as their interests. To summarize it in normal-people speak: it's the like button on steroids. XR headsets will be able to track not only what people pay attention to, but for how long, with what intensity, and what their specific emotional response to a stimulus is — and this can be gleaned through a combination of pupil dilation, micro-expressions, facial muscles, and in some cases galvanic skin response, EEGs, EMGs, and ECGs.

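(To make "biometric psychography" slightly more concrete as data, here is a hypothetical record format pairing a stimulus with the involuntary responses recorded around it and the inference drawn from them. The schema is invented for illustration; no platform's actual data model is implied.)

```python
from dataclasses import dataclass

@dataclass
class PsychographicEvent:
    """One 'like-button-on-steroids' observation: what the user was shown,
    how their body reacted involuntarily, and what was inferred from it.
    Purely illustrative; not any platform's actual schema."""
    stimulus: str                  # e.g. "red_mclaren_showroom"
    dwell_seconds: float           # how long gaze stayed on the stimulus
    pupil_dilation_pct: float      # change vs. a resting baseline
    heart_rate_delta_bpm: float    # from an optional wearable
    skin_conductance_delta: float  # galvanic skin response, if sensed
    inferred_interest: str         # the monetizable conclusion, e.g. "sports cars"

profile = [
    PsychographicEvent("red_mclaren_showroom", 14.2, 18.0, 9.0, 0.7, "sports cars"),
    PsychographicEvent("insurance_billboard", 1.1, -2.0, 0.0, 0.0, "low interest"),
]

# Over time such events compound into a profile of reactions the user never
# consciously chose to disclose — the consent problem discussed above.
strong_interests = {e.inferred_interest for e in profile if e.pupil_dilation_pct > 10}
print(strong_interests)  # {'sports cars'}
```
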
Four: this matters because biometric laws are designed to protect identity, not privacy. So again, the main issues around XR are different — they are consent and privacy. How can you consent when the data collected on users is involuntary? It is your unconscious and uncontrollable biological responses that are going to be transformed into data points, so users no longer actively participate in a data collection process; it's their very reaction to stimuli that becomes the data. You can't control your pupil dilation; you can't control your heart rate; you can't control whether you start to sweat a little when you see things that stress you out. Additionally, there are issues related to bystander consent that have to be grappled with. There is no norm for recording in public, or even for signaling that you want to opt out of being the subject of someone else's recording. The industry has tried to put indicator lights on smart glasses when they are recording, but civil liberties groups have pushed back, saying these lights can be covered up or disabled very easily.

I know I'm running short on time, so I'm going to speed up a little. Number five: I think we need to define industry best practices, and very quickly, there are four things. One, you can press companies about their monetization schemes, and in particular their ads policies. You have to understand that ads here don't look like billboards: they're branded experiences that are entertaining — many of us seek them out, and we even pay for them directly. You go to the Jurassic Park experience to feed a dinosaur, but it's actually an ad to get you to see the next movie — and what better way to do that than to have you feed the dinosaur? It's more persuasive than a billboard. For a human-rights-centered approach, I would press for bodily feedback not to be used for commercial purposes, based on the inability to meaningfully consent. Two, you should look to on-device storage as a best practice for privacy; for many people concerned about hacking, surveillance, or inappropriate oversight, this is the answer you're looking for. It gets complicated because there's limited storage capacity in the hardware right now, and many companies are going to start looking to cloud storage as a backup, which is going to put the privacy demands and the storage demands head-to-head as users demand longer recordings and more features from limited hardware. Three, you need to involve engineers in the discussion. Bluntly, there's only so much memory you can put in a head-mounted device, and there's only so much light you can emit externally before the cameras can't function. Consumers need to understand not only how their device works but why it works the way it does, and then they can ask for better things. And four — we can get into this in the questions a little more — we need to design these devices for all people. New research by Jessica Outlaw of The Extended Mind shows that disabled populations are some of the earliest adopters of XR technology, yet fundamental controls for vision and vantage point weren't integrated into Oculus Quest programming until version 30. That means that if I were in a wheelchair, I wouldn't have been able to have the vantage point of a standing person until last July. It's ridiculous. Non-adjustable interpupillary distances disadvantage women, who on average have smaller (and, I would say, prettier) heads than the average user the HMD was designed for. The distance between your pupils is as important when you're wearing glasses as the lenses themselves, so for many women who didn't have the proportions the original headsets were designed for, it was like putting on the wrong prescription glasses — which is why women reported getting simulation sickness at higher rates than men. An MIT researcher took an Oculus Go to Nigeria and found that the straps snapped half of the time she tried to fit them over African subjects' hairstyles. We can, and we should, do better. So those are the ethical issues that I am thinking about, and I'm happy to discuss them with you all.

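(A small aside on the interpupillary-distance point above: stereo rendering places one virtual camera per eye, separated by whatever IPD the headset assumes, so fixed lenses and a fixed render separation tuned to a larger-than-average IPD effectively hand smaller-headed users the wrong "prescription." The numbers below are rough illustrations, not any specific headset's specification.)

```python
# Illustrative only: how a fixed lens/render IPD mismatches users with smaller IPDs.

def eye_camera_offsets(ipd_mm):
    """Left/right virtual-camera offsets (in metres) from the head centre."""
    half = (ipd_mm / 1000.0) / 2.0
    return (-half, +half)

headset_ipd_mm = 64.0   # a rough default on some early fixed-IPD headsets
user_ipd_mm = 57.0      # closer to many smaller-headed users

print(eye_camera_offsets(headset_ipd_mm))  # separation the headset renders with
print(eye_camera_offsets(user_ipd_mm))     # separation the user's eyes actually have
print(f"mismatch per eye: {(headset_ipd_mm - user_ipd_mm) / 2:.1f} mm")
```
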
I have so many questions, but we're already getting great questions from the audience, and I encourage you to add more. First I should say that if from now on you see me going around campus with a tiara that says "do not consent to being recorded," it's because of what Brittan said — there might be a market for signaling devices coming up. And I'm interested: you said we need to design them for all people, but there's definitely been pushback by some who argue that they should not be used by children. So I'm going to ask Erick to talk a little bit about the use of AR and VR by children and some of the particular issues related to that.

Yeah. On the one hand, it's good to note that even the manufacturers themselves — if you look at the Oculus health and safety handbook — tell you not to use it if you're under 13. And I think there are good reasons for being extra cautious with children, not just because the ACM Code of Ethics says we should be, but because the developmental effects on children are still pretty unknown. One thing we do know about general experience in virtual reality is that if you're in there for, say, 20 or 30 minutes, when you come out of that experience you tend, as an adult, to have much higher dissociation and derealization of your real-life experiences as a result of having spent time in virtual reality. Right now we don't know much about how this will affect certain developmental milestones in children, who are still developing the ability to distinguish what's in their imagination from what's really happening. So I think we should in general be extra cautious about children with any technology, but with this one especially: its psychological effects on adults are known well enough to think it might cause developmental issues in children, so we have to be extra careful.

And to add to that: recent studies I've seen show that there could be some benefits, but there may also be some harm to kids' spatial perception, because the way that worlds are rendered in VR is not exactly the same as they are in meatspace — that's M-E-A-T — so it's not certain whether that might developmentally impede children as well. One of the interesting questions, and Erick can respond to this too, is that people call VR an "empathy machine," but some of the research coming back on that — based on what you said about dissociation — actually calls that into question. Some people were put into an anti-racism experience and came back more affirmed in their prior beliefs, because they felt they now knew what it was like to be someone else and it wasn't that bad. So VR can take us to the point of relating to another person and another experience — up to a point — but it's not certain that all of the claimed benefits will be clinically proven once we know more about the behavioral implications in the long term. Erick, do you want to say a little bit more about that?

Yeah, sure — it's going to sound like we colluded, but we didn't; I agree with what Brittan is saying. Getting plopped into a well-designed immersive experience will change you, and in some ways those changes can be really good; they can be desired, exactly what we hoped would happen. But especially for perspective-taking empathy simulations, they're full of problems. We do know — and this is also one of those Bailenson-lab results — that you can succumb to something they've called the Proteus effect, which just means that when you embody someone else in virtual reality, you start to adopt some of the stereotypical behaviors you associate with that kind of person. So if you're put into Einstein's body in virtual reality, that will trigger certain kinds of stereotype responses you associate with his age, along with other kinds of mannerisms, and it's not clear that's happening consciously — it might just be a result of being embodied in that way. But I don't think that tells us anything about what it was like to be Einstein. So for the kinds of reasons Brittan was mentioning, when we use VR for things like anti-bias work, it has to be really carefully controlled so as not to give people the misinformation that they now know what something is like. I think there are some ways of using virtual reality for anti-bias interventions that don't have that problem, but that kind of empathy route is a bad route — though "empathy" is a loaded word in lots of ways, so we can unpack it if we need to.

So, a question from the audience, I think for both of you but maybe more directly for Erick: do you see the ability to choose your embodiment as potentially self-revelatory in ways we have not contemplated, and does that create a new category of personal information?

Yes — and I think we don't know exactly what that will mean yet. This is something I'm thinking about right now. When we think about all the freedoms we have in terms of body modification in this non-augmented space, we have a lot of freedom to modify the body — you've seen images of people who take it about as far as it can go — but we're always limited biologically by what we can do to the body. In augmented reality, or the metaverse anyway, there are just no limits on your form of embodiment. You could be that giant robot; you couldn't do that in this space, meatspace as it were. You could become a dragon; I could look just like you if I wanted to. The fact that there are no limits is one of the many challenges we face with respect to certain forms of privacy, and this question of deep-faking somebody else's physical appearance in augmented reality is something we're not even exactly sure how to regulate in a two-dimensional sense right now, so in an immersive context I'm not sure how to handle it. But yes, these choices will be self-revelatory, and I think transformative. As I tried to show, we already see people investing more in their augmented reality forms of embodiment than in their physical ones, and I don't know that we have a good social or moral framework for thinking through those things yet — but we're going to have to.

So, Brittan, you are revealing something about yourself by making yourself a flying toaster.

I am — I am the brave little toaster. But what I think is interesting is how — I don't want to say "fungible," because it brings up PTSD about NFTs — the concept of identity, when you're considering all of these immersive technologies, is one that we're really going to have to grapple with. When I go to a restaurant in my daily life, there are parts of my identity that are necessary for the transaction and other parts that aren't: my race or my gender or my religion aren't as relevant when I'm trying to buy a hamburger, but making sure that my credit card is attached to my identity, which is attached to my address — that does take precedence. So part of me has started to think about identity in AR and VR as being almost like a closet of different skins that you can put on for different reasons. In the same way, when I go to the doctor they don't need to know what sports teams I root for. There are different types of information we foreground for different types of person-to-person and commercial interactions in meatspace, and I don't see why that can't be replicated in a virtual space, especially if we have control over visual representation and even the laws of physics. My first time in VR was magical — I could fly. I remember the early experiences being transformative and really visceral. There was an experience where you jump off a building, which is something you really shouldn't do in real life, and it really felt like falling off a building. So when people say "this is gaming, it's just pixels, you should just turn it off when somebody's harassing you or when something uncomfortable happens" — it's not. It's playing with this alchemy of your somatic self as well as your identity as well as your intellect. It's not just pixels on a screen.

That brings us to another good question from the audience, who says: I am concerned about virtual sensory inputs that trigger innate reflexes, and the possibility of changing the user's emotional state either accidentally or deliberately. For example, an object moving fast from your peripheral to your central visual field will trigger fear of a collision, not necessarily consciously. How do you see that potential? Because it's so visceral, and because it's designed by people who understand these things more than the users do, do you see a potential for manipulation, intentional or accidental?

Erick, do you want to take this one first? I can jump in after.

Sure. The short answer is yes, but I think what matters here is the intention behind the use of this kind of stimulus. For me — and I know you both know this as well — one of the resounding successes of virtual reality has been therapeutic uses, in particular things like virtual reality exposure therapy. These and other therapy-related uses of VR rely on exactly these kinds of effects: knowing that you can trigger certain responses in people that they might need to work on managing and controlling therapeutically can be really helpful. At least the meta-analyses suggest it's almost as good as traditional exposure therapy, and definitely better than imaginal exposure therapy. So there's something there: you can harness the ability to trigger involuntary responses for good. But you can also do this to manipulate people — nudge-based manipulation to get them to prefer something on a shelf, say. If you've got an augmented reality layer in a store that can make certain objects more likely to be attended to than others, then we might be manipulating in ways that would at least require more justification. So it depends. I'm not a Kantian all the way down, but for me intent really matters — what are we using the technology to do? — and absolutely, you can trigger innate, reflexive behavior.

Yeah — and I'm going to connect this question, if you don't mind, with another question from the audience about whether we're going to see traditional online harms, like disinformation and hate speech, manifesting in VR. The answer is absolutely, and I think that's predicated on the question you just raised: this technology is very persuasive. There's evidence that it's more cognitively persuasive than reading, than being taught, even one-on-one. So we do need to understand how it works and — now I'm getting on my soapbox — create terms of service for platforms that recognize the differences between this and social media. I can give you a couple of examples that I think are really profound. One: how do you translate a spam policy into AR and VR? I've been doodling on that for a few days; I think I came up with an answer, but I'm not going to tell you it. I want you to think about what spam looks like in a spatial-computing, immersive environment, and then how you create behavioral and systemic interventions to stop it. Two: there's a lot of debate about whether presence — that feeling of really being there — is increased by photorealism in VR. There are some very high-end, super cool devices, like the Varjo headset, that look almost like optically replicated pass-through VR; they make you really feel like you're driving that McLaren from your dining room table, like you're in Monaco. It's awesome. But the research says the most effective treatments for PTSD are ones that are actually not photorealistic; they're representational. People coming back from Afghanistan and Iraq reported that night raids were some of the most stressful experiences they had: breaking into a building and not knowing who was going to be there, whether they would be a threat, whether they themselves would be harmed, or whether bystanders or fellow military personnel would be. That experience is what's replicated in PTSD treatments, and it's kept intentionally vague and shaded — and what happens is that the brain fills in the gaps. If the brain fills in the gaps, and it feels real to you because you see it as real based on your own experience, how does that translate to people using weapons in VR? Is a photorealistic gun more poignant than a Roger Rabbit-style gun? I would argue, based on the research, that glorification-of-violence policies like the ones you see in social media shouldn't turn on whether something actually looks like a real gun; they should respond to the act of violence the way your body will, the way your mind will, and not to whether it looks like a picture of a weapon.

I think you two have already touched on this, but maybe you have others: what would you say are the most positive uses of this brand-new technology that you've seen? Where does it seem to work better than other tools we had before? You mentioned the therapeutic ones — are there others we should be thinking about?

submarine   to the bottom of the ocean or cutting  into somebody as a surgeon or um gosh it's   or experiencing black friday sales with a mob of  people trying to get the latest toy um allowing   people to kind of practice and get their reflexes  and experience without having to put themselves   or others in harm's way that i think is great  um i think artistic representation is really   fascinating fabulous um i also think other medical  uh interventions where one experience i did um before the um for science and art too i put  my hundred year old grandmother in a headset   and basically put her in into the blue  and to the bottom of the sea and she   said that it was one of the most magical  experiences of her life and she just sort of enlivened in a way i hadn't seen probably 20  years i didn't experience that was actually a   went you went into this this  little world it was called cool   and you you figure out that you can shoot from  your fingers rainbow trout at river otters and   they're animated so it's like you're in a nintendo  type game and you shoot the fish at the ritter out   rib rotters and if you hit and if if you hit the  river otter with the fish they turn rainbow colors   and you float down the river you feel like you're  floating you go through a cave you go through a   sakura bloom shower you go through all right all  right it sounds like you could go for a long time but okay but eventually i really really enjoyed  it yeah the punchline is is it's pain mitigation   software and it was clinically proven to last  twice as long as opioids because when you are   in virtue when you are in happy otter land you are  not in your somatic self experiencing pain so they   gave it to people and asked them to think  back to that experience and how their pain   was treated more effectively than drugs oh now i  feel bad that i stopped you so that's all right   it's all right i just get excited about happy  outer land so i appreciate you keeping me on track   eric are there others i mean those are pretty  compelling examples i i also put my parents   through the blue this this last week oh you did um  to to similar actually the whale one was scary but   that makes sense um but yeah no i i in general  actually not just in general i agree with uh   britain i think for me the the therapeutic angles  are are just it's you know in a way it shouldn't   be surprising i guess if you think about the  fact that if this is really just about talking   about giving people experiences they're going  to treat as if they were real then it of course   it's going to have similar kinds of responses  as real experiences but it's fascinating and   and to me awesome to see it actually working that  way um you know the the this they've even done   um the the the standard treatment for things like  phantom limb pain right the mirror box uh style   thing you can do in vr and get results that work  therapeutically i think there's there's a lot of   good uses of vr for that kind of thing and the  fact that you can do it at scale right you don't   need to hire all of these um people to come and  put on the production because it just exists and   can be delivered to anybody with the hardware  anyway um and i i also just think the aesthetic   the aesthetic options are they might not be the  you know the the the most i don't know what you   want to call them they they're not to me like  saying oh this person now literally feels better   from phantom limb pain but i think it's it's  an untapped realm of expression 
I also just think the aesthetic options — they might not be as dramatic as saying this person now literally feels better from phantom limb pain, but I think they're an untapped realm of expression that we're just developing a language for. Just as it took some time to develop techniques and a language for how to put film stories together, people are just learning how to use a new language for artistic expression in virtual reality, and I'm excited to see what that means. I think an augmented reality layer is going to change fashion in ways we literally can't predict, because it gives a new element, a new degree, of expression that doesn't exist right now. It'll be interesting, if nothing else.

So let me take us far from fashion for a second. At the Markkula Center we have an ethical design toolkit, and one of the tools is called "think of the terrible people." Even as you two were talking about all these uses, I was thinking: this could be used to enhance torture, and to scale it in ways we couldn't before. So it goes back, I think, very strongly to your point about intent, and about putting guardrails in place, and about really helping people understand what this can do, what it is, and what it isn't. Brittan, you have written about some first steps that you think need to be taken right now — you've talked about data localization as being important, and about a potential industry-wide code of conduct modeled on the UN Guiding Principles on Business and Human Rights. Can you give us a short list of what you think needs to be done right now? Because we're running out of time really fast.

Okay, as fast as I can get it out there. One: there are no best practices. Every time a client or an internet safety organization concerned about this comes to me and asks what the best thing to do is, I have to say nobody knows yet. I think it would be great for companies to actually combine their research and make some decisions about what consent is going to look like and what privacy is going to mean in a spatial computing environment. Two: there's no standardized physical vocabulary for this hardware. When something bad happens to you or me on the way home, we know to call 911; imagine if 911 were a different number in every city you drove through — you wouldn't be able to access emergency services. There are protections for users when things go wrong in VR, but every interface is different, and there's no standard way to signal that you need help. Three: I look to the UN Guiding Principles on Business and Human Rights.


That's because they're built around a "protect, respect, remedy" framework: they assign different responsibilities to government and to industry, both of which are designed to protect people and to create remedies, and they're a consensus-based international standard that is ten years old at this point. So looking at what responsibility companies are going to have for their products, for their users, and for the unanticipated impacts of all this will be very important going forward.

And Erick, I'm going to take us a little bit into virtue ethics just to wrap up. What should individual developers, and organizations that aspire to be ethical in their development and deployment of AR and VR, do or avoid doing in order to live up to that aspiration?

I think this actually ties back to the toolkit in a lot of ways. There are some easy ones, like honesty and humility — this is what I was trying to get at earlier when I talked about acknowledging the limitations of the technology: don't sell it as doing something it's not capable of delivering, because of the fallout that comes from that with things like empathy and perspective-taking simulations. Then compassion, which I would describe as a form of expanding the ethical circle: thinking about not just the intended user base but everyone who might be affected — even the "think of the terrible people" exercise, and not only the people who would misuse it. I saw a question in the Q&A that was kind of about this: one of the things I worry about is that augmented reality embodiment explodes choice. It lets you express yourself in any way you possibly want, but that means I can not only deep-fake somebody else's identity, I can engage in digital blackface — I can do a lot of things, if I'm empowered to choose how I look to other people, that we might want to regulate, or at least try to control or limit. And that itself calls up its own regulatory questions: who should limit it, who should be empowered to, and how do we track users? There are so many questions that need to be addressed about how things are going to work in this space, whether it's one platform like Horizon or many different ones that we float between. It's going to be a lot of work. And yes, courage — to acknowledge limitations, but also to say no to products that might have clearly foreseeable harms or misuses. One of the hardest things to do is to stop the production of something for reasons of ethical risk.

So I heard honesty, humility, courage, and compassion. Brittan, would you add any others?

I really like all of those, so I can't add anything better — I just want to make sure that we're not leaving ethics only to the ethicists, because that's also something we say all the time: ethical decisions are things that all of us make in our daily lives. I guess I would add prudence as a virtue that seems to come into play here. And one group I left out, but that we've been talking about implicitly in our conversation, is regulators: I mentioned organizations and developers, but what regulators need to do and think about right now is also really important. Maybe grace as well, because the hardware is not solidified; it's still a nascent industry, and there are going to be missteps, but that doesn't mean we should give up on it all.

Yeah — and one thing we haven't captured, so maybe we'll leave it suitably vague at the end, is that magical quality you both were describing: inviting people you care about to strap these things on because you knew they would have those feelings. What's the virtue term for that? There's wonder, and creativity, very much as well. We are at time, and I want to thank our speakers very much for a wonderful conversation, thank all of you for joining, and apologize for all of the questions we didn't get to. I think we could have at least an annual event on AR/VR ethics and we would have new questions every time. Thank you again for joining us.

Thanks, everybody. Thank you.
