XR Access @ MIT Reality Hack: XR Accessibility Workshop

People in the room at MIT, can you hear me? All right? Yes? All right, only smiling faces — good. Okay, well, thank you everyone so much for attending today. I'm really pleased that we're able to bring you this XR accessibility workshop at the MIT Reality Hack 2022. We're really excited to talk to you about this today. Wow, I need more coffee. We've found that accessibility is a tremendously important and valuable thing to hack on.

And here we're going to tell you why. So first, to introduce our speakers: my name is Dylan Fox. I'm the coordination and engagement team lead at XR Access. — Hi, my name is Wren Tyler. I am the workstream leader for the Inclusive Design for XR workstream at XR Access. — Great. And Myles — let's see if we have Myles's audio working, right?

Okay, well, maybe I'll introduce Myles. Myles is the artistic director and accessibility experience designer at CymaSpace, which is an amazing organization dedicated to helping folks with deafness and hearing loss participate equally in events. And Roland? — Yes, I'm Roland Dubois. I'm the workstream leader for Accessible Development of XR, the ADXR workstream at XR Access, and I'm here for any kind of questions in person. So yeah, you can reach me during the hack for any kind of additional info if this hour is too short, because there's a lot to cover. — All right, thanks very much.

So since the hour is indeed short, let's hop into it. To tell you what we're going to tell you in today's presentation: we're going to talk about the XR Access initiative, we're going to talk about what XR accessibility is and why you should care about it, and we'll talk about hacking for accessibility and some XR accessibility resources that will help you do just that. I'm being told I need to spotlight Michelle — let me spotlight Michelle, there we go. And yeah, just keep me updated in the chat on who needs the spotlight.

All right. So, next up — let me remove that spotlight. Okay, who's gone where? All right, chalk it up to tech management. So let me tell you about the XR Access initiative. The XR Access initiative is a nonprofit started by Cornell Tech, Yahoo, and PEAT, the Partnership on Employment & Accessible Technology, and we are a community that engages and connects XR creators and people with disabilities in order to make XR inclusive of everyone regardless of ability.

Our goal is really to modernize, innovate, and expand XR technologies, content, and assistive tech by promoting inclusive design in a diverse community that connects stakeholders, catalyzes shared and sustained action, and provides valuable, informative resources. Basically, we love XR, and we want to make sure everyone can use it. So our vision is multifold. It's partially to have inclusive design and accessibility as one of the core parts of XR creation. We want to make sure that accessibility is part of that minimum viable product,

and that features like multimodal inputs and outputs and accessibility of content are really standard parts of XR design. I'm being pinged in the chat here to say I need to add host privileges — let me give that to, let's see, is that Myles? Make co-host, bam. Apologies for the tech difficulties,

everyone — hybrid events are tricky at the best of times. So, our vision: secondarily, we want to make sure that resources on XR accessibility are widespread wherever XR technologies are created, and updated frequently with the latest findings from research. We want to make sure that people with cognitive, physical, and sensory disabilities are a part of this future, right?

We don't want to make things for other people; we want to make things with the people with disabilities that are going to be affected by this, and make sure that they have parts to play as designers, as developers, and as founders in some of the amazing tech that uses XR. And finally, we want to make sure that we capture not just disabilities, but also all facets of intersectionality. We know that things like age, ethnicity, sexual orientation, nationality, gender identity, and socioeconomic status can all play a part in people's ability to access XR, and we want to make sure that those are taken into account as well, and that we have a really fully fledged approach to making XR accessible. So, some of our core values: we at XR Access want to be efficient, useful, and evidence-based.

We don't want to be pie in the sky, you know, waving a flag for ideals. We want things that are actually on the ground helping people, and to make sure we're hitting those bits of crunch that really let us make this stuff accessible in a real way. We want to be approachable. We want to be user-centered, and we want to be nimble and adaptable.

So, keeping a focus on people with disabilities and their user needs, and adapting to the changing landscape of XR as it updates, which, you know, it does quite rapidly, as I'm sure you're all aware. And finally, our style is that of a collaborator. We want to be a catalyzer of all of the different people out there working on this stuff. You know, the W3C, for example, is a great standards organization. We don't want to compete with them and make more standards that conflict; we want to help them take those standards, make sure that those are real, based on real user needs, and then connect those to all the other people that will need to use them in order to actually make this stuff accessible. And our stakeholders are really wide and varied. We're trying to make sure we connect to everybody that needs to have a part in making

XR accessible — whether that's researchers and end users, app and content creators, platform owners and trade associations, policymakers and educators, or consultants and employers. You know, XR is something that is going to be a part of tons of parts of our society, and we want to make sure that across all of those — across the end users, across the people at the top, and everyone in between — everyone knows what the challenges are with regards to making things accessible, and is doing their part to solve them. I also want to quickly give some additional thanks here: first to the W3C

Immersive Captions Community Group, who have been amazing at providing suggestions for some of the resources you'll see later on; to CymaSpace, which is Myles's organization — again, an amazing organization devoted to deaf equity; and of course to the Reality Hack, and all of you for being here, and the organizers for setting up this awesome opportunity to use XR for good. So with that, I'm going to pass it to Wren to talk about what XR accessibility is in the first place. — All right. Thank you, Dylan, and thank you, everyone, for being here. So what is XR accessibility? And more importantly, why should you care? First, to define XR accessibility, we need to quickly define XR and then define accessibility. So, since we're all here, odds

are we all know what extended reality is: there's virtual reality, in which digitally rendered immersive experiences are accessed via a head-mounted display or a web browser. Then there's augmented reality, in which digitally augmented experiences are rendered as an overlay on the user's field of view and accessed by either a head-mounted display or your screen.

And then you have mixed reality, where there is a merging of the physical and the digital world to produce experiences where physical and digital objects and users coexist in 3D space. And actually, in augmented reality you wouldn't necessarily have the HMD, so my apologies there. Altogether they make XR, and on this slide we have different examples of what virtual reality, augmented reality, and mixed reality look like. So, next slide, please — I can't see — yeah, there we go, I have to split screen. So, defining accessibility: if something is accessible, that means that anyone can use it regardless of ability. Now, when a lot of people think about accessibility, they think only about access for people with permanent disabilities — you know, whether they use a wheelchair, or they're blind, or they're deaf, and so on — but one of the reasons why accessibility is so important is that disability comes in all forms: not just permanent, but temporary and situational as well.

The accessibility adjustment you make to make something more accessible to someone who's deaf is also going to make it more accessible to someone with an ear infection, or say you're somewhere really loud and you really want to watch Netflix but you can't hear anything — closed captions are your friend. Next slide, please. Okay, so here we're going to go over some really quick universal accessibility basics in design. For vision, you can see on the left here we have an example of what's a bad font to use and what's a good font to use. We generally go with sans-serif fonts, because they are the easiest to read.

Stylized fonts should be used sparingly, because a lot of people, especially those with low vision, will have difficulty recognizing the shapes of the letters, so they won't even know that they're reading anything and that it's not just images. Then below that, we have examples of what would be painful contrast between text and background, and then unreadable contrast, because the contrast is too low. Generally speaking, you also want to keep your font size to a minimum of 12 points, which is the equivalent of 16px; smaller than that, it's going to be really difficult for people to read.
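
To make the contrast guidance concrete, here is a minimal JavaScript sketch of the standard WCAG 2.x contrast-ratio math, which is the same formula most contrast checkers use; the helper functions are just an illustration, not an existing library.

    // Relative luminance per WCAG 2.x, from 0-255 sRGB channel values.
    function luminance(r, g, b) {
      const [rs, gs, bs] = [r, g, b].map((c) => {
        c = c / 255;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
      });
      return 0.2126 * rs + 0.7152 * gs + 0.0722 * bs;
    }

    // Contrast ratio between foreground and background colors.
    // WCAG asks for at least 4.5:1 for normal-size text and 3:1 for large text.
    function contrastRatio(fg, bg) {
      const l1 = luminance(...fg);
      const l2 = luminance(...bg);
      const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
      return (hi + 0.05) / (lo + 0.05);
    }

    console.log(contrastRatio([255, 255, 255], [0, 0, 0]).toFixed(1)); // 21.0, maximum contrast
    console.log(contrastRatio([119, 119, 119], [255, 255, 255]) >= 4.5); // false: #777 on white is ~4.48:1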

Also, when you have any kind of error message, or what have you, don't just use colors as the only means of conveying a message. If you notice the image on the left, it says "Upload file," and it has a little exclamation point to draw your attention that it really needs to be taken a look at. And add descriptive alt text to all your images. Here you see on the right the example of the dog saying, "This is me pretending to be interested," and underneath it says, "And this is me not pretending," and the alt text basically tells you, "a dog looks up attentively," saying such and such. And the last thing is, when you deal with screen readers, you have to make sure that everything is labeled properly and the interface elements are structured, so that screen reader users can navigate the site easily and not have to worry about being unable to complete certain actions or getting stuck places. Next slide, please. Thank you. So, universal accessibility basics for hearing: the big one is the importance of closed captions — it

can't be stressed enough — and doing them well. So, captions should be customizable: let the user set the size, the font, and the background. Please don't use all caps; it's the most difficult to read. Use sentence case like you would in any regular sentence. Don't obstruct the important content.

Typically, closed captions are placed in the lower center of the screen, but they should be moved when important visual elements appear in the video — like, say, a speaker's name when they're talking in a documentary — at which point you would move those captions towards the top, and then you can move them back down when it's clear. Include speaker labels when people are talking: if there's a character name, for example, their name would be there, followed by a colon, and then whatever they're saying, which is especially important if someone is speaking off camera. Make sure that your captions are in sync with what is happening on screen.

You don't want the text to be popping up after something, or even before something is happening. And finally, provide a transcript when captions aren't available, so that way people can at least read what's going on. The other thing you want to do is provide sound settings, such as volume sliders and mono audio, because that's very helpful for people with hearing impairments to adjust just how loud they need it to be. Next slide, please — no, you're already there.
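
As a small, hedged illustration of speaker labels and sync for 2D video, here is a sketch using the browser's built-in TextTrack and VTTCue APIs; the element id, cue timings, and dialogue are made up for the example.

    // Assumes an existing <video id="talk"> element on the page (hypothetical id).
    const video = document.getElementById('talk');
    const track = video.addTextTrack('captions', 'English', 'en');
    track.mode = 'showing';

    // Each cue carries start/end times (for sync) and a speaker label in sentence case.
    const cues = [
      { start: 0.0, end: 3.5, speaker: 'Dylan', text: 'Welcome to the workshop.' },
      { start: 3.5, end: 7.0, speaker: 'Wren', text: 'Captions should be customizable.' },
    ];

    for (const c of cues) {
      // Speaker name, a colon, then the line - especially helpful when someone is off camera.
      track.addCue(new VTTCue(c.start, c.end, `${c.speaker}: ${c.text}`));
    }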

So here we have universal accessibility basics for more physical things: dexterity and mobility. Make sure that your applications, especially your desktop ones, can be navigated using your keyboard, using a logical focus order.

That's basically: you hit your Tab key and it'll put the next element in focus. On the left, you have an example of what the tab order would be — where it should all be going when you tab through.
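
As a quick sketch of what that keyboard support looks like in practice: if you build a custom clickable element (say, a styled div acting as a button), give it a role, a tab stop, and Enter/Space handling so keyboard-only users can reach and activate it in the same logical order as everything else. The element id and action here are hypothetical.

    // Hypothetical custom control that should behave like a button.
    const uploadBtn = document.getElementById('upload-target');
    uploadBtn.setAttribute('role', 'button');        // announced as a button by screen readers
    uploadBtn.setAttribute('tabindex', '0');         // placed in the natural tab order
    uploadBtn.setAttribute('aria-label', 'Upload file');

    function activate() {
      console.log('Upload dialog opened');           // placeholder action
    }

    uploadBtn.addEventListener('click', activate);
    uploadBtn.addEventListener('keydown', (event) => {
      // Real buttons activate on both Enter and Space, so mirror that here.
      if (event.key === 'Enter' || event.key === ' ') {
        event.preventDefault();
        activate();
      }
    });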

You also want to have clickable and tappable targets large enough to account for lack of precision. So in this middle image here, they show you the touch target — it's showing you the actual size of the image, but it's surrounded by an interactive area, which means that even if you don't tap specifically on that image, if you're within that radius, it'll still activate. You also want to offer other interfaces; for example, a voice interface is really helpful. Next slide, please. Okay, universal accessibility basics for cognitive. Too much information at once is overwhelming and causes cognitive overload, which can lead to anxiety and panic attacks in some people, and an inability to process information altogether. So it is very important to provide clear instructions and error messages.

Let people know when something goes wrong and explain what can be done to fix the problem. Good tutorials really reduce anxiety and confusion. Have help files and tooltips easily accessible to review as needed. In the example on the left, you have a tooltip where, when you click on it, you'll get more information.

You also have a "learn more" link where you can click again to get more information, so you are not overwhelmed by a ton of content at once. Then, be sure to label your icons, especially those that are non-standard.

And then again here, you'll see that they have the little icon in the pop-up, and they show you — next slide, please. Okay, so sensory things: a big one is animation. An option to reduce or turn off animation should be offered, because some forms of animation, particularly parallax, can cause dizziness, nausea, and even trigger migraines in some people that require days of bed rest for recovery. We also want to worry about things like epilepsy: content should not include more than three flashes in any one-second period, because that can cause seizures in some people. Please avoid that content if possible. Next slide, please. Thank you. So, if you want to learn more about digital accessibility, check out Teach Access's Study Away program.

You can learn directly from accessibility professionals at Google, Meta, and other companies. Apply by March 3rd at teachaccess.org/study-away. All right, next slide. Okay, so why is XR accessibility important? Well, first, fast growth: there's a gap between technology creation and technology inclusion, you know —

close that gap now and avoid that loss. Then you have equitable access: people with disabilities need accessibility in order to land jobs, get a better education, and get the health care that's necessary — and of course entertainment and games and other social things, too. And then the curb-cut effect: when you design for accessibility, it makes everything easier for everyone.

The curb-cut effect is — you see these cuts in the street curbs; you can use those to push a cart or drag your suitcase or whatnot, and those were created to help people using wheelchairs, but we all benefit from them, and we all benefit from all kinds of accessibility features, even if we don't think so. Next slide, please. Thank you. So, why it's important especially here: let's point out that in 2019, an accessibility-based product won an award here at the hackathon — Wayfarer won Best Application for Accessibility as well — so definitely keep that in mind if you're interested in winning; everybody wins when you use accessibility to win. Then also, an accessibility project won at the 2020 MIT Reality Hack: it was the Best of VR Award winner and Best Application for Accessibility, which was sponsored by Microsoft.

Okay, so, perspectives on XR accessibility. There are two different perspectives on XR accessibility: the first is making core XR hardware, platform software, and embedded operating-system-level apps accessible. The second is to look at XR accessibility as building XR-enabled assistive tech to improve access by people with disabilities. In our industry collaborations,

we encourage inclusive co-design, involving people with disabilities to explore both of these dimensions, to advance the overall possibilities of XR accessibility. Nothing about us without us. All right. So here we have challenges in XR for captions. Accessibility features like captions that are solved in 2D run into new challenges in three-dimensional space. For example, where should the captions go on the screen? How can you indicate who's talking? How do you handle occlusion? So in this clip from Vacation Simulator by Owlchemy Labs, we see their take on it.

Next slide, please. Okay, so challenges in mobility: immersive technology also raises new difficulties, especially for those with mobility disabilities. Actions that could previously have been done by simply pressing a button or using a joystick may now take a lot of physical motion. So here we have an image of a tool called WalkinVR

Driver that lets users with mobility impairments participate more equitably in VR experiences, even action-packed ones like first-person shooters. Basically, the fellow here is saying they gave him a tool, and all he had to do was rotate the camera with one hand, and it was the equivalent of his whole body turning, which I think is really great. And so that's our brief primer on accessibility and why it's important, and I am going to hand it back to Dylan.

Thanks, Wren. I think actually I will hand it off to Myles, who's going to take this next part. Let me spotlight — all right. — Is there any way to make him a little bit bigger? Is it possible to make his image bigger for the interpreter to see, or no? — Not really, I don't think so, unfortunately. Sorry. You can go ahead — one moment, this is being interpreted. — Hi Dylan, this is Bill. In side-by-side speaker mode,

you should be able to slide — make the video image bigger and the slides smaller — but you have to be in side-by-side speaker mode. — Let's see, side-by-side speaker mode... and you have a little grab handle between Myles's video and the slides; you can make it bigger. — Thank you, that was very helpful. I got it. Okay, the interpreter is ready now. Sorry, Myles. — All right.

Thank you so much for your patience. Okay, we're talking about hacking for accessibility. I'm Myles from a nonprofit called CymaSpace. Thank you so much, Dylan, for having me. We want to make sure that culture is accessible and inclusive for deaf and hard-of-hearing people. Okay, so let's see — I want to explain why it's important to design for accessibility.

We want to make sure that it includes people with disabilities. There's a good quote: "Nothing about us without us." Let me explain what that means. Research and design, development and testing — all of those steps in project design and development really should include people with disabilities. The number one concern that I tend to hear from other developers is, they say, well, I'm not disabled, or I don't really know who in the disability community I can reach out to and

connect with. It's a start — it's important to start gathering that information and sharing it, so we're collaborating with people with disabilities and integrating people with disabilities more from the beginning. I've done some personal research on people with disabilities who are developers

and who do some accessibility design, but they were really hard to find. So I want to make sure that we're including everyone in every step. Unfortunately, a lot of design does not involve or include people with disabilities — like getting interpreters for this event. I really, really wanted to join at MIT; I wanted to fly there and be there in person, but they really struggled to find interpreters. So that's just one example of inequality for a disabled person.

I happen to be a deaf person, which is a disability, but that's something that we can use to learn and grow from for next year, and maybe I could join in person next year, hopefully. — I'm not sure — is that Travis? All right, Travis, are you on? Hey, we can hear you — you're a little muffled, but we can hear you. — How about now? — That's much better. — Yeah, I'm going to turn it over to Travis. Yeah, we'll switch interpreters. Are you still there, Travis? I think Myles is ready for the next slide. We might have lost Travis again. — Okay, we can go ahead. The interpreter is struggling with tech today.

Yes, that's okay. So, "disability dongles" — that's the label for this slide. What is that? Sometimes people mean to do a good thing; they really want to help. Maybe they're learning engineering, or they're studying to become an engineer, and they want to create something, a project that will help deaf people. And they think, oh, I want to make these gloves — the gloves will recognize sign language. Unfortunately, that technology doesn't really help.

And that happens because the person didn't really reach out to and talk to the deaf community first. They didn't ask, "do you need this?" — they just make an assumption that it'll be a helpful thing, and they go ahead and make it. So it's important first to ask the community: what is it that they need? What is it that they want? What access challenges do they have? And then deaf and disabled people can inform them and make clear what would be a good thing. So it's cool tech, but unfortunately the gloves are bulky, they're not comfortable to sign with, and they also don't include facial expressions, which are very important for sign language —

that's part of the grammar of the language. So it doesn't include that; it's just on the hands. The point, again, is to be inclusive of people with disabilities: every time, before you start your project, do a little research and reach out. Next slide, please. Okay, so another thing that would help when you're asking people questions — asking people with disabilities in the community for their feedback — is to ask what access challenges they actually have, and I'll give you some examples here.

If people don't depend on sound, then how can you help them identify and label sounds in their environment? Another challenge might be: if you want to include them in VR and they want to participate, and we have a group of people chatting with each other at the same time, how does the person understand which one to follow? Can you separate the two conversations if they're happening at the same time? That's another thing. Can we use eye recognition to improve the caption experience — or eye tracking; to clarify, yes, eye tracking is what I meant. Related to sign language users: people who use sign language in VR, is that possible? Can they sign — are hardware and software able to support that? Avatars: can they control their hands fully and have full digital manipulation control?

Also, if you can't support sign language, how can it be improved — how can we improve hand technology? I'll show you some things pretty soon related to that too. And the next slide, please. If a person is blind, how can they navigate their environment? Maybe they could use sound feedback or vibration. Can VR support screen readers? With that technology, where would the words be — would they be able to tell where it's happening in VR

if they can't see? We haven't had that yet, and maybe AI could assist — kind of expand on what the person is "seeing": give a description of what's ahead of them, a description of what the shape of the object is, or the light value, things like that. That might be helpful. If a person uses a mobility aid, or maybe they're temporarily disabled — maybe they have to be on bed rest for some reason — how could they use VR if they're lying there looking up, or if they can't move their head up or down?

And how could people with limited mobility control and interact with a virtual world in general? For myself, I've developed a prototype with different experiences in VR from the perspective of a deaf person. So I wanted to show you an example of that, of captions. We want to really improve that experience, so we know where some of the challenges are.

If the captions are only in a specific spot, or if they follow where I look, how will they show who is talking, or where the sound is coming from? Because if the person is behind me and the sound is coming from behind me, how would it show that? For example, if we move the captions closer to the people who are talking nearby, and then you look away — is that accessibility? So I want to take more of a hybrid approach, where the captions themselves are smarter: they know where I'm looking and where the sound is coming from, even if I turn my head, and they'll sync with who's talking. This prototype is in Unreal Engine, and I enjoyed making it because it's a good thing for other developers as well. I think I have a couple of slides left. We can go to the next slide, please.
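
Myles's prototype is built in Unreal Engine; as a rough browser-based sketch of the same "smart caption" idea, here is a hypothetical A-Frame component that keeps the caption in front of the viewer while adding a hint when the speaker is behind them. The component name, schema, and entity setup are assumptions for illustration, not Myles's actual implementation.

    // Hypothetical "smart caption": stays in view, but still tells you where the speaker is.
    AFRAME.registerComponent('smart-caption', {
      schema: {
        speaker: { type: 'selector' },                 // entity the sound comes from
        label:   { type: 'string', default: 'Speaker' },
        text:    { type: 'string', default: '' }
      },
      tick: function () {
        const cam = this.el.sceneEl.camera;
        if (!cam || !this.data.speaker) { return; }

        // Keep the caption about 1.5 m in front of wherever the user is looking.
        const pos = new THREE.Vector3(0, -0.4, -1.5);
        cam.localToWorld(pos);
        this.el.object3D.position.copy(pos);
        this.el.object3D.quaternion.copy(cam.getWorldQuaternion(new THREE.Quaternion()));

        // Is the speaker in front of or behind the viewer? Add a direction hint if behind.
        const toSpeaker = this.data.speaker.object3D.getWorldPosition(new THREE.Vector3())
          .sub(cam.getWorldPosition(new THREE.Vector3()));
        const behind = cam.getWorldDirection(new THREE.Vector3()).dot(toSpeaker) < 0;
        const hint = behind ? ' (behind you)' : '';
        this.el.setAttribute('text', 'value', `${this.data.label}${hint}: ${this.data.text}`);
      }
    });

It could sit on a text entity such as <a-entity text="width: 1.5; align: center" smart-caption="speaker: #guide; label: Guide; text: Over here!"></a-entity>, where #guide is whatever entity the voice belongs to.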

How do we show sound that's far away versus close to the user? One approach that I have tested and used is opacity: you can tell if it's a little more opaque that it's far away, and it's clearer to read if I'm close up. I prefer not to change the size of the fonts if possible — if it's really far away, I think it's too small, and that's not accessible for me. And one more challenge, with sign language: I'm using the Quest from Meta, which can support sign language — it can recognize some of it, but it's a little bit limited, because if the hands cross over each other, occlusion is what happens.

The occlusion happens, so that is a struggle that I'm still dealing with: because of the camera, it can't see both hands at the same time. So maybe we need to improve the AI model to recognize — to assume — that one hand is over the other, even if it can't see it. These are all challenges that we want to improve on. It's really important to include deaf people in the development of VR and AR. Thank you so much. I think I'll turn it over now to the next presenter.

That's going to be me, thank you. Dylan, are you controlling the deck? All right. So, at XR Access we do have a GitHub repo that is gathering all kinds of ongoing and live GitHub repositories for all different kinds of engines and frameworks, from Unity and Unreal Engine to WebXR. It's like a one-stop location, a single source of truth — that's the idea that we have there — and if you are working on accessibility features today and you're submitting them as GitHub repos, I'd love to include them as well, just to make sure we give a very rich and diverse set of tools and thoughts that are going into making XR accessible. Next slide, please. So I am listing out here a few things that I think are good conversation starters.

I have here a few open-source codebases that are focused around WebXR. My field is WebXR; I'm working mostly in A-Frame and tinkering around with Wonderland Engine. The idea here is that the W3C has done a lot of work in the accessibility space already — ADA compliance and the WCAG guidelines are being used as measures for accessibility in law cases, in lawsuits, and in conflict cases.

I think it's very important to understand what the 2D web has offered us already. What can we take as knowledge from that, and how can we translate that into spatial experiences? WebXR is basically that bridge, where we bring browser-based virtual reality and augmented reality to life through the browser, and therefore we can use libraries and standards that have already been established

by the W3C. So it's basically a match made in heaven here. Here are a few examples. A-Frame is an entity-component-system framework to build web-based virtual reality and augmented reality, and it's built on top of Three.js; Three.js and Babylon.js are 3D libraries that are JavaScript-based. There's React Three Fiber, which is a React-based starter kit to create Three.js projects as well.

Babylon.js, like I said, is another JavaScript library that's been used in some other VR experiences that are also web-based. And then I also suggest you look into Mozilla Hubs, which is a shared, social virtual reality tool where you can interact with other people, and it also builds on top of A-Frame and Three.js.

So all of those frameworks are open source and can be tinkered with in a hackathon. Next slide, please. I have here a list of already existing plugins and libraries for accessibility in WebXR. I shamelessly plug here my aframe-gui, the graphical user interface component set that I built based on A-Frame. It's a framework that helps you jump-start interfaces.

It gives you graphical user interfaces, has ARIA built in, and is also tab-enabled. There's a WAI-ARIA integration from Ruben Vandalloin — he created a small, tiny GitHub repo where he is bringing WAI-ARIA into virtual elements, like entities in A-Frame. Then React Three Fiber itself has an accessibility package which is quite elaborate and pretty extensive, which I think is pretty amazing — they did a lot there. And then there's another, smaller GitHub repo about accessibility testing, which keeps keyboard accessibility and tabbability for elements that are living in the DOM but are just rendered as 3D objects on a canvas element. So it's basically — yeah.
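
To give a feel for what that DOM-backed approach looks like, here is a minimal, hypothetical sketch (not any of the specific repos Roland mentions): each "button" drawn in the 3D canvas scene gets a real, focusable DOM twin, so screen readers, ARIA semantics, and the Tab key keep working. The container id, class name, and menu entries are made up.

    // One real DOM element per 3D control: the canvas draws the visuals,
    // but focus, ARIA semantics, and keyboard activation live in the document.
    function mirrorControl(label, onActivate) {
      const proxy = document.createElement('button');
      proxy.textContent = label;                      // accessible name for screen readers
      proxy.className = 'sr-proxy';                   // assumed CSS: visually hidden, still focusable
      proxy.addEventListener('click', onActivate);    // click, Enter, and Space all trigger this
      document.getElementById('a11y-overlay').appendChild(proxy);
      return proxy;
    }

    // Hypothetical 3D menu: each entry gets a DOM twin in the tab order.
    ['Start tour', 'Settings', 'Exit'].forEach((label) => {
      mirrorControl(label, () => console.log(`${label} activated in the 3D scene`));
    });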

So it's living in your RAM, but you can still navigate it — pretty cool stuff. Next slide, please. Yeah, so hand tracking has invited a lot of people to think about sign language and just using the hands, but, as we heard from the previous presentation, we need to include facial and body language in order to make sense out of sign language. And also, American Sign Language is different from other sign

languages — German and French sign languages are different — so there's not a one-size-fits-all for this. I'm showing you here a few libraries that enable tracking of hands in WebXR, browser-based. Again, you can get started with understanding what the limitations of the actual tracking within the VR headsets are, and how you can bring it closer to some useful tool for people with disabilities; you have to understand what the limitations are and then find a lowest common denominator to serve everyone. Another thing which is pretty powerful about WebXR is that you have a progressive-enhancement and graceful-degradation kind of aspect to it. So you can work from your laptop screen, a touch tablet, or a mobile phone, and then go into a VR

headset or an augmented reality headset. So there's a way to create one code base and then add layers of complexity based on the availability of hardware. This is a really cool thing that you cannot really get with native VR and AR apps — they live within one ecosystem and maximize the hardware at hand.

But with WebXR you can actually play into the strengths of the individual's device. So you can say, hey, I'm using that experience in a 2D way on my tablet, and I collaborate with someone in virtual reality. So you have two people collaborating on the same project: one is on a desktop or on a tablet, and the other person is in a VR headset or augmented reality. That kind of phenomenon is pretty cool.
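
Here is a rough sketch of that progressive-enhancement idea using the standard WebXR Device API: ask the browser what it supports, request hand tracking only as an optional feature, and fall back to flat-screen controls when immersive VR isn't available. The setup functions are placeholders, and requestSession would normally be called from a user gesture such as an "Enter VR" button.

    // Progressive enhancement: one code base, richer input when the hardware allows it.
    async function startExperience() {
      if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
        setupFlatControls();                    // placeholder: mouse / touch / keyboard path
        return;
      }

      // Hand tracking is requested as *optional*, so controller-only headsets still work.
      const session = await navigator.xr.requestSession('immersive-vr', {
        optionalFeatures: ['hand-tracking']
      });

      session.requestAnimationFrame(() => {
        for (const source of session.inputSources) {
          if (source.hand) {
            setupHandInput(source);             // placeholder: gesture-based interaction
          } else if (source.gamepad) {
            setupControllerInput(source);       // placeholder: buttons and triggers
          }
        }
      });
    }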

Now, there is a plugin here, the A-Frame super-hands component, that works towards that progressive enhancement, from a mouse click up to hand tracking; and teleport controls is another one that is about navigating VR space. I have here two screenshots. One shows the separation of gesture and response, which goes into the WebXR hand-tracking tools, where you basically have to outline the limitations and then the lowest common denominators of all hand tracking — whether you make a fist, a palm up, palm down. It's not as precise as getting every single digit of the finger.

So you'll have limitations in the technology, and you have to figure out a way to translate the few commands you have into a larger, maximized set, and keep it user-friendly. Yeah, next slide, please. And here are a few projects that I wanted to also highlight. Kieran Farr has built out aframe-city-builder, which is based on — like, he's building out cities, and I think that this would be a nice tool to start thinking about environmental planning or city planning, or you can actually think about creating a virtual world and then how you would make accessibility

tools baked into that virtual world — all this kind of outside-of-the-box thinking. There's Google Resonance; there's an A-Frame component for that as well that has been done by Marco Kügler — he basically made the Google Resonance Audio JavaScript library available in A-Frame as well. And then we have a large GitHub repo here from Cowboy that talks about web-based augmented reality — marker-tracking based, location based, image-recognition and natural-feature-tracking based — with a lot of code examples here as well.

Okay, next slide, please. Obviously, I don't want to exclude Unity, because a lot of people work with Unity here as well. There's definitely a bunch of components that focus on accessibility based on visual contrast, color contrast, and readability. Some are more focused around 2D but could definitely be applied to 3D. And yes, simulators — simulators are a thing where there's a controversy behind them, so you can

do your own homework with a simulator and do a color contrast test, but it's always better to include actual people that have visual disabilities, because there's a broader range there, and you can only test so much — unlike people with actual disabilities, you can turn the disability off and on; they have ways to deal with things that you cannot even come up with. Next slide, please. Yeah, so Dylan, do you want to take that over, or should I go? — Yeah, I'm happy to cover these parts. Thanks, Roland.

Yeah, thanks. So, just to put in a few more examples of some really great products and prototypes you can look to that are solving some of these challenges of XR accessibility. One that we love to advertise is the WalkinVR Driver, which is a SteamVR plugin — you can pretty much use it for any game, it just plugs in over Steam — that opens up a lot of really powerful options for people with mobility disabilities: things like copiloting with an able-bodied person, virtual motion and rotation, position adjustment so that the controls are, you know, higher than they are physically, and things like tracking of disabled or spastic hands.

There are a lot of people with different kinds of motor disabilities — folks that are wheelchair users, or folks that have spastic disorders where they can't control their fine motor skills very well — for whom VR would be really totally unusable for a lot of these games, except for the efforts of folks like the makers of WalkinVR Driver that have put some thought into this, some thought into designing for those people. And thanks to that, we have these tools that can make those games playable; a lot of the time it makes the difference between something that's totally unusable and something you can have a blast in. I also want to point out a really great enduring piece of accessibility work, the SeeingVR project. This is a Unity plugin for visual accessibility. Unfortunately, it hasn't been kept up to date, I think, with the latest Unity releases, so I don't know if you can just plug it right into your app. But this is a really great research project out of Cornell and Microsoft that looked at tools like magnification, edge enhancement, text augmentation, object descriptions, and literally ten other features for visually impaired VR users that could help make things more usable. So if you want ideas about how to make your XR applications more usable,

take a look at this, take a look at the code, see what you can maybe put into your application, or see how you can apply these types of tools in other apps. These were kind of created as demos, but that isn't to say that they couldn't be part of a fully functioning application. I also want to just throw out — there have been a ton of these cropping up in recent years, but two that are pretty recent: one is this AR captioning demo that uses augmented reality and voice detection as assistive tech to caption people in real time. There's also a project that just recently came out of Microsoft called PeopleLens, which looks at basically using AI and augmented reality to create a social support teaching tool for blind children.

You know, a lot of blind children have a lot of trouble with making eye contact, which is a very important part of a lot of social interactions with sighted folks but is very hard for folks that are blind from birth to understand. And so creating tools like that, which can help with specific accessibility challenges, is a really potent way of using XR. Finally, a few more examples: we have this Conjure facial expression VR controller — unfortunately no longer available — that looks at ways to control XR experiences with, again, alternate input methods, in this case smirk left to swipe, nose wrinkle to select, things like that, which, if you don't have use of your hands, can be an incredible

accessibility use case. We also have things like the 3dRudder, a foot motion controller for VR gaming — this is something that I believe is commercially available. And you'll find a lot with accessibility that creating modular inputs and outputs like this — making sure that people have multiple ways to put input into your system and multiple ways to get output out of your system, and that you're not bottlenecking on "you have to use these controls in a certain way," or "you can only see this," or "you can only hear this" — making sure that your applications are modular is a really good way of making them more accessible. And then, finally, for resources: I wanted to make sure that everybody here knows we are going to be in the MIT Reality Hack Discord, in the accessibility channel, for the next couple of days. So if you want to pick our brains on something — whether you're working on an application that is designed to be assistive technology aimed directly at people with certain kinds of disabilities, or you just want to make sure that whatever you make is accessible to as many people as possible — definitely reach out, don't hesitate. You can get in touch with us via Discord and via some of the links I'm going to show in a minute. But before we wrap up, I want to give just a few minutes for Q&A, because I'm sure folks have questions. And Roland, maybe you can swap your computer around so we can see some of the folks in the room.

One second, there — all right. Hey everybody, I'm going to be repeating the questions when there are some. So, anyone? Great, thank you. — So, I'm at Microsoft, and this is an area that's very important to us right now, and I guess my question is, how do we collaborate? A lot of the stuff that was posted here are research projects by one to three people, and we need things baked into the platform, like screen readers, abstractions, all of those things that are necessary. So how do we take this beyond research and put it in core frameworks? — Everyone, hold on so that I can repeat the question. Did you hear that? No? Let me repeat it.

Yeah, the question is — Jared from Microsoft has asked — how can we actually turn those research projects into real projects, where we can actually bake them into the platform and the systems? Yeah, better collaboration. I can answer that, but go ahead, Dylan, go ahead. — Sure, sure. I think it's a fantastic question, and I think it's basically the reason that XR Access was set up as an organization, because you're right: some of these amazing things, like SeeingVR — that awesome research project had all of these great features, but those didn't really actually make it into the end products that we've seen so far. And I think the answer to that is that it is going to be hard, and it's going to take a lot of people working together.

But it's going to be a combination of making sure that the folks at Microsoft and at Meta and at Google and any of the other platforms are integrating this stuff into their platforms as much as possible, and that they're talking with folks like us and the XR Association and OpenXR to make sure that the standards across platforms will match up. Then it's also going to be the responsibility of individual content creators to use this stuff. But for that we need to make sure that these plugins are as easy as possible to just plug and play — put it in your app, boom, it's accessible — instead of having to rewrite it from scratch for every game or every application. That's one of the reasons that we started up the XR Accessibility Project, which I'll put in the chat here again.

It's at xraccess.org/GitHub — capital G, capital H. If you have good accessibility tools that people can just plug in, put them there. And I think soon enough — hopefully sooner rather than later — we'll get XR accessibility to the level of maturity that we see for 2D, where there are understood formats of tools and plugins in order to make things accessible. But of course, given that even a lot of 2D things right now aren't accessible — you know, screen readers work differently on different browsers —

we still have a long way to go on both ends. — Yeah, I might add to this as well that we are always in this chicken-and-egg problem, where the hardware is supposed to provide the accessibility layer and not the content, and the content should be accessible because it will then be tapped into by the hardware. So I think that as creators, you have to be nimble in creating contextual content and have small pieces available as soon as the hardware is mature enough to take them up, but in the meantime create some kind of meta layer in software to enable what the hardware layer should be doing.

So basically, the content should be thought out as being inclusive and accessible content, and then the hardware will follow eventually. Microsoft has had collaborations with a lot of the AR headsets — I have this HoloLens — but Microsoft also has the Adaptive Controller that you can plug in; that one is actually something that would be a nice tool to bring into a VR environment as well, so you can input-map like I showed before. Next question, please — I think probably the last one, because we are already pushing the time limit here, unfortunately. Yeah, go ahead. — [audience question, partly inaudible]

Yeah, yeah. So, what's your name? — He just asked where, as a creator, a content creator, he should start from, because everything that we've shown is quite fragmented; everyone is basically reinventing the wheel. No, I know. So, who wants to take that answer? — Go for it, Roland. — I think the one about the fragmentation is mine. Yeah, so this is why we created the GitHub repo, where we can basically show what's out there and curate a list of all the thought processes that went into solving certain problems.

There are, from the W3C, a bunch of standards that have already been started. Dylan, do you have some of those documents? — I think Bill just put a great link to them in the chat. We also list some of them on the accessibility project GitHub. — Yeah, and on our website, xraccess.org, you can find a lot of content as well that all focuses on normalization and standardization — coming to a kind of normalized set of demands or needs that we need to solve.

And then it's up to the creatives to solve for those, and I'm assuming at some point the best solution will take over and win in this space. But in the midst of it, it's an open Wild West. — I also just want to read off what Myles put in the chat here: we need to see orgs and tech companies working together to build an open-source framework for accessibility; something like the XR Access

GitHub is a good starting point. There need to be cross-platform plugins that address and solve many of the common access problems and that can be dropped into a project by content creators. But this should be informed by and created with people with disabilities, which means hiring and consulting with people with disabilities at every step, and also making sure that we have accommodations worked out for things like this very hackathon, so that people with disabilities don't feel like they have to struggle and argue just for the right to be here with able-bodied folks. Also, access to cloud computing resources is a very real barrier:

"I would like to release my Unreal Engine captioning design work to the developer community, but I can't afford to cover the cost of AI cloud captioning." So that's a really good point. We are just about over time here, so I think I'm going to have to call the open Q&A there, but find us on Discord. One other quick announcement: if you want to learn more about XR Access and what we've been doing, you can join us at our Spring Showcase, which is tomorrow, March 24th, 3 to 5 PM Eastern. You can learn about the workstreams, about our research network, learn about our annual symposium, and network with a whole lot of folks that are involved in this space — content creators, disability advocates,

accessibility experts, and more. Experts, not exports — that's a typo. You can find that at xraccess.org, under the 2022 Spring Showcase, and I'm sure some of the folks there would be very interested to hear what y'all are hacking on over the next few days. I'll also make a quick announcement that we wrote a companion article for this very event. You can find that article — it's called "A Hacker's Guide to XR Accessibility" — at bit.ly/xr-a11y-hackers, and it has a lot of the content reviewed here today: things like defining XR accessibility, user needs and hack suggestions, and the hacking resources that we talked about. And finally, if you want to find XR Access, you can reach our website at xraccess.org, you can find our email at info@xraccess.org, and join our Slack.

We really encourage everyone to join the Slack community and make sure that you're being a part of these discussions — that's at bit.ly slash XR Access Slack — and you can also join our newsletter, our LinkedIn, or our Twitter at the links on screen here. All right.
