IMMERSED IN: Fencing—Biofeedback & immersive simulations for sports training


Good morning, good evening, or good afternoon, depending on where in the world you're joining us today. Welcome to our hybrid live-in-studio and virtual broadcast from the Immersion Lab at MIT.nano. This is IMMERSED, our monthly series highlighting how immersive technologies, whether motion capture or AR and VR (augmented reality and virtual reality), can be used to invigorate and bring together opportunities in science, engineering, and art. Today we're joined by Robert Hubb, assistant fencing coach at MIT, along with Talis Reks and Praneeth Namburi, who will share some of the opportunities to use biofeedback and immersive simulation for sports training. This is enabled by the Immersion Lab, a central facility at MIT and a shared resource for you. So as you listen today to the exciting thoughts Rob has about bringing technology to coaching, look at the technologies and imagine how they could be used for your sporting, science, or engineering endeavor. Please ask questions in the chat as we go; we'll try to answer them in real time. But enough of the logistics. I'd like to turn it over to Rob, Praneeth, and Talis; I'll put my mask back on, and gentlemen, take it away.

Thank you, Brian. Welcome, everyone. I'm Praneeth, and here we have Rob. We're going to have this in the style of a conversation for a little bit. So Rob, why don't you tell us why you, as a sports coach, wanted to come work with the Immersion Lab?

Well, first I watched a demo of the capability for another sport, and that got me interested, because I had been thinking along similar lines on my own, along with head coach Jarek Koniusz. As coaches, we can measure some of the ancillaries of athletic performance. We can look at a fencer's record, whether a touch is scored in certain situations, wins versus losses against certain kinds of opponents; we can measure athleticism in terms of strength and speed. But there are some things we can't see and some evaluations we can't make, and it would be nice to have data behind it other than our own gut reaction. If we're good coaches, our gut reaction is pretty good, but there are sometimes discrepancies between what we train and how we see our fencers use what we train, and having data to support maybe some new training modalities, that's what I'm looking for.

Nice, thank you, Rob. So my next question: how can fencing benefit from biological measurements and feedback? I know you already touched on that a little.

Yes. Well, you can measure motion of the weapon and motion of parts of the fencer's body; you can measure general muscle tension and muscle contraction; and we'd like to look at force applied to the ground. Now, a lot of this has been done, but it's been done with a petri-dish mentality as opposed to in situ, and it's also not been integrated. There are studies of the speed of the point, the speed of the core, plyometric studies for explosiveness.

So perhaps an integrative theory of coaching, something that's verified and backed up by data?

Yes, and data where each piece is linked to another piece. We'd like to have it in real time. We can get these component measurements, and plenty of studies have been done, but having them integrated, having them what I would like to call phase locked, that's something that has not been done, and now we have the technology to phase lock these measurements.

Right, so we take different types of measurements, the video, the motion capture, the force and myography information, phase lock all of them, and see what's happening inside the body, so to speak, while we see what's happening from the outside.

Yes.

Cool, thank you.
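To make the phase-locking idea concrete, here is a minimal sketch of one way to do it in Python (the language the team says it uses for analysis later in the talk): resample every stream onto a single shared clock so a mocap frame, an EMG sample, and a force reading can be read off the same row. The stream names, sample rates, and random data below are hypothetical stand-ins, not the lab's actual pipeline.

```python
import numpy as np
from scipy.interpolate import interp1d

def phase_lock(streams, dt=0.004):
    """Resample sensor streams recorded at different rates onto one
    shared clock (4 ms here, matching the resolution quoted later in
    the talk). `streams` maps a name to (timestamps_s, samples)."""
    t_start = max(t[0] for t, _ in streams.values())
    t_end = min(t[-1] for t, _ in streams.values())
    clock = np.arange(t_start, t_end, dt)
    return clock, {
        name: interp1d(t, x, axis=0)(clock)
        for name, (t, x) in streams.items()
    }

# Hypothetical example: 240 Hz motion capture vs. 1 kHz EMG.
t_mocap = np.arange(0, 2, 1 / 240)
t_emg = np.arange(0, 2, 1 / 1000)
streams = {
    "tip_xyz": (t_mocap, np.random.rand(len(t_mocap), 3)),
    "emg": (t_emg, np.random.rand(len(t_emg))),
}
clock, locked = phase_lock(streams)  # every row now shares one timestamp
```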
So, over the past few weeks and months, we've been talking about a strategy for starting a collaboration, and the strategy we came up with was to capture the movements of more experienced fencers, use the captured movements to train less experienced fencers, and finally monitor learning using sensors. The overview for today's talk: we'll introduce immersive technologies, in case you're not already familiar with them; then we'll talk a little about our approach to sports training; and toward the end we'll do a live demonstration of a biofeedback system. For right now, I'm going to turn it over to Talis, who will introduce you to some of the technologies we're using in this project.

Hello, everybody. I'm going to explain a little bit about the immersive technologies we've incorporated into this project to make all of this happen. There are quite a few pieces here, so I'll talk about the hardware as well as the software involved. First up is the motion tracking system. We use our OptiTrack cameras combined with our Motive software to track targets that we put on the body or the sword. I'm going to play a short video where you can see how these patterns are expressed in a 3D world. With this system we can track every single point we mark and get a lot of data on how the markers are moving, with six degrees of freedom, within the physical world. On the left you see traces showing how the targets move through space over time, and on the right you can see the points on the body and on the sword that we target and pull that data from.
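For readers curious what "six degrees of freedom" means computationally: once several markers are rigidly attached to a sword or a body segment, a rotation and a translation can be recovered from their observed positions. Below is the standard Kabsch/SVD method as an illustrative sketch; it is not OptiTrack's proprietary solver.

```python
import numpy as np

def rigid_body_pose(ref_markers, obs_markers):
    """Estimate the rotation R and translation t mapping a rigid body's
    reference marker layout onto its observed positions (Kabsch/SVD).
    Both inputs are (n_markers, 3) arrays."""
    ref_c = ref_markers.mean(axis=0)
    obs_c = obs_markers.mean(axis=0)
    H = (ref_markers - ref_c).T @ (obs_markers - obs_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obs_c - R @ ref_c
    return R, t  # observed ≈ R @ reference + t
```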
Next is a 360 camera. We use this for our virtual reality simulation; it captures around 4K-resolution image data that we can transfer into a virtual reality environment. On the left you see the camera we use, the Theta, which is pretty simple to use, and on the right is a visualization of what users see in the virtual reality environment. This isn't a developed virtual asset; it's real-world video. I can play a little here, and you can implement UI elements within the 3D world to add more interaction to the experience. We'll explain more later in the talk about the data we can extract from these videos as well.

For those who don't know, I want to give a quick overview of virtual reality and augmented reality. Virtual reality is a complete virtual experience that fully immerses the user in a fully artificial digital environment. You find this in consumer products like the Oculus Quest and HTC Vive, and we use both of those hardware solutions in this project. On the left you have the Oculus Quest 2. It uses inside-out tracking, so you don't need external hubs to track you; cameras on the headset itself designate your play area. On the right you have the HTC Vive Pro, which has external cameras and controllers as well. What we actually utilize, and will show a little later, is that this Vive Pro has eye tracking, and we really want to bring that out in this project: when fencing, what are they looking at, and maybe more importantly, what are they not looking at? Then there's augmented reality, which overlays virtual objects on top of the real-world environment. You're not cut off in that sense; you still have an overlay, so you can still see where you physically are in the world. For this we use the HoloLens 2, and you'll see more examples; we'll be doing a live demo with the HoloLens 2 later in the session as well.

A few of the software solutions we used for this project: the game engine we use for the immersive simulations is Unity, a well-known platform that lets you bring all of these virtual objects together and build something the Oculus, the HTC Vive, and the like can render, so you can fully experience it. Blender is a 3D modeling package, very powerful and free, that lets you build complex 3D models for your simulations; for the fencing VR training application we built, we modeled assets in Blender for a more immersive experience. Third is Hololight, a software development kit we integrated into Unity that allows for remote rendering; it lets us bring the Motive data into Unity and then from Unity into the HoloLens 2 without skipping a step, all in real time. And lastly, we use Python for our data analysis. I wanted to give you this overview of the technology before we dive into how we use it in our approach to sports training, so I'll bring it back to Praneeth.

Thank you, Talis. I'm going to continue my conversation with Rob. Rob, I'm going to share with everyone some of the data we've collected so far and our approach. First, we captured the movements of experienced fencers. Do you want to talk a little about your experience when we did that?

Well, what we have here are our two former captains from the MIT fencing team, who have not trained a huge amount because of the pandemic. What we're doing is monitoring the motion of the point and the motion of the principal parts of the body. What you don't see, underneath the uniform, are some EMG sensors, and there are also force sensors in the shoes, so we can measure the impact on the ground, but we can also figure out which foot is pushing when and which is releasing when; we can look at the acceleration and look at the track. You could slow down the videos, and in studying the details, also speed them up, and when we get further into the next phase of the discussion we can talk about what we can do with that. But yes, we can slow it down.
And the other thing, which I mentioned before: as you can see in the graph, all this data is locked together, so we don't have to sit there and guess what's happening when. We can figure out what the arm is doing, actually what the individual fingers are doing in the control of the weapon, and we can look at this locked into the motion.

Yeah, we get about a four-millisecond time resolution, so you can really slow it down and know exactly the time delays between how the different components of the body are moving with respect to each other.

Right. And one thing that allows us to do, that coaches can't see: any athlete in a competitive situation is going to be self-aware, but not as self-aware as they could be if they were in an almost meditative state. We can now get information on things the athletes themselves can't report back to us, and things we can't necessarily see, because we can't see the muscle contract and we can't feel the force on the ground. We can intuit it based on our knowledge as coaches, but we don't have the hard data.
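With everything on one 4 ms clock, "which component moved first, and by how much" reduces to subtracting movement-onset times. Here is a sketch using a simple speed-threshold onset detector (one of many possible definitions of onset); `tip_xyz` and `hand_xyz` are hypothetical phase-locked arrays like those produced in the earlier sketch.

```python
import numpy as np

def onset_time(clock, xyz, frac=0.1):
    """First time a marker's speed exceeds `frac` of its peak speed;
    a deliberately simple movement-onset definition. `xyz` is (n, 3)
    sampled on the shared clock."""
    speed = np.linalg.norm(np.gradient(xyz, clock, axis=0), axis=1)
    return clock[np.argmax(speed > frac * speed.max())]

# With phase-locked tip and hand trajectories (hypothetical names),
# the tip-leads-or-lags question becomes a single subtraction:
# lag_s = onset_time(clock, tip_xyz) - onset_time(clock, hand_xyz)
```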
Right. So for today I'm going to focus on motion capture. As somebody who has never fenced before, when I look at this it looks very complicated; I don't know where to start. So one strategy, if I wanted to learn fencing, was to ask experienced fencers to break it down into simpler sequences. Before I get into that, I want to show Aaron and Taylor here, who interacted for about five minutes. I'm going to speed that up and show you a little animation of their trajectory interactions. To me this looks extremely complicated, and I don't know where I would start learning. Maybe an experienced eye could see much more, like maybe Taylor's tip is much farther in compared to Aaron's; I don't know. But what we wanted to do to help beginner fencers was first capture a small sequence of movements. So here is Aaron doing what I think I called a disengage lunge; I don't know if I'm using the technical terms. It's just a two-to-three-second video, and I wanted to learn this sequence. Normally what would happen is I'd watch this video, come back, try to play with it, and if I had a foil or an épée or whatever weapon, I would try to use it in that manner. But I'm practicing from memory, basically. So one thing I wanted to see, with Rob and with us developing all these technologies together, was whether there is actually a difference when I put myself in these different situations.

So here's me practicing. I'm looking at my spine, two specific muscles, the erector spinae and latissimus dorsi, as I move, and I can monitor the trajectories of my shoulder, elbow, foot, core, and the tip. Whatever you're interested in, you can put a marker there, and if you put several markers on big muscles you can also understand the stretch-contraction cycles; this goes back to the plyometric loads Rob was talking about. To assess the impact of immersive training on learning, we had three scenarios: practicing after watching a video, practicing while watching a video on a 2D screen, and practicing in VR. In the third one I'm wearing a headset and I'm immersed; it looks to me as if I'm facing an opponent. Now I want to know whether my reactions are different in each of these scenarios.

So if I switch to the 3D trajectories over here: this is the data brought into a different piece of software, and these are the different markers we monitored. Right now I'm plotting the 3D trajectory of the tip, and I do this several times, and I can play it out and see what it looks like. The orange is the sabre, and even though there's variance from trial to trial, all my movements seem to have a shape. Right now I'm none the wiser, but then I put myself in a new situation, practicing in front of a 2D screen; those are the purple trajectories. I do this several times again. The one thing that pops out, at least to my eyes, is that the starting position of the tip is very different when I practice from memory versus when I'm seeing an opponent, and the movement, at least to me, seems a little tighter. Finally, I put myself in VR and see how that compares.

Right, there's less variance in the tip motion in VR.

Yeah. All of these are with respect to my back foot. So even though it's a little tighter on the 2D screen, it still expands out a little bit in VR.
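The "tighter in VR" observation is made by eye here, but it can be quantified. Below is a sketch of one simple dispersion measure, assuming each trial's tip path is an (n, 3) array; resampling every trial to a fixed number of points is an illustrative choice, not the team's stated method.

```python
import numpy as np

def trajectory_spread(trials):
    """Trial-to-trial variability of a marker path. `trials` is a list
    of (n_i, 3) arrays; each is resampled to 100 points so trials of
    different durations can be averaged, then we report the mean
    distance of each trial from the mean path."""
    u = np.linspace(0, 1, 100)
    resampled = np.stack([
        np.column_stack([
            np.interp(u, np.linspace(0, 1, len(tr)), tr[:, k])
            for k in range(3)
        ])
        for tr in trials
    ])
    mean_path = resampled.mean(axis=0)
    return np.linalg.norm(resampled - mean_path, axis=2).mean()

# e.g. trajectory_spread(vr_trials) < trajectory_spread(memory_trials)
# would quantify the "less variance in VR" observation from the talk.
```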
I was curious about why that happened, so instead of looking at the tip, I looked at the hilt. The sequence was a disengage lunge, and when I look at what the hilt is doing over here, and I think this is a nice perspective, these are the trajectories of how the hilt moves through space. I don't see much of a disengage when I practice in front of a 2D screen, and in the case where I'm practicing by myself, I see somewhat of a disengage.

Yeah, but if the tip is moving too far, it'll take too long.

And in VR it looks like there's a nice balance. So somehow, intuitively, I've perceived a threat and I'm disengaging and lunging.

Well, this is kind of a spoiler for what we'll get to when we get into the VR, but one of the things you can see here, because you're looking at the variation between the tip motion, the motion of the guard, and the motion of the arm, is that this tells you something about the information you're sending to your opponent. When we look at what we're doing, we also want to be aware of what the opponent is perceiving; this is all data the opponent is reacting to.

Exactly. And knowing what we're sending as a signal gives us a lot more power as a fencer in terms of what we want to give. Do we want to give a signal, or do we want to give noise?

Right. We want to have some of the elements that convince them we're making an attack, and then I turn it into something else. My understanding is that's part of it, but also, if we're giving a signal with our hand first, we're giving one kind of information to our opponent, with the tip coming later. For certain kinds of moves this is what we want, but for other movements we want to give the tip motion first, and that subtle difference is something we want the fencer to be able to get for themselves. We can keep working on it as coaches, but self-awareness in the fencer is really what we want.

Right. So for me, since of course I had no idea I was doing any of this, having looked at this data I'm more encouraged: oh cool, I didn't know I was doing that; let me see what's happening in my body when I'm actually practicing, and see if I can catch it with my awareness in the moment, so I have better control over my own movements.

Sure. And not only that, you also have more options. I mentioned this when we were discussing earlier: let's say we have three velocity components to hit our opponent with. We've got tip speed, we've got arm speed, and we've got core speed. In the right-of-way weapons, two of the three weapons, we want to initiate an action of threat; we want to make our attack, so our tip has to do something first. Beyond that, it's up to us to decide what we're going to do with those three component velocities, and this allows us to be very precise in the measurement.

Right. So this is the trajectory of the core. I don't know if there's an ideal trajectory; I can't notice much of a difference, but certainly we haven't analyzed the data all the way.

No. For people watching, this is very raw stuff; we have not yet gone in to try and tease things out of this.

Praneeth and Rob, a quick question, the voice from above here: is it true that you're also trying to teach the dynamics of the weapon, so that you can get the point to lag what you're doing, in some sense?

Do I want the point to lag?

I guess, in the coaching, do you use the dynamics of the weapon to your advantage, the fact that it's flexible?

Okay, yeah. There are certain kinds of actions, since the blade is flexible and there's a mass at the end; with the exception of sabre, where there's just the curve of the tip. But yes, you can work on the dynamics, and that's something we would like to study. We've got the trajectory, but we can also measure the muscle contraction that's leading to it. Praneeth is looking at the motion here, but as I said before, we have a lot of data we can overlay on this: we can look at the force applied to the floor, we can look at the muscle contraction in pretty minute detail, and one thing Talis brought in that we haven't touched on, and will touch on at the end, is that we can look at where the fencer looks, where they're gazing.

Let's look ahead just a little bit: once we have this data, we are going to talk a little more about avatar creation.

Yes, okay, then I'll bring that up once we talk about avatar creation. But there are, like, two phases in the analysis here, and what we're looking at is the first phase.

We're going to do a little bit of a live demonstration. For this, we're going to show you somebody in a motion tracking suit using a sabre; sorry, a foil. What you see here are points being tracked live on the person you see in the background, and what Talis is going to do right now is give real-time feedback of the positions of the points to the person who is fencing, so they can look at their own movements. What you'll be seeing on the TV screen is the first-person view of the person wearing the headset, so you'll get to see a little of what they're experiencing, and then you'll also get to see what they're doing.
Right, so those are the points the person is wearing, and they can see them in real time; each dot represents one of those markers. And as he moves around, the tip...

Yeah, excellent, that's good. Can you tell that you're the one moving?

I can. I know the tip is a little off screen, but right now the point is to look at how your body is moving.

Yeah. We can obviously fine-tune exactly what we show, and the goal of this is really to filter out information that is not relevant to a specific type of training and only show information that is relevant, and that, I think, really comes from the coaches. So here you see the spine, and here's the right arm, and as he projects forward with a lunge, you can actually see that. Now that we have this data we can add visualizations for the player to see what they're doing, and we can talk about range of motion, distance, and all that kind of thing, depending on what's important for the user.

Alternatively, I can bring it onto my face. Let's say I'm a beginner fencer and an experienced fencer is wearing the suit, and I want to see how specific points on their body are moving. So as they move (go ahead, make a move), I can try to study just the relationships between specific points. Is the video clear, Talis? Okay. So these are a couple of ways we can use this augmented reality biofeedback system, and I'm going to switch back. Thank you for your brave participation.

All right, so now Talis is going to talk a little more about a few different systems. Each of these systems has a different type of benefit. The immersive virtual reality system's main benefit is to give learners a realistic experience, so they can learn under pressure, or tension, if you will. The purpose of the augmented reality system is to focus on a specific move until you get it right; we've tested it a little with other sports, and people who use it seem to really like it. Talis is going to talk a little more about what he's doing in this domain.

Right. So what you all saw earlier was a 360 video capture, essentially taking the 360 video recording and assessing how people move within the space. If you want to take that one step forward, you can build out a simulation for a bit more active participation by the fencers. So it's not necessarily a video but a simulation, designed specifically to monitor biofeedback within fencing. I'm going to show you a little clip here, done in virtual reality, so what they're seeing is the virtual reality environment. Essentially we can build out models and develop a simulation training platform that is responsive to the person, the user, within the experience. Within this video we've created a virtual reality simulation, and what we'll ultimately see is eye tracking. Eye tracking is very important for this kind of analysis, because we can pinpoint exactly where they're looking and why that matters; maybe Rob can discuss why we might want to monitor a player's eye tracking, depending on what they're looking at.
So this is kind of the next step. We've got the data; we're creating an avatar, if you will, and we're feeding that avatar back in through the VR headset. We know what we're feeding in, and it's time locked. Then what we can do is put the same kind of monitoring devices on our fencer wearing the VR headset and study how they react. Let's say we start with the avatar moving at a moderate pace, something they're comfortable with; we study their reaction and log it. Then we can raise the tempo, ultimately, depending on the scenario, up to a world-class tempo. We want to do two things. We want to study how our fencer reacts to this: do they change as the tempo goes up (of course they will), and are they changing appropriately? And we also want to look at where they're looking; we're going to gaze-track them. We may find that for certain athletes there's an ideal gaze track, a map, if you will, and as the tempo goes up, under stress, that tracking may degrade along with other physiological responses. In certain high-stress environments they talk about stress inoculation, and they do this to a degree in VR, but we want to push it a little further. As I say, we know what's going in, we know the phase locking for it, and we can monitor how they respond. That information, how our athletes respond, is data we as coaches can use to modify their training, figure out where their weaknesses are, and move ahead from there. This adjustable tempo gives us things we can't get otherwise, unless we could basically unzip a can and keep pulling out Olympic-level fencers for our fencers to fence against.

Absolutely. Thank you. So, if everyone can see my screen, hopefully it's displaying now. This has been developed a bit more toward active participation, so hopefully, if we get this working, what you'll see is an avatar playing the part of a fencer. We've developed this avatar pretty cleverly: we have a system called LensCloud, a photogrammetry system that lets you scan a real person and get really photorealistic textures and body captures, which we can then put into a virtual environment. So instead of having to model the avatar ourselves, we can just scan someone and place them into these environments. I think we should be good now; can everybody see the screen? It's coming through perfect. Awesome.

So now I'll press play, and we have a couple of things going on. We have the avatar in the center, and some visualizations on the right side. We have a playback speed; right now it's static, but you'll ultimately be able to alter how fast the avatar is moving. Right under that is a header called timers, and what this is assessing is how long a person looks at each place on the avatar. We have a head marker and a sword marker; you can't see them, they're embedded in the avatar, but this can give visualization cues to the player, or even a coach, and let you really assess what you're looking at over time.
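Those gaze timers are conceptually simple: per rendered frame, the eye tracker reports which region (if any) the gaze ray hits, and dwell time accumulates. A minimal sketch, assuming a hypothetical per-frame list of region labels rather than the Vive Pro eye tracker's real SDK output:

```python
from collections import Counter

def gaze_dwell_times(gaze_regions, frame_dt):
    """Accumulate seconds of gaze per region. `gaze_regions` has one
    label per rendered frame, e.g. "head", "sword", or None when the
    gaze ray hits nothing tracked; `frame_dt` is seconds per frame."""
    counts = Counter(r for r in gaze_regions if r is not None)
    return {region: n * frame_dt for region, n in counts.items()}

# Hypothetical data from a 90 Hz headset:
frames = ["sword", "sword", "head", None, "sword", "head"]
print(gaze_dwell_times(frames, 1 / 90))
# {'sword': 0.0333..., 'head': 0.0222...}
```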
On the left you'll be able to change the moves we discussed, the ones you saw previously, so if you want a bit more attack or defense, we can monitor that and change things on the fly, to really assess how they're performing.

On top of that (let me go to the next slide), we actually don't need to rig the character or author the character's actions in the 3D software; we can have our Motive tracking system do that for us. This is a pretty awesome video here, where we're transferring the data we got from the Motive tracking software into Unreal Engine. Unreal Engine is very similar to Unity in that sense, so we can take that and bring it right into Unity and use these movements to drive the avatar, if you will, in the VR simulations. If we want real-time feedback, or want to understand the really small, minute changes that someone who is fencing makes with, say, their legs or their arms, we can capture that and bring it into the virtual environment, which could be pretty informative for everybody.

Some other things we've been thinking about involve haptic feedback. A lot of people have been asking: if you had an active simulation, could you feel a response from the virtual avatar? We're diving into solutions for that. Say the virtual avatar hits you in the leg: would you be able to physically feel it? That's something we really want to put in place, as well as combining this with the EMG and the EKG, to really understand the physiology behind these movements, along with the eye tracking. So all these different components of biofeedback could be encapsulated into this experience and brought out, live, for the training module.
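On the EMG side, a usual first step toward "understanding the physiology behind the movements" is extracting an amplitude envelope that can sit on the same phase-locked clock as the mocap data. Here is the standard textbook recipe (band-pass, rectify, low-pass) as a sketch; the cutoff frequencies are common defaults, not values from this project.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(emg, fs, band=(20.0, 450.0), lp=6.0):
    """Classic EMG envelope: band-pass to keep muscle activity,
    full-wave rectify, then low-pass to get a smooth activation
    curve that can be phase locked against motion capture."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], "bandpass")
    rectified = np.abs(filtfilt(b, a, emg))
    b, a = butter(4, lp / (fs / 2), "lowpass")
    return filtfilt(b, a, rectified)

# e.g. envelope = emg_envelope(raw_emg, fs=1000)  # hypothetical 1 kHz EMG
```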
So I'm going to bring it back to Rob, and we'll go into the discussion. If you have any questions, we'd love to hear them, and thank you all for joining.

I just wanted to make one comment about the haptic feedback suit. Praneeth was talking about one of the techniques he's encountered as a competitive dancer: you sometimes adjust the stance of another dancer, and this makes them aware of their own position. One of the things we discovered scanning some of the sensors was some body alignment issues that we've been working on, which show up very clearly in the 3D scanning process. So one thing we might consider doing with the VR and the haptic feedback, which we did not talk about, is adjustable avatars, where fencers can see their own avatar, see how they're standing, then reach in and adjust that avatar and feel the haptic feedback in the suit. It's almost like: oh, I'm my own coach, I can adjust this, and that gives them more awareness. What we're hoping is this: many athletes over time develop certain kinds of repetitive motion injuries, or they're not aware of destructive tension in their muscles, and we'd like them to get the feeling of, oh, I'm not doing this properly, I'm prone to injury; or, I'm coming back from an injury and the physical therapist has said to do this, and now I have to know what that feels like when the therapist is not there.

I'm going to come back in here and join the conversation; see, me, live. Rob, thank you, that was wonderful, and Talis too. So I think there's the aspect of the coaching; how do you envision the athlete using this? It's a tool that gives you quantified information; does it make guidance more quantifiable, and how difficult is it for the student to do that themselves?

Well, if we have access to this information, we want them to have access to this information. The thing we talked about, where an athlete is watching the avatar and then getting their own feedback on how they react in certain stress situations, I think that's one of the main things we want them to do: we want them to get this stress inoculation, and we want them to be aware of, oh, I start shifting my gaze when I shouldn't shift my gaze. And what we're talking about in terms of fencing can be applied to anything. An example I gave: let's say you've got somebody who's a batter and they're on a hot streak, and you've recorded all their mechanics; now they go into a slump, and you record what's going on in the slump. Then you can compare the two sets of mechanics and give immediate feedback to the batter. Or you've got somebody who's really good at free throws, and then they start missing; well, maybe their gaze has shifted, and it only takes that slight variation in the gaze, under the pressure of competition, to make someone miss. So these techniques are not just for fencing.
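No particular method for comparing the "hot streak" recording against the "slump" recording is named here, but dynamic time warping is one standard choice, since it scores the shape of two movements even when their timing differs. The sketch below is an illustrative pick, not part of the project:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two trajectories,
    shaped (n, 3) and (m, 3): a duration-tolerant way to compare,
    say, a hot-streak swing against a slump swing. Plain O(n*m) DP."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Tracking dtw_distance(reference_swing, todays_swing) across sessions
# would flag when the mechanics, not just the results, have drifted.
```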
Just to add to that a little, and to summarize: our hope is that we can cover an observation gap and an awareness gap for athletes. By observation gap, we mean we can provide tools whereby you can look at the same thing in very different ways and parse it out, and that will also help us communicate with the athletes better. And for the awareness gap, the tools address the case of: I think I'm doing something, but I wasn't actually doing it; my coach said it; why is my coach repeating the same thing? I've encountered this problem many a time in dancing.

And one of the exciting things, quite frankly, is that these tools have been around for a long time, but accessible only to Major League Baseball or the NBA, the sports and teams that have the big dollars. Having a facility like this doing full motion capture gives us a platform to engage with the university, but the types of approaches are also not that far from being deployable on cell phones; you can do motion capture quite well on mobile now. So there's the pristine version of what you can do in the Immersion Lab, and then there were some questions along the lines of, how does this get deployed? The technologies are now available, or at the cusp of being available, so we can take the learnings and the approach and not just stay in the Immersion Lab, but actually be out with your phones and your iPads in the field, or in the arena where you practice your sport.

Let me see if we have any questions. I'm going to open up the chat; how are we doing on time? Good, we'll take a few questions from the audience.

This is pretty cool: is this monitoring technique applicable to improving technique in playing musical instruments, piano or keyboards?

Could you repeat the question?

The question was whether these techniques could be used for playing an instrument, with motion tracking for piano, I think it was. The short answer is yes. I'm not a piano player myself, so I can't speak as one, but you can imagine these tools being used in some way, with various levels of success, any time you're trying to teach repetitive motion and to quantify the motion and how to move your body; whether that happens at the scale of your spine or at the scale of your fingers, the same technologies, with different form factors and the different resolutions you would need, would certainly be very relevant.

Are you looking for high school routines to test on? An interesting question. Send us an email, immersed@mit.edu; we'll put that on the slide at the end. Certainly the Immersion Lab, as Rob now well knows, is a shared and central facility on MIT's campus, but we don't just support the MIT campus; we're open to the community. It's a little easier to work with us if you're proximal to us, but we have tools for both data visualization and data processing that might also be applicable. We want to support the community, understand the needs, and figure out the research questions that we can work to support and sponsor for our students and our investigators.

Do you have a video of a session with an avatar that you can show? One that's in practice, in process? Stay tuned on the IMMERSED web page; we'll continue to post assets and stories there.

I think we're probably good on time, to give people a little time back at the hour, unless there's one last question to pull up. How can someone help or contribute to this project? Reach out to us. In particular, if you're interested in fencing, certainly reach out to Rob; we'd love to hear from more fencing coaches. I think Rob has a phenomenally forward-looking vision for how to do this with our student athletes. We need to impedance-match to the individual athletes but also to the individual coaches; coaches are going to use this in different ways, they have different styles, and we would love to have that community and support Rob in engaging that conversation. Rob has been very good about reaching out to the other coaches at MIT, and we started this, if you think back, when Rob reached out to us just after our baseball session with MIT coaches. So we would love to talk with additional sports, and that recording is indeed available if you're interested in the baseball one.

Another question; oh, two more minutes, okay. This is really great stuff: what would be required to implement something like this in a club? Hardware, software, money? Yes, and people. Depending on the fidelity of what you need, you could do some of the motion capture with your mobile devices on tripods, and there are portable versions of what we have here available. And certainly the younger generation is increasingly familiar with this from gaming and what's available; you can tap into your students, the youth, who know how to deploy the technologies. But we're here to help as well.
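As a concrete example of the phone-grade option mentioned above, here is a short sketch using MediaPipe's pose solution, one freely available way to pull body keypoints out of ordinary video. The file name is hypothetical, and marker-based systems like OptiTrack remain far more precise; this is a sketch of the low-cost end, not a substitute for the lab's setup.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture("club_practice.mp4")  # hypothetical clip

with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV decodes frames as BGR.
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            wrist = result.pose_landmarks.landmark[
                mp_pose.PoseLandmark.RIGHT_WRIST
            ]
            print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")  # normalized
cap.release()
```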
Can this be used to review live competition footage after the fact? If it were recorded, if we had 3D capture during the competition; maybe I'll turn that over to Rob. Would that be something interesting to do, maybe in a mock competition?

I think it would be easier to definitely try it in a mock competition first, maybe during a varsity practice or that kind of thing. There is going to be a lot of data that comes out of that, so data storage is going to be a big deal, and there are some periods where there's motion back and forth that's valuable, but we might want to drop some of it. I think the short answer is yes, we could try to do that. We would have to make sure the sensors, the markers you saw, are positioned where they won't be damaged by the impact of the weapon, and you might have noticed that parts of the guard are blacked out; that is an infrared issue. With modification, I'm sure we could come up with something. One of the things we're trying to do is order in some equipment that is dedicated to this fencing project.

Right, and you bring up another good point that I'll use as a closing comment, on the data side. I like to describe the Immersion Lab as being the data interface to MIT.nano, and that means a couple of different things. It means having AR and VR capabilities to visualize data on a big screen, to visualize captured data and motion capture; but also, as we collect data for many athletes, across different events, different sports, and different modalities of doing this, you do expect big data to happen; you can't do data anymore without it being big. So there is that data analytics piece: there's what you can do for the individual, and then there's what you can learn by looking at a population of individuals, and being able to collect this data with several teams, whether at MIT or with student athletes in high schools, would be a very exciting opportunity for us as we look at the next steps of the data analytics that come after this. So please reach out to us; we have immersion@mit.edu.

And I want to thank you again, Rob; wonderful, thanks for reaching out at the very beginning, doing all this experimentation, and leading this conversation. Praneeth, Talis, always a pleasure. Friends, please stay tuned for the next IMMERSED, and have a good day. Thank you, and stay safe.
