EmTech Next—Imaging Impact: What becomes possible when you can see below the surface


Welcome, everyone, to the Inside Track. Our session today is Imaging Impact, exploring the cutting edge of ultrasound technology, with Brian Anthony. Brian wears many hats at MIT. He's a principal research scientist whose academic homes are the Department of Mechanical Engineering and the Institute for Medical Engineering and Science (IMES), as well as the Clinical Research Center, which focuses on health and wellness. He's also the associate director of MIT.nano, with a particular affinity for the Immersion Lab and the data side of nanotechnology, and he's a faculty lead for the MechE Alliance, which builds connections with industry for both education and research. With that: Brian, what can you tell us about the latest innovations in ultrasound technology?

Siobhan, thank you for the introduction, and good morning, good afternoon, good evening, everybody, depending on where in the world you're joining us from. I'm joining you virtually from two places at MIT. MIT.nano's Immersion Lab is an MIT central facility, open to the MIT community and outside users, heavily instrumented so that the human can be both the specimen and the instrument; the Immersion Lab is an instrument enabling immersive technologies and new modalities of interacting with data and the world. Similarly, the Clinical Research Center at MIT is another central facility, again open to the MIT community and outside users, broadly supporting human subjects research at MIT and with our collaborative partners in industry and academia. Together, these central facilities allow us to be very innovative at MIT in developing sensors and methodologies to get better data.

First, I want to tell you three stories. The first is handheld large-volume imaging. Ultrasound imaging in its conventional form, where you're holding an ultrasound probe, suffers from variability: variability associated with patient motion or doctor motion. If I want to track a tumor, an object, over time, how do I know that the image I acquire today and the image I acquire a month from now, or six months from now, is at the same location on the body, acquired in the same way? The answer is: you don't. So we're addressing this problem of variability by augmenting the imaging process. One example: we add cameras to the outside of the ultrasound probe, and we solve the problem that Google solves when they drive cars around through the world with one camera or more; we simultaneously map out the world, where the world is the patient. The camera attached to the ultrasound probe acquires imagery of the skin, or of things right below the skin, while the ultrasound probe acquires slices inside the body. We use these together to create dimensionally accurate volumes, analogous to what you can see with MR or CT.

Some of the imagery we're seeing here: on the upper left is a conventional ultrasound image, I believe a slice taken through the liver. The central and upper-right images are of the skin; the center one is the micro-relief of the skin, and the upper right is the vasculature. Those are seen with the camera; the ultrasound image on the left is seen with the probe. We take this combined imaging system, cameras plus ultrasound probe, and slide it along the body, mapping out the world and stitching together, in a dimensionally accurate way, these singular two-dimensional images of either properties or structure, such that we can create a full whole-organ volume, in this case of the liver. This video shows a little of the behind-the-scenes of how the data is acquired. We slide along the body; a probe can be scanned in a number of different ways. We create this map of the world, of the external aspects of the body, and then stitch the ultrasound slices together in a dimensionally accurate way, so that we now have a volume interpretation of what's going on in the body, not just a singular slice, but dimensionally accurate volumes.

So, taken together: we combine skin images, ultrasound images, and the algorithms used in mobile robotics to create a dimensionally accurate whole-organ view with a fundamentally safe imaging modality, ultrasound, giving structure analogous to what you would see in MR or CT. One of the really exciting switches, if you will, is that the same technology, motivated by solving a real problem of variability in medical imaging, can now be applied to things like inspection of a fuselage, where an inspector may be going around looking at sections of an aircraft, wanting to map out whether there are defects, whether there are cracks that are growing, and to monitor those over time. Combining a handheld magnetic sensor with a camera that looks at the external structure, we can create volumes that give us a view below the surface. So that's one way we acquire volumes with a handheld instrument.

Another theme: let's stick with volume imaging, but instead of doing it handheld, while keeping the low cost and a very targeted use, let's look at conventional ultrasound systems on a gantry, on a robotic system, to again acquire volumes. The application that motivates this research in particular is prosthetic fitting. If you are wearing a shoe and walking, you know that if the shoe does not fit well, it doesn't take long before you have a blister and are very uncomfortable. That problem is magnified by an order of magnitude.
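The volume-stitching step at the heart of these handheld and robotic scanning stories, placing each tracked 2D slice into one shared, dimensionally accurate 3D frame, can be sketched in a few lines. This is a minimal illustration, not the group's actual pipeline: the function name, the nearest-voxel placement, and the simple averaging of overlaps are all assumptions made for clarity.

```python
import numpy as np

def stitch_slices(slices, poses, volume_shape, spacing=1.0):
    """Place tracked 2D ultrasound slices into one shared 3D volume.

    slices: list of (H, W) intensity arrays.
    poses:  list of 4x4 probe-to-world transforms (e.g. estimated by
            the camera riding on the probe).
    """
    volume = np.zeros(volume_shape)
    counts = np.zeros(volume_shape)
    bounds = np.array(volume_shape)[:, None]
    for img, pose in zip(slices, poses):
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # each pixel lies in the probe's local z = 0 plane
        pts = np.stack([xs.ravel() * spacing,
                        ys.ravel() * spacing,
                        np.zeros(h * w),
                        np.ones(h * w)])
        world = (pose @ pts)[:3]            # map into the shared patient frame
        idx = np.round(world).astype(int)   # nearest-voxel placement
        inside = np.all((idx >= 0) & (idx < bounds), axis=0)
        np.add.at(volume, tuple(idx[:, inside]), img.ravel()[inside])
        np.add.at(counts, tuple(idx[:, inside]), 1)
    # average overlapping contributions; untouched voxels stay zero
    return np.divide(volume, counts, out=np.zeros_like(volume),
                     where=counts > 0)
```

A real system would interpolate rather than round, regularize the probe trajectory, and compensate for tissue motion, but the essence is the same: tracked poses turn independent slices into one consistent volume.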
If you're an amputee with a residual limb, how can we make a perfect custom fit of the sock, or the shoe, that attaches to the subject? Here we've developed a system that is a low-cost reuse of a conventional ultrasound imager. We put it into a water tank along with the patient's residual limb and scan the ultrasound probe around the limb. To keep it low-cost, we use a conventional system, and to make it comfortable, we don't want to constrain the patient, so the patient may be moving. We therefore integrate the ultrasound imager with a camera: a three-dimensional camera in the bottom of the tank tracks the limb's motion while the ultrasound probe is flying around.

Let me show you what that flying around looks like. Here is the system in practice during a human subject study in the Clinical Research Center at MIT. The ultrasound probe is flying around, and if you look at the bottom of the tank you can see the camera looking up, tracking the subtle motions of the limb. We combine all that data and can construct volumes very analogous to those you would acquire with a much more expensive, much more time-consuming imaging modality such as MR. From that ultrasound data we can do two things. We extract the geometry, both of the external aspects of the tissue and of the bone interfaces, and we also extract the tissue properties, the elasticity, the stiffness of the tissue. Taking that data, in collaboration with Hugh Herr at the Media Lab, for example, we can then design a custom socket that is 3D printed based on the properties of the patient. We create a 3D socket that is stiff where it needs to be, soft where it needs to be, a perfect geometrical fit, and that serves the purpose of a custom, low-cost way of developing this perfect match of the shoe, or the sock, to the amputee.

Now, that same technology, this idea of scanning around a volume and from that data estimating structure and properties, we're also applying to imaging of pipes, for example in oil delivery. A low-cost imager, or in this case an array of imagers around the pipe, allows us to track in real time, as material flows through the pipe, the mix of water, oil, and particulate. Again, the work was motivated originally by a medical application, but the technology, having solved a real problem and a need there, is now being repurposed for manufacturing technologies.

The last quick story I want to tell you, and I'll turn it over to Siobhan after that for questions, is laser ultrasound, or how we can image at a distance with sound. For decades, laser ultrasound has been used for non-destructive testing, meaning we can take light, shine it on an object, and create sound in that object. Here we're looking at a silicon wafer. The wafer, for example, will undergo rapid thermal processing; during that processing the temperature changes and chemicals flow over the silicon to change its properties. We would like to know the temperature of the silicon, and we'd like to know if there are defects in it, but we can't touch it, and we're looking at the wafer through a window. This is appropriate for silicon; it's also appropriate for forming of metals, aluminum or steel. I want a non-contact way to interrogate with sound.

So here we use two lasers. One laser is a pulsed laser, analogous to turning your flashlight on and off very quickly. We take that pulsed laser, pass it through a conical lens, and form a ring, and this ring of light pulses onto the surface, in this case silicon. Where that light falls, it very locally heats the wafer, heats the material. That heating creates a stress gradient, and that stress gradient then needs to propagate; it propagates as sound. So with light we've banged on the material's surface and generated sound. Then we listen to that sound: we take a continuous-wave laser, shine it at the center point of the ring, and with an interferometric technique we detect the vibrations of the surface. We receive that sound wave with light, never contacting, purely with light, and we can back out the elastic constants and the properties of the silicon, or the defects in the material.

Now, the exciting thing is that if you then pick an eye- and skin-safe laser, you can use that same phenomenon on people. We had a paper out in Nature two years ago now where we take two laser beams, shine them on a person's body, and scan them across the body, again relying on this effect: light going on and off creates a sound wave in the body, and we use a second laser to listen to the sound at a distance. We shine the light and scan it across the body; light falls on the surface, sound propagates into the tissue, sound propagates back out to the surface, and light then detects the vibrations of the surface. Scanning across the body, we can create a laser ultrasound image acquired two meters away from the person. And here, back to the original aspects of the story, we're showing that non-contact laser ultrasound image side by side with what you would acquire at the same location on the body with a handheld ultrasound probe.

So I just wanted to share with you three little stories in imaging of what's possible, uniquely enabled by the central facilities we have at MIT, both the MIT.nano Immersion Lab and the Clinical Research Center. Siobhan, thank you for the introduction; happy to take questions from you and from the audience.

Thank you, Brian, fascinating. It's interesting to see the evolution from the conventional contact ultrasound technology, where you're augmenting and automating to some degree, improving not only diagnostic capacity but also improving
workflow, I would imagine. And then breaking the paradigm, going to non-contact laser ultrasound. What do you see as perhaps the most compelling application, or a couple of key applications, of non-contact laser ultrasound?

I think there are many. Certainly, as we work to address the issues of variation in ultrasound imaging, eliminating the contact eliminates the patient or doctor motion and the variable pressure that may occur between patient and doctor. But another very exciting area is imaging in a fundamentally safe way, not ionizing, not radiation, of tissue that's been damaged, whether burned, or tissue that you can't easily access during a surgical procedure. Being able to integrate, whether on the battlefield, or for a burn victim, or in the surgical suite, imaging modalities that can be that third arm to image the patient internally: those are several of the exciting areas I think will be uniquely enabled by these technologies.

Right, that sounds promising indeed. A question from the audience regarding the robotic use of ultrasound technology in the water tank for prosthetic modeling: does the water tank help with getting more angles? What's the advantage there?

Yes. The benefit of the water tank: with conventional ultrasound, we're imaging at megahertz frequencies that don't propagate through air, so if you're using a conventional probe you need gel to make contact between the probe and the patient. Here, the water serves as the coupling between probe and patient. But the other thing the water does is eliminate the need for contact and the local deformation of the tissue, because what we would like is an undistorted, unperturbed image of the geometry of the external tissue of the residual limb, and of the distances between that external tissue and the bone. If you were making contact, you naturally introduce aberrations and errors that are possible to correct for, but it's far easier to make a robust system that never needs to deal with those issues of error.

Right. And then, with the laser non-contact ultrasound: you've been working with lasers for most of your career, and one of our questions, from Shannon, is whether you can describe more about how you got here. What worked, what didn't, how did you iterate and improve, and so on? What has your trajectory been, and what have you seen along the way that has really surprised you about the technology as it advances?

For ultrasound broadly, or laser ultrasound in particular?

Well, laser, I guess.

Okay. Lasers have been used for such imaging for decades in non-destructive testing. One of the exciting things happening now, and this was done in collaboration with folks at Lincoln Laboratory and on campus here, is the miniaturization of optics and light. The same way we're able to miniaturize electronics to make our cell phones and our earbuds, integrated photonics uses the same technology stack, the same design tools, the same fabrication technologies, the same materials, to take big lasers and big detectors and put them on small chips. The needs motivating those integrated photonics innovations are things like the data center, where light is a much faster, lower-loss way to communicate across distributed cores, and also self-driving cars, with lidar as one motivating use. Those big-market technologies and needs are driving the innovations in integrated photonics, small receivers, detectors, transmitters, ways of propagating light on substrates, that give us the feasibility of taking systems that have historically been
only realizable in the factory or in a big lab, and actually putting them into a handheld device, or a device that easily fits in the surgical suite.

And how far along are some of these technologies? With the laser non-contact ultrasound, and also the imaging for prosthetic limbs, Matt asks: is it already en route to organizations? When are these things going to market, and when can we take advantage of them?

It's a big question, and it's a long answer; it depends on what we're trying to do. Generally speaking, many at MIT will quote the fact that it takes 15 to 20 years for an innovation to get out of the lab and into the marketplace. We're benefiting, though, from a long history in ultrasound and from the new innovations that are happening. For something like the prosthetic fitting, which is a novel reconstruction of conventional imaging, that's commercially ready now. It's not deployed yet, but it's commercially ready; the imaging tools don't need to be created. For the fitting process, there is a company I know of that spun out of the Media Lab that's looking to license both the imaging techniques and the modeling techniques that allow us to take that image data and create the fit. The real innovation that still needs to happen for prosthetic fitting is the detailed understanding of the appropriate modifications: once you have a model of the limb, do you want an exactly conformal fit, and where does it need to be tight, where does it need to be snug? Those are the things that really need to be worked out to get that aspect into the world, but the imaging solution is available now; if there were a market and a demand, it's low-cost enough that it's achievable now. On the laser side, to make it fully deployable at large scale, there needs to be further miniaturization of the laser sources and the detectors. For high-value uses, like the surgical suite or military applications, it's not there yet, but I think the time is probably five to ten years away, certainly less than 20.

That makes me think about industry and some of the collaborations that need to happen. In your research you have industrial partners, and you have partners in clinical settings, in hospitals, and then you bring those together with the MIT research aspect. Can you talk a bit about how it is such a collaborative venture?

One of the exciting things about working closely with both the MIT.nano Immersion Lab and the Clinical Research Center is that they are central facilities available to the community, and that community is both the on-campus community and external partners who may want to come in through a research collaboration with us, or to solve their own independent needs. In the healthcare environment in and around Boston, we're blessed with a density of hospitals: Mass General Hospital, Brigham and Women's Hospital, Children's Hospital, which is a five-minute walk away. We can work with the clinical collaborators to understand the needs and the motivations and to get into the clinical environment, and for the things that aren't appropriate to put into the hospital, we have the Clinical Research Center, which allows us to reconfigure and have either home-like or hospital-like environments where we can pilot technologies on campus, bringing the doctors to us. And the Immersion Lab gives us this fully instrumented room that allows us to do motion capture of objects and people, to display data at the human scale, and to interact with it and understand it in a way that's far more organic
than staring at your screen, either in front of a little device or at a big screen. So, a long answer to your question, but it's the ecosystem around Boston; we're blessed, and very fortunate, to be able to work with external partners who know how to translate technology and who come with both challenges and opportunities.

Yes, it sounds like a nice collaborative nexus. I was interested in the Nature paper, whose title is "Full non-contact laser ultrasound: first human data," the "first human data" part being key. You had to go before a committee, and it's such a big step. Now that you've gone past that step, does it smooth the way for the advancements to come in bringing that technology forward, or do you have to keep going back and getting approval for protocols again and again?

There are a couple of aspects to that question. Any time you're trying to innovate in medicine, you need evidence. Whether you're an academic researcher or a commercial researcher, you need to collect data, you need to publish, you need to demonstrate how the technology works, making that open and available to the world. That's an important aspect for industrial and academic researchers alike. But when you're doing human subjects work, you want to make sure you're protecting the privacy of the volunteers or patients enrolled in the study, and you need to make sure that things are fundamentally safe. COUHES at MIT, our institutional review board, will review the protocols, the things you want to do, and make sure that what you're doing is safe. They'll say, hey, let's make sure the laser energy you're delivering truly is eye- and skin-safe, and there may be paper analyses or experiments that need to be done to verify that. And for the data you're collecting: are you potentially revealing anything about the subject that should not be made available, that is confidential and private information? Those types of protocols are important to ensure the safety of the process and of the subject involved, and the validity and veracity of what you're doing, so that you collect a body of evidence demonstrating that this thing doesn't just work on one person but on many. So you always need to continue doing your research. In the commercial clinical space, in medical technologies, whether devices, algorithms, or pharmaceuticals, you will always be collecting more data to demonstrate that what you're doing is safe and effective, and things like the institutional review board help keep that running smoothly.

Right. And what are you most excited about going forward? What's next on your agenda?

Well, if you couldn't tell from my talk, I very much like this interplay between manufacturing on one side and medicine on the other, and one of the really exciting things about the collaborations between the Immersion Lab and the Clinical Research Center is the way we can interact with data. I describe the Immersion Lab as the data interface to nanotechnology. It allows us to take big data, big volumes of data at the scale of Mars, for example, and bring them down to the human scale, whether on headsets or on wall-mounted displays, and to interact with that data the way we interact with objects. We move around, we gesture, and you don't have to just point and click but can naturally stream through the data, whether it's big-volume data brought down to human scale, or small data, microscopy data, medical imaging data, that you blow up to the human scale. If I can capture how a person moves around and put them into the data, I can interact with that synthetic world the same way that I'm gaming. The same AR and VR, augmented reality and virtual reality, tools for gaming give us a new way to do research, to understand our data, and to explore. So I think it's that intersection, using these gaming technologies in our research and education, enabling innovations in manufacturing and innovations in medicine, where I spend a lot of my time.

And are you constantly on the lookout for these cross-pollinating applications? Are you surprised when they happen, or how do they unfold, generally?

I guess I'm not surprised, and I do go out looking for them, but you don't need to look that far. Opportunities like this, speaking to a diverse community: keep your ears and eyes open, and your fingers out and open. It's the cross-pollination, the team science, of mixing a mechanical engineer with a computer scientist with a doctor who says, hey, have you tried this in this application? That's where I think the real innovations of the modern era are enabled: the interplay between mechanical engineering and manufacturing and information technology, IoT, big data. These are the places where we're able to leverage that combined, multi-discipline space.

Fascinating. Thank you, everyone, for joining us, and we'll have you back on Thursday to discuss ambient monitoring, so I look forward to that. Very good, thank you, Siobhan. Thank you, everybody. Bye.
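A footnote on the laser-ultrasound story above: the step of "backing out the elastic constants" from the detected sound can be made concrete with a minimal sketch. This treats the material as isotropic (silicon is actually anisotropic, so this is a simplification), takes the longitudinal wave speed from a time-of-flight measurement, and inverts the standard bulk relation; the function name and the example numbers are illustrative, not from the talk.

```python
def youngs_modulus_from_tof(distance_m, tof_s, density, poisson=0.28):
    """Invert c_L = sqrt(E(1-v) / (rho(1+v)(1-2v))) for Young's modulus E,
    given a longitudinal time of flight over a known propagation distance.
    Assumes an isotropic, homogeneous material."""
    c = distance_m / tof_s  # longitudinal wave speed, m/s
    return density * c**2 * (1 + poisson) * (1 - 2 * poisson) / (1 - poisson)

# Illustrative, silicon-like numbers: a 10 mm path traversed in ~1.19 us
# gives c ~ 8400 m/s; with rho = 2329 kg/m^3 the result lands near 130 GPa,
# in the ballpark of silicon's published (orientation-dependent) values.
E = youngs_modulus_from_tof(0.010, 0.010 / 8400, 2329)
print(E / 1e9)  # in GPa
```

In practice, defects and temperature are inferred from how measured wave speeds and echoes deviate from such baseline values.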

2021-07-08 10:01
