Workflow Integration in AI in Healthcare By Matthew DiDonato, Director, GE Healthcare

Matthew DiDonato
Senior Staff Product Manager - Artificial Intelligence at GE Healthcare

Sounds great. Thanks for the introduction, and apologies to everybody for the audio mishap here. As I stated, my name is Matthew DiDonato. I lead the product development team within GE Healthcare that is building out an AI platform to accelerate AI development within GE. That means that my team and I work across all of GE's business segments to develop AI solutions that accelerate our MR, X-ray, CT, all of the imaging devices, as well as the other sorts of devices and areas that GE plays in. So it's an exciting place to work, and I'd love to highlight some of the great work that I think we've done.

I appreciate the opportunity to speak here, as well as to highlight some of the exciting things going on with AI in healthcare in general. Today I'm going to focus on the dynamics of building AI within healthcare, but mostly around the importance of workflow integration and of actually having applications be consumed in a healthcare setting. So, to start.

I really want to start by saying it's an exciting time in healthcare. That's clear given the number of talks about healthcare within the conference today. There's a ton of opportunity for healthcare to expand its horizons, and I think a lot of that is centered around AI. Now, while it's true that sometimes healthcare can be behind other industries, sometimes that's for good reason: regulatory processes ensure that the solutions we build are tested and validated, and the reality, especially when we're talking about clinical settings, is that we're dealing with critical treatment decisions, so that oversight needs to be there.

However, the fact that sometimes healthcare lags behind other industries also creates a ton of opportunity to fill the gap that other, unregulated industries have already taken full advantage of. From a technology perspective, access to compute has never been better. That's true on the edge, and the ability to leverage scalable compute in the cloud now really opens up opportunities for new technologies.

It paves the way for traditional machine learning, the advances we now have with deep learning, as well as virtual care in general. There are a ton of opportunities in technology that we can now deploy in healthcare. Certainly that's aided by the fact that the amount of data is exploding, thanks to an increased amount of imaging, new imaging techniques, and the consumer devices that we all have at home and are wearing: phones, watches, and the list keeps growing. I seem to always be buying them, and I know that others are as well.

So the data outside of just a clinical context that could be relevant to healthcare is exploding. That's true even of genomic data: the Human Genome Project took 13 years, and now we're at a point where we're down to a couple of hours after, you know, ordering a test on the internet. So the data is exploding, and with COVID-19 we're all a lot more familiar with what that means at the population level, with population health measures, and with the reality of how they impact us both on the macro scale and on the micro, individual scale.

So that combination of technology, enhanced by data, really allows us to move towards an approach that is more proactive than a system that historically has been more reactive: to take preemptive measures, ensuring that we can potentially avoid the most severe consequences and critical conditions, and subsequently potentially reduce costs by avoiding costly treatments. In some of those cases this is being reflected in conversations about fee-for-service versus value-based care payments.

So overall, like I said, it's an exciting time to be in healthcare. Over the coming years, as we implement all of these things, patient outcomes and healthcare overall will improve, and so will the costs associated with it. The opportunities really do seem endless. If you look at things from a different lens, I think it's always interesting to see where an industry is going by how money is being spent.

Certainly all of this was true before, and it's even more true now with the acceleration of telehealth and virtual care and the general increasing importance of innovative digital solutions to keep us safe. When we look specifically at AI through that lens, there's a pretty clear consensus that there's potential, with over 80% of healthcare leaders saying that they see real growth or the ability to use AI to create savings. And so there is the confluence of not only the ability to do this from a technical perspective, but real business value, and those two things driven together are a recipe for real change.

If you look at the second half of this, the reason that money is being spent here is that there's real value. The takeaway: last year alone, 40 billion dollars were spent funding startups and new companies in AI and in the combination and complement of AI and healthcare.

When we look at where that's being spent, there are a couple of key places. If we look at it just through a clinical lens, there are some really ripe opportunities. That's everything from patient triaging, so quickly determining who needs immediate care and what that care is, to things like decision support for navigating the complexities of comorbidities and keeping up with the latest research out there, to assistance around the interpretation of results.

That can span everything from radiology results from imaging equipment to vital sign monitors to lab tests; there are a number of places where help with interpretation can be useful. And then there's a big bucket around structured reports and reporting in general. This is actually an area where we start to create a virtuous cycle: data input is time-consuming and can be a pain, and certainly AI can help there. As we do that, we actually start to build a corpus, a set of data, that is really structured and becomes the backbone and the basis for even more AI models, so we can create a virtuous cycle with that last one.

But the point is, when we look at things from a clinical context, there are a ton of places where AI can help, and it certainly sounds great and seems simple: okay, AI is a new technology, we have the data and the computational power to drive it, let's go forward. For a minute it seems quite simple. I think what's interesting is that while there's clearly promise for AI, the reality, especially when we start to talk about adoption in a clinical context, is that clinical reality is complex.

Clinical workflows involve a number of different individuals with different skill sets and expertise, and they span a number of different software systems and data silos. For those familiar with some healthcare acronyms, data lives in the EMR, the RIS, the PACS, the VNA, and in individual medical devices spanning everything from vital sign monitors on a patient all the way to large imaging equipment.

When you add on top of that the fact that these systems differ between hospitals and geographies, life gets even more complex. And so when we think about building AI solutions and deploying them, we can't remove ourselves from the complexity of these workflows. Because I'm from GE, I want to talk about how we're attacking the problem of injecting AI into this complex world of clinical workflows, and I want to talk about it from an imaging workflow context.

If you are a clinician, you might have already picked that up from the graphic here, but I really do think that a lot of these concepts generalize well; clinical workflows, with or without imaging, are complex. So everything here will be imaging-focused, but I think it can be applied in many other places. If we believe that clinical workflows are complex, what can we do about that? I think the way we like to think about it is to look at what the greatest way is to actually have an impact on the end user.

The reality is that change is difficult, and that's due to a number of different factors. Certainly there can be significant consequences to changing how we clinically do things. On top of that, there is cultural inertia that can be strong. Additional systems, screens, applications, and deviations from what people are used to create significant, real difficulties in getting new solutions adopted, because in the near term they can really slow people down.

So I think the question we like to ask ourselves is: how can we make AI invisible? How can we meet our users in their existing workflows? I think Andrew Ng from Stanford has a great line around this: AI is the new electricity. If it's the new electricity, let's think about electricity for a second. When's the last time you thought about having enough power to plug in your next device, or where the power was coming from, or changed your life around where the plugs are? If AI is the new electricity, why should we ask clinicians to actively think about using AI tools, or to change their workflows to make sure they can incorporate them?

We need to find ways to make AI that works as seamlessly as electricity does in our current lives, and to do that we need to think not only about models but about how they're deployed and where the compute happens. What I'd like to do is come back to this workflow and talk about how we're doing that within GE, to provide some examples. Let's start with ordering of the scan. In fact, let's go before order entry, and actually before the patient even arrives.

Let's talk about a real pain point in and around no-shows. Before a scan is even ordered, GE is working on reducing the number of no-shows. For people that aren't familiar with the term, a no-show has a very specific meaning: it's somebody who is scheduled to be imaged but doesn't actually show up, and this can account for millions in lost revenue annually.

Taking into account that this is a real pain point around the utilization of imaging devices, GE is working on pulling information together and actually predicting when somebody is going to be a no-show. We've got a solution, smart scheduling, that is based on pulling demographic and patient information (which could be in the RIS or the EMR), the actual scheduling information, and information about the environment: we take into account the weather forecast for five days. There's probably some joke about GE AI helping us extend those forecasts beyond five days, but the idea is that those sorts of external factors matter as well.

What we do is predict, per patient, their likelihood of showing up. We take a base model, bring it to a site, and then fine-tune it within that site, using the local weather information and the local information about how patients have historically turned up, until we end up with a model that is very specific to that site and a solution that is very specific to the site using it.

Then what we do is produce, per patient, a likelihood of them showing up. One patient may be 88% likely to show up, and that's pretty darn good, but another patient may have a 15% likelihood of showing up. When we know that, the existing solutions that the site has, maybe text messages or phone calls, can be used: we fit into their workflows and allow sites to use those existing mechanisms to stratify patients and increase the likelihood that they're going to show up. That's one way we're attacking a problem within healthcare inside the existing workflow: using the same systems, but allowing our solution to stratify patients and hook into those regular mechanisms to make sure somebody shows up.
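
To make the idea concrete, here is a minimal sketch of that kind of per-site no-show scoring. The feature set, the tiny training data, the threshold, and the idea of simply refitting on local history are all illustrative assumptions, not GE's actual smart scheduling implementation.

```python
# Hypothetical sketch: adapt a no-show model to one site's history, then score
# each scheduled patient with a likelihood of showing up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative features: [age, days_since_booking, prior_no_shows, forecast_rain_mm]
site_history = np.array([
    [34, 21, 2, 12.0],
    [67,  3, 0,  0.0],
    [45, 40, 1,  5.5],
    [29,  7, 0,  0.0],
    [52, 30, 3,  8.0],
    [71,  2, 0,  1.0],
])
showed_up = np.array([0, 1, 0, 1, 0, 1])  # 1 = arrived, 0 = no-show

# "Fine-tuning" is reduced here to refitting on the site's own history;
# a production system would start from a pretrained base model.
site_model = LogisticRegression().fit(site_history, showed_up)

# Score tomorrow's schedule and flag low-likelihood patients for outreach
# (text message, phone call) through the site's existing systems.
schedule = np.array([[58, 14, 1, 9.0], [33, 2, 0, 0.0]])
for p_show in site_model.predict_proba(schedule)[:, 1]:
    action = "send reminder / call" if p_show < 0.5 else "no extra outreach"
    print(f"likelihood of showing up: {p_show:.0%} -> {action}")
```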

If we walk along that clinical workflow a ways, another thing we're doing is intelligent protocoling. This is a solution that we have with our CT and MR scanners. For those that aren't familiar with protocoling, the idea is setting up the specifics around how an image will be captured.

Intelligent Protocol Manager is a cloud-based protocol manager that lets us take the information about an order and pull it, and also understand what a site has from a protocoling standpoint: what devices are available and what settings they have historically used to capture an image. We then match those two things together, and when a technologist is going to acquire an image, we suggest the right protocol.

Even more interesting than that: if the suggestion is changed, if a suggested protocol is incorrect or we change something based on something else a patient needs, we collect that information and feed it back. So we're meeting the technologist within their existing workflow and suggesting something, and it really is a seamless solution; it improves as we collect information about how things were changed relative to the suggestion we provided. Again, I think this is a great example of how we can meet healthcare problems in line with where the problem actually is, without deviating from the workflow.
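
As a rough illustration of that match-and-feedback loop, here is a small sketch. The order fields, protocol names, and matching rule are invented for the example; they are not the Intelligent Protocol Manager's actual logic.

```python
# Hypothetical sketch of protocol suggestion with a feedback loop: match an
# incoming order against the protocols a site actually uses, and record any
# change the technologist makes to the suggestion.
site_protocols = {
    ("CT", "head", "trauma"): "CT Head w/o contrast - trauma",
    ("CT", "head", "routine"): "CT Head w/o contrast - routine",
    ("MR", "knee", "routine"): "MR Knee 3T - routine",
}
feedback_log = []  # fed back to improve future suggestions

def suggest_protocol(order):
    """Suggest the site-specific protocol that best matches the order."""
    key = (order["modality"], order["body_part"], order["indication"])
    return site_protocols.get(key)

def record_outcome(order, suggested, actually_used):
    """Capture whether the technologist accepted or changed the suggestion."""
    feedback_log.append({
        "order": order,
        "suggested": suggested,
        "used": actually_used,
        "accepted": suggested == actually_used,
    })

order = {"modality": "CT", "body_part": "head", "indication": "trauma"}
suggestion = suggest_protocol(order)
# The technologist overrides the suggestion for a patient-specific reason.
record_outcome(order, suggestion, "CT Head w/ contrast - trauma")
print(suggestion, feedback_log[-1]["accepted"])
```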

Another one, before a scan actually happens, that I want to call out is a product called AIRx. This is an MRI product and really an amazing solution. What AIRx does, for people who aren't familiar with MR: in MRI acquisitions, usually a patient is lying in the MRI and, based on their positioning, a quick low-resolution scan is done to define the coordinates of the image you want to take, so it's based on the patient's position within the MRI.

This requires technologist training and can produce some variability between different technologists, because they are making a decision as to where the acquisition plane will be, that is, where the image is actually acquired, based on that low-resolution scan and their own judgment. What AIRx does is take that low-resolution scan, run it through a number of different AI models along with input as to what piece of anatomy you want to image, and then define the acquisition plane for you.

What's nice about this is that it reduces the time associated with figuring that out and, on top of that, reduces variability across technologists. I have a picture here that's worth showing. This is the same patient at three different time points, and as you can see in the smaller box, every time they laid in the MRI their position was different, but the acquired image, the larger brain image you can see across the time points, looks very, very similar. That means the acquisition plane was defined correctly even though the patient was in a different position each time. So not only does this save time and reduce variability across technologists, it also allows us to compare across time. Once again, we're meeting clinicians where they are, in their own workflows.
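
As a toy illustration of the underlying geometry: if models locate a few anatomical landmarks on the low-resolution scan, an acquisition plane can be derived from them. The landmark names, coordinates, and the assumption that AIRx works this way internally are made purely to illustrate what "defining an acquisition plane" means.

```python
# Hypothetical sketch: derive an acquisition plane from anatomical landmarks
# that AI models have located on the low-resolution localizer scan.
import numpy as np

# Pretend the models returned three landmark positions (in scanner mm coordinates).
landmarks = {
    "anterior_commissure":  np.array([0.0, 25.0, 10.0]),
    "posterior_commissure": np.array([0.0, -3.0,  9.0]),
    "mid_sagittal_point":   np.array([0.0, 10.0, 60.0]),
}

def acquisition_plane(p1, p2, p3):
    """Return a point on the plane and its unit normal, from three landmarks."""
    normal = np.cross(p2 - p1, p3 - p1)
    return p1, normal / np.linalg.norm(normal)

origin, normal = acquisition_plane(*landmarks.values())
print("plane origin:", origin, "plane normal:", normal)
# The scanner would then prescribe slices along this plane, so the acquired
# images line up the same way regardless of how the patient is positioned.
```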

Continuing along that workflow: we've talked about ordering, and now I want to talk about some solutions within scanning. I'll start with an ultrasound solution.

This is a solution embedded within our ultrasound devices that supports breast assessments using ultrasound, and it's embedded in the device itself. A sonographer goes about their exam, and if they find a lesion they can highlight it and then let the AI within the device run an assessment, providing a quantitative assessment against everything it has been trained on.

It provides a score that is aligned to BI-RADS, which, for those that aren't familiar, is the assessment criteria for the likelihood of a lesion within the breast being potentially malignant or not. So we found a solution where we embed the AI directly in the device. There's no significant training needed: a sonographer goes about their normal workflow, finds the lesion, and can run the AI right then and there. I do find this is another great example of applying AI within an existing workflow.
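
As a loose illustration of what an on-device, BI-RADS-aligned output could look like: the thresholds and category mapping below are invented for the example and are not the actual clinical criteria or the model GE ships.

```python
# Hypothetical sketch: map a model's malignancy score for a highlighted lesion
# to a BI-RADS-aligned category. Thresholds are illustrative only.
def birads_aligned_category(malignancy_score):
    """Translate a 0-1 model score into a BI-RADS-style assessment label."""
    if malignancy_score < 0.02:
        return "BI-RADS 2 (benign)"
    if malignancy_score < 0.10:
        return "BI-RADS 3 (probably benign)"
    if malignancy_score < 0.95:
        return "BI-RADS 4 (suspicious)"
    return "BI-RADS 5 (highly suggestive of malignancy)"

# The sonographer highlights a lesion mid-exam; the embedded model returns a score.
score_from_model = 0.27  # placeholder for the on-device model's output
print(birads_aligned_category(score_from_model))
```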

Another one on the scanning side is an MRI example (there's also an equally applicable CT example) that uses AI not only to accelerate how an image gets taken but to change how the image gets formed. We have a solution called Recon DL that uses deep learning to form the image itself based on the raw information from, in this case, the magnets in the MRI, and the result is crisper, cleaner images. It also means the amount of scan time can be reduced while holding quality constant.

Given that MRs are expensive and device utilization is a really important factor in the business of running an imaging center, this has a lot of value. To make it tangible, let me show another image. On the left is a conventional acquisition, and on the right is the same patient, but the reconstruction, the actual forming of the MRI image, used AI. I'm not a clinician, but it's pretty evident from the picture that there's a lot more clarity using the deep learning network. We have spent a ton of time understanding the right way to do this, to make sure the images provide more information but are also robust and meet all of the quality standards that we want. This is really exciting. This is an MRI example, but we also have deep learning based reconstruction, the image formation itself, on the CT front as well.
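
To give a flavor of what "forming the image with AI" means: conventional MR reconstruction is, roughly, an inverse Fourier transform of the raw k-space data, and a deep learning reconstruction adds a learned refinement stage on top of (or in place of parts of) that pipeline. The sketch below is schematic only; the "network" is just a placeholder for a trained model, not GE's reconstruction chain.

```python
# Schematic sketch of MR image formation: inverse FFT of raw k-space data,
# followed by a learned refinement step (here a trivial placeholder).
import numpy as np

def conventional_recon(kspace):
    """Classic reconstruction: inverse 2D Fourier transform of k-space."""
    return np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))

def learned_refinement(image):
    """Placeholder for a trained denoising / sharpening network."""
    return image  # a real system would run a deep network here

# Fake raw data standing in for what the scanner's receive coils measure.
kspace = np.fft.fftshift(np.fft.fft2(np.random.rand(64, 64)))
image = learned_refinement(conventional_recon(kspace))
print(image.shape)
```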

Let me jump to one more scanning solution. This is a solution within our mobile X-rays that we call Critical Care Suite. I think it's a really interesting one, and it rolls into the last place in the workflow that I want to talk about, which is interpretation and reporting.

This is a solution embedded in our X-ray devices, and what it does is provide checks both for quality and, potentially, for a critical condition. To walk through what the flow looks like: if you're a patient that needs a chest X-ray, you come in, potentially from the ED or from some workflow that gets you to needing a chest X-ray, and have it done.

Immediately on the device itself, with the technologist who is actually acquiring the image, what comes up first is a quality check that asks: is the entire set of lungs within the X-ray, since you're doing a chest X-ray? If it's not, it flags this and provides the opportunity to image again, so that a patient doesn't leave, find out later that the full lungs weren't in the image, and have to come back for subsequent imaging. So there's value up front in simply checking that the image that was acquired is correct.

On top of that, it will also, and this seems trivial, auto-rotate the image to be upright. Sometimes, based on patient positioning, an image may be acquired upside down or oriented slightly differently. We have AI that auto-rotates it. Again, it seems very small, but if you are a technologist this will save you a significant amount of time. And lastly, right now we have a solution that checks for pneumothorax.

If you're not a clinician, like me: a pneumothorax is effectively a collapsed lung, and patients that have pneumothoraces (I don't know if that's the right plural) are potentially critical. So the device can actually determine whether a pneumothorax is present, and based on different workflows in different countries, that may pop up directly on the X-ray or may be information that is sent with the report to be read by a radiologist later. Again, this fits AI onto the device, within an existing workflow.
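
To sketch that on-device sequence of checks: the three check functions below are placeholders standing in for the actual models; the point of the sketch is only the ordering of the workflow (quality check, auto-rotation, critical-finding flag), not how any of the checks work.

```python
# Hypothetical sketch of the on-device check sequence for a chest X-ray.
import numpy as np

def lungs_fully_in_field(image):
    """Placeholder for the model that checks the whole lung field was captured."""
    return True

def predicted_rotation(image):
    """Placeholder: number of 90-degree turns needed to make the image upright."""
    return 0

def pneumothorax_suspected(image):
    """Placeholder for the pneumothorax detection model."""
    return False

def on_device_checks(image):
    if not lungs_fully_in_field(image):
        return {"action": "prompt technologist to re-image before the patient leaves"}
    image = np.rot90(image, k=predicted_rotation(image))
    return {
        "action": "send to PACS",
        "pneumothorax_flag": pneumothorax_suspected(image),
    }

print(on_device_checks(np.zeros((1024, 1024))))
```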

That last piece I mentioned, embedding something in a report and actually interpreting the images, is the last piece of this imaging workflow I said I wanted to talk about. For those that aren't familiar, the reporting and interpretation pieces, especially within medical imaging, take the form of a radiologist interpreting an image within PACS, which is the general storage and viewing place where radiologists work through patients who have been imaged and whose images need to be interpreted.

All of that generally takes the form of images formatted in DICOM. Those that aren't familiar can think of it like JPEG or PNG; for those that are more technical, the analogy breaks down sometimes, but the point is that we've now taken something from an imaging device and put it in a common format, and I think that's really important.
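
For the non-clinicians: a DICOM object carries both the pixel data and standardized metadata, which is part of what makes it a useful common format. A minimal example using the pydicom library (the file name is hypothetical; any DICOM image file would do):

```python
# Minimal example of reading a DICOM object with pydicom.
import pydicom

ds = pydicom.dcmread("chest_xray.dcm")          # hypothetical file path
print(ds.Modality, ds.get("StudyDescription"))  # standardized metadata tags
pixels = ds.pixel_array                         # the image itself, as a NumPy array
print(pixels.shape)
```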

The reason it's important is that it opens up a lot of opportunity, and I'm going to bring us back to where the money is being spent. I'll focus on the number I didn't talk about before. Certainly there's the value of AI, and there's the funding for AI: 350 companies founded in 2019 within healthcare, and a large portion of those are working on medical imaging. Having that DICOM standard there is great, because it means there's a sort of common jumping-in place for a number of these companies, and it allows work and development to go on outside of the companies that manufacture the devices themselves.

What's the impact of those roughly 350 companies? It sounds great; it sounds like we're going to have a number of solutions out there. But the reality is that when we have that many solutions, it creates something clinicians may not be happy with, which is: I need to assess potentially 350 different solutions.

I'm certainly not a doctor, though some of you on the call may be, but I have friends and family who are doctors, and they generally don't have a lot of extra free time at work. Unless there's a particular solution or model or application that is very near and dear to their focus area or research, it's unlikely they'll have the time or the desire to really assess which of these many, many applications they're interested in. On top of that, sales cycles are long, and getting on approved vendor lists for hospitals can be difficult. So if we've got a ton of value being created by the AI companies out there that have been funded, how do we get that back into the healthcare system? How do we integrate them into the complex workflows we've been talking about?

The way we're approaching this within GE is the Edison ecosystem. GE wants to solve that problem by being a single trusted source that allows the companies out there building solutions to solve not only the upfront logistical problems, the long sales cycle and becoming an approved hospital vendor, but also to have our technology embed their solutions within these clinical workflows.

I'll quickly talk about this in the PACS context. Again, for those that aren't familiar, PACS is where radiologists spend a large portion of their time interpreting images. To walk through this workflow: we've got images being acquired in a clinical setting that land in PACS. Once that happens, we've got a solution we call our Open AI Orchestrator, and you can think of it as a set of business logic, or a workflow, as the diagram here shows, around deciding whether a particular study that has been acquired is relevant to an AI solution.

To give a real-world example: let's say a chest X-ray comes into PACS, and say you don't have those mobile X-rays with the pneumothorax detection I mentioned before. Can we set up rules that say, when a chest X-ray comes into PACS, we want to run that AI algorithm that detects pneumothorax? We check to see what the scan is and then run the appropriate AI. The same can be true on CT or MR; maybe head MRIs or CTs should run against a stroke algorithm.

So can we provide that check up front, define rules, and then integrate with all of these great companies that have been funded to build AI solutions? Can we then return the results back into PACS so they can be used in the existing clinical workflow? That is what the Open AI Orchestrator does.
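
A minimal sketch of the kind of routing rule just described: when a study lands in PACS, configured rules decide which AI algorithms (GE's or a third party's) should run on it. The rule keys, algorithm names, and study fields are invented for the illustration; they are not the orchestrator's actual configuration or API.

```python
# Hypothetical sketch of orchestrator-style routing: a study arrives in PACS,
# rules decide which AI algorithms should run on it, and results are collected
# so they can be sent back into the clinical workflow.
ROUTING_RULES = {
    ("CR", "CHEST"): ["pneumothorax-detection"],
    ("MR", "HEAD"):  ["stroke-triage"],
    ("CT", "HEAD"):  ["stroke-triage", "hemorrhage-detection"],
}

def algorithms_for(study):
    """Return the AI algorithms configured for this modality / body part."""
    return ROUTING_RULES.get((study["modality"], study["body_part"]), [])

def on_study_arrival(study):
    """Run each configured algorithm and collect results to return to PACS."""
    results = []
    for algo in algorithms_for(study):
        # Placeholder: a real system would invoke the vendor's model or service here.
        results.append({"study": study["accession"], "algorithm": algo, "finding": None})
    return results

study = {"accession": "A123", "modality": "CR", "body_part": "CHEST"}
print(on_study_arrival(study))
```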

The results themselves can be consumed in a number of different ways, everything from worklist prioritization, that is, taking information the AI solution has produced and determining that a radiologist's worklist should flag, or move to the top, individuals with a critical condition based on the AI results. Or, depending on the solution, if it calls out a particular area of interest, can we display that within PACS? Can we show it to the radiologist within their existing workflow? On top of all of that, ways to potentially provide mobile alerts or other notifications can also be defined within that orchestrator workflow.

Running the AI solution is one piece of it; the orchestrator also lets us decide what piece of the clinical workflow we want this to impact, so we can decide when AI runs and how it's consumed.
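
On the consumption side, one simple way a result can be used, as mentioned above, is to reorder the radiologist's worklist. Again, the fields and flag below are illustrative assumptions, not the actual worklist integration.

```python
# Hypothetical sketch: move studies with critical AI findings to the top of the
# radiologist's worklist, otherwise keep them in arrival order.
worklist = [
    {"accession": "A101", "arrived": 1, "ai_critical": False},
    {"accession": "A102", "arrived": 2, "ai_critical": True},   # e.g. pneumothorax flagged
    {"accession": "A103", "arrived": 3, "ai_critical": False},
]

prioritized = sorted(worklist, key=lambda s: (not s["ai_critical"], s["arrived"]))
print([s["accession"] for s in prioritized])  # the flagged study A102 is read first
```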

The end result, and this is sort of what it looks like at a high level, is that GE can effectively deploy our own solutions as well as third-party solutions, using this orchestrator and its logic, so they are consumed in a way that doesn't require clinicians to change their reading and interpretation workflow. And really, the end result is the ability to bring a number of different AI solutions that have been funded out there into something that can be used by anybody within a hospital system. Let me just pause and say that the overall idea here is: how can we activate all of the data that exists, effectively deliver AI into the different pieces of clinical workflow I talked about, liberate caregivers, and allow them to not have to think about using AI, but rather have AI seamlessly meet them where they are, giving them more time to focus on patients and giving them the best information they can have when making clinical decisions about their patients.

That is all I wanted to touch on today. Thank you.
