HLTH 2024: GE HealthCare’s Chief AI Officer Says Future Of AI No Longer A Black Box

Medtech Insight talked with GE HealthCare’s chief AI officer Parminder “Parry” Bhatia at HLTH about the firm’s new CareIntellect for Oncology offering to help clinicians make efficient use of multimodal patient data, his vision for projects within AI Innovation Lab, and the future of AI in health care.


GE HealthCare has been at the forefront of bringing artificial intelligence into devices, according to Parminder “Parry” Bhatia, GE HealthCare’s chief AI officer.

In the future, AI is poised to transform health care by integrating multimodal data from text, images, audio, doctors’ notes and lab results and extracting meaningful insights for clinicians. This has the potential to help doctors make faster diagnoses and more accurately predict how patients will respond to treatments and thereby improve outcomes.

In particular, Bhatia predicts agentic AI, which enables machines to act as autonomous agents capable of complex decision-making and adaptive learning, is going to have a tremendous impact on health care.

GE HealthCare’s executive leadership team (© 2023 Rubinic Photography, Inc.)

“One of the key differences with these [agentic AI-based] technologies is explainability or reasoning,” Bhatia told Medtech Insight. “It’s not just giving the output, it is telling clinicians why it is giving that output, which gives them more confidence in these capabilities as well. And that’s a big change from the AI of the past. When we think about traditional AI, one of the challenges was that most of it was a black box. You would try to add some of the reasoning, but these [new] models come with reasoning for themselves.”

Bhatia joined GE HealthCare last April from Amazon, where he served as head of machine learning for large language and foundation models. At GE HealthCare, Bhatia leads the integration of AI into the development, manufacturing and operation of medical devices and is responsible for overseeing strategic implementation of AI technologies to improve patient care, operational efficiency and outcomes.

At the recent HLTH conference in Las Vegas, Medtech Insight caught up with Bhatia to talk about GE HealthCare’s new product, CareIntellect for Oncology, a cloud-based application that brings together multimodal patient data from disparate systems and uses generative AI to produce clinical notes and reports; the AI Innovation Lab, which aims to accelerate early-concept AI innovations within GE HealthCare; and his vision for AI going forward.

This interview has been slightly edited for clarity and brevity.

Q

Medtech Insight: Can you talk about your vision for the AI Innovation Lab?  

A

Parminder Bhatia: The whole idea with the AI Innovation Lab is to show early-concept AI capabilities that can potentially transform health care in a lot of ways. A lot of these are early projects – some are technology in development, whereas in other places we are looking for collaborative partnerships so these things can be brought in.

As an example, as we have looked across these different areas, the reason digital and AI become a key component – even from the device perspective – is that they help you accelerate multimodal capabilities.

One of the first innovations is called Health Companion, which is a next-generation technology that is powered by agentic AI. These systems are more autonomous or adaptive versus the technologies that have been built in the past. They can go further than traditional AI by making recommendations to take action.

When an oncologist is looking at a patient, one of the hardest problems in cancer treatment is seeing disease progression. How to treat the cancer depends on its stage – is it increasing, decreasing? In the past, that required looking into multimodal data: your EHR data, imaging data, lab data sets, pathological data, even genomics, and [insurance] coverage from a financial perspective. You have to look into multiple dimensions of data.

For critical cases, an oncologist would have a tumor board. These are experts that provide input from the clinical side. Now imagine how AI can combine this data from these different modalities – clinical, biochemical, and others – and provide inputs by summarizing that data in a meaningful way for clinicians. With agentic AI, each agent is like a specialist. There might be an agent that specializes in radiology, another one specializes in pathology. And this system is able to come up with recommendations. Think of it as a multi-disciplinary, virtual tumor board at the doctors’ fingertips.
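For readers curious what a multi-agent setup like this might look like in code, here is a minimal, hypothetical sketch in Python. The agent classes, field names, and orchestration logic are illustrative assumptions, not a description of GE HealthCare’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of a "virtual tumor board": one specialist agent per
# modality, plus an orchestrator that merges their findings.

@dataclass
class Finding:
    specialty: str
    summary: str

class RadiologyAgent:
    def review(self, patient: dict) -> Finding:
        # A real system would run imaging models over DICOM studies here.
        lesions = patient.get("imaging", {}).get("lesions", [])
        return Finding("radiology", f"{len(lesions)} lesion(s) tracked across studies")

class PathologyAgent:
    def review(self, patient: dict) -> Finding:
        grade = patient.get("pathology", {}).get("grade", "unknown")
        return Finding("pathology", f"tumor grade: {grade}")

class VirtualTumorBoard:
    def __init__(self, agents):
        self.agents = agents

    def recommend(self, patient: dict) -> str:
        findings = [agent.review(patient) for agent in self.agents]
        # A production system would hand these findings to a reasoning model;
        # here we simply concatenate them into a board-style summary.
        return "; ".join(f"[{f.specialty}] {f.summary}" for f in findings)

board = VirtualTumorBoard([RadiologyAgent(), PathologyAgent()])
print(board.recommend({"imaging": {"lesions": [1, 2]}, "pathology": {"grade": "G2"}}))
```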

Another project in the AI Innovation Lab is around perinatal care. There is no greater joy than holding your baby for the first time, but the amount of anxiety you go through during labor is huge. At GE HealthCare we realize that perinatal care is complex, but in collaboration it can be streamlined. The initial work we have done is streamlining things like which hospital protocols a clinician needs to follow during labor and delivery and during shift changes. Clinicians have to look into patient data, the device data, the readings, the SpO2 (oxygen saturation) and other things coming out of the vitals and patient history to decide on the next action.

We are looking at multimodal data summarization in that area to streamline this operation. These technologies – foundation models, generative AI – allow us to do things that could not have been done in the past. The AI Innovation Lab helps us streamline these things and test things with our partners and see how they are actually solving their use cases.

Q

When you talk about autonomous AI, what does that mean exactly?

A

Bhatia: Autonomous means these systems are able to independently look into different data sources. When new clinical data comes in, the system is able to adapt based on it. Let’s say the patient’s tumor size increased in the most recent study. The system is able to take that data and say, ‘earlier there was no metastatic cancer, but now there is metastatic cancer.’ You’re not having to update the model or retrain the model. As soon as the new findings come in, it automatically adapts and learns from its environment.
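A minimal sketch of what “adapting without retraining” can look like in practice: the model stays fixed, and new findings simply enter the context it reasons over. The field names and the rule below are illustrative assumptions, not clinical logic.

```python
# Hypothetical sketch: the "model" is frozen; adaptation happens by updating
# the patient context it reasons over, not by retraining any weights.

def assess(patient_context: dict) -> str:
    # Stand-in for a fixed reasoning model.
    if patient_context.get("metastases_detected"):
        return "metastatic disease present"
    return "no metastatic disease detected"

context = {"metastases_detected": False}
print(assess(context))  # -> "no metastatic disease detected"

# A new imaging study arrives; we update the context, not the model.
context["metastases_detected"] = True
print(assess(context))  # -> "metastatic disease present"
```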

Q

How do you eliminate the risk of hallucination with autonomous AI?

A

Bhatia: The AI is pulling data from what already exists. It also has guardrails: it explains why it is doing what it does and grounds its output in the document where it got the information. It will say the PSA value went up from 8.3 to 10.4 and that’s why there is a risk of metastatic cancer. And it will point to the source document where it got that information.
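As a rough illustration of that grounding pattern, here is a hypothetical retrieval-style sketch in which every generated claim carries a pointer back to the records it was derived from. The record structure echoes the PSA example above but is otherwise an assumption.

```python
# Hypothetical sketch of grounded output: each claim is tied to the source
# documents it came from, so a clinician can verify it directly.

records = [
    {"doc_id": "lab-2023-04", "field": "PSA", "value": 8.3},
    {"doc_id": "lab-2024-01", "field": "PSA", "value": 10.4},
]

def grounded_psa_trend(records: list[dict]) -> dict:
    psa = sorted((r for r in records if r["field"] == "PSA"),
                 key=lambda r: r["doc_id"])
    first, last = psa[0], psa[-1]
    claim = f"PSA rose from {first['value']} to {last['value']}"
    # The output carries its evidence: the clinician can open these documents.
    return {"claim": claim, "sources": [first["doc_id"], last["doc_id"]]}

print(grounded_psa_trend(records))
# {'claim': 'PSA rose from 8.3 to 10.4', 'sources': ['lab-2023-04', 'lab-2024-01']}
```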

Q

Studies have shown that clinicians are apprehensive about using generative AI because of liability risk. Where do you stand on that?

A

Bhatia: These things need to be built out collaboratively. When the clinician sees something has substantially changed, they can go back into the document. It also gives reasoning. One of the key differences with these technologies is explainability or reasoning. So it’s not just giving the output, it is telling clinicians why it is giving that output, which gives them more confidence in these capabilities as well. And that’s a big change from the AI of the past. It also ties to the guidelines. So it’s giving them more information than they would have gotten just surfing through the documents. In the end, you’re talking about reducing their cognitive overload.

Q

Will the Health Companion be able to aggregate patient data even from disparate sources beyond the hospital walls?

A

Bhatia: That’s what we are evaluating. Initially it [the AI] pulls from the data sets inside a hospital or the research site. When we talk about Health Companion, that’s still early in concept. It could be integrated into CareIntellect, which is already a product and will be commercially available in 2025.

Q

Can you provide some more detail on the CareIntellect platform and the CareIntellect for Oncology application, also unveiled at HLTH?

A

Bhatia: CareIntellect, our digital cloud-first technology, has similar capabilities as Health Companion where we bring data from multiple sources together.

CareIntellect’s initial focus will be on oncology. We are currently evaluating CareIntellect for Oncology with the University of Texas Southwestern for prostate cancer and with Tampa General Hospital on integrating their data on breast cancer patients.

We are working with both of these early evaluators on integrating their multimodal data. It’s a hard problem, as you pointed out. But once we are able to integrate the data, and it’s done properly, we can think of adding new applications, which could be in cardiology, neurology and other spaces.

Q

Why did you decide to start with oncology?

A

Bhatia: Because of the high need and because of the patient journey. As a patient goes through a treatment journey of four to five years, thousands of clinical notes are created. How to summarize those notes is one of the hardest problems for doctors. We wanted to work on some of these problems to bring in value.

This is something that we have heard from oncologists as well. They told us it takes a lot of time to combine these data sets. On average, even within their hospital, these clinicians have to go into as many as 18 different systems to aggregate all this data – lab results, electronic health records, pathology, and so on – to get an overview of the patient.

It is a big problem for them to combine this data and summarize it in a way that they can decide what the relevant action needs to be for the patient.

Q

Do you feel GE HealthCare has an advantage because you already have the most AI-enabled medical devices cleared by the US Food and Drug Administration, including solutions for ultrasound, patient care, imaging and diagnostics already in place?

A

Bhatia: A lot of these devices are integrated into the workflow. GE HealthCare’s Effortless Recon DL deep learning solutions have gone through 34 million scans worldwide. Over years of building these technologies with our partners, while adhering to responsible AI practices, we’ve built enough trust that they are able to collaborate and work on these problems with us. We have another digital product called Command Center, which looks into operational efficiencies – so less on the device side – but it helps streamline hospital operations, like ‘can we bring a new patient into the hospital’ and patient flow. Command Center has been deployed in more than 300 hospitals. That’s the scale of capabilities we are building out – things that go into devices as well as into the digital offering.
