Leveraging AI to prevent heart attacks and strokes offers a significant opportunity to transform healthcare and make it more productive, personalized, and accessible. Joining me today is Todd Villines, the Chief Medical Officer of Elucid, a pioneering medical technology company at the forefront of AI-powered heart attack and stroke prevention.

We discuss how Elucid's FDA-cleared product uses cutting-edge AI to analyze and characterize arterial plaque through CT scans and the innovative aspects of Elucid's algorithms. We explore the role of machine learning in Elucid's technology, from identifying risky plaques to using fractional flow reserve derived from CT without invasive procedures.

Tuning in, you’ll learn about the importance of high-quality data annotation and the rigorous validation process required to ensure accuracy across various scenarios and demographics. We also unpack the company's approach to annotating data, avoiding bias, and using diverse data sets. To discover how Elucid is making strides in cardiovascular health and paving the way for a healthier future, don’t miss this conversation with Todd Villines!


Key Points:
  • Todd’s professional background and why he joined the team at Elucid.
  • Elucid’s mission and how it leverages AI for medical technology.
  • The role of machine learning in Elucid's technology.
  • Developing a fractional flow reserve derived from CT analysis.
  • Why data annotation is crucial for training the Elucid models.
  • The importance of validation and how Elucid ensures the accuracy of its product.
  • Challenges and limitations of working with CT images.
  • How the company’s technology integrates into the existing clinical workflow.
  • Metrics used to assess the impact of Elucid's technology.
  • Intelligent design, diverse datasets, and avoiding bias in AI development.
  • Elucid’s future outlook and its plans to expand.

Quotes:

“We’ve created proprietary algorithms that were trained by histology using traditional image processing techniques to recognize different types of plaque based on histology.” — Todd Villines

“In the field of medical imaging, using supervised machine learning and annotated data of very high quality and also generalizable to the clinical use case of your technology is vitally important.” — Todd Villines

“You can’t just go out and pick the very highest image quality to train your models or you’re going to end up with a very overfitted model that doesn’t generalize to the clinical use case.” — Todd Villines

“Just like any good clinical study, designing your AI technology is probably the most important thing. Spend the time upfront to get it right.” — Todd Villines


Links:

Todd Villines on LinkedIn
Todd Villines on X
Elucid


Resources for Computer Vision Teams:

LinkedIn – Connect with Heather.
Computer Vision Insights Newsletter – A biweekly newsletter to help bring the latest machine learning and computer vision research to applications in people and planetary health.
Computer Vision Strategy Session – Not sure how to advance your computer vision project? Get unstuck with a clear set of next steps. Schedule a 1-hour strategy session now to advance your project.
Foundation Model Assessment – Foundation models are popping up everywhere – do you need one for your proprietary image dataset? Get a clear perspective on whether you can benefit from a domain-specific foundation model.


Transcript:

[INTRODUCTION]

[00:00:03] HC: Welcome to Impact AI, brought to you by Pixel Scientia Labs. I’m your host, Heather Couture. On this podcast, I interview innovators and entrepreneurs about building a mission-driven, machine-learning-powered company. If you like what you hear, please subscribe to my newsletter to be notified about new episodes. Plus, follow the latest research in computer vision for people and planetary health. You can sign up at pixelscientia.com/newsletter.

[INTERVIEW]

[0:00:33.4] HC: Today I’m joined by guest Todd Villines, Chief Medical Officer of Elucid, to talk about heart attack and stroke prevention. Todd, welcome to the show.

[0:00:42.8] TV: Thanks so much for having me, Heather.

[0:00:45.1] HC: Todd, could you share a bit about your background and how that led you to Elucid?

[0:00:48.6] TV: Sure. So, my background is training as a cardiologist with training in advanced cardiovascular imaging. In my career, I focused my clinical care on general cardiology and preventative cardiology, and my research was mainly focused on plaque imaging using cardiovascular CT.

What led me to Elucid is an interesting one. I spent my career doing clinical-based research and became very convinced that we were going about prevention in all the wrong ways. Meaning that we were focused on risk scores that were based on populations; they weren’t individualized. We had so much more data and better methodology to predict who is in fact going to have a heart attack or stroke using CT scans, and it all boiled down to plaque.

Plaque is the root cause of the majority of heart attacks and strokes and yet, we simply were not utilizing it. In fact, just looking at how we’re analyzing and reporting coronary CT scans as an example, it became very clear that we could use the power of CT to quantify and characterize all of the plaque that we’re seeing and use that to guide patient management in a way that we’re just simply not doing today, and so I saw the potential at Elucid to do just that.

[0:02:01.9] HC: And so, what does Elucid do?

[0:02:03.3] TV: So, Elucid is a medical technology company. We’re based in Boston and we have an FDA-cleared product that can analyze and, specifically, can characterize and quantify plaque in arteries. This is done using CT scanning, and so these are CT scans, or what’s called CT angiography, of the coronary arteries and other arteries in the body, and so, you know, there is other software in the market.

But what makes Elucid unique and what drew me to Elucid is the fact that the algorithms that they developed were based on histology and so, what I mean by that is they weren’t just based on comparing to human reads, things that I could do manually. They actually offered an output that was unique and something I couldn’t do myself using current software or reading station-based software, and so Elucid takes these CT scans and analyzes them so that it assesses what’s in the wall of the artery.

And it characterizes the type of plaque because we know that the risk that a patient has for having a future heart attack or stroke is based on the likelihood that a particular plaque will rupture open and cause an event. We have very good data that those risky plaques have a certain appearance on CT scanning, and Elucid has refined that and taken that to, I think, a higher level, you know, based on comparing that to histology.

And so, it can identify those risky plaques and then quantify how much of those risky plaques patients have to then present this back to patients and their care providers to allow them to make better choices both in how they’re living their life but also in the medicine and preventative strategies that they are contemplating.

[0:03:43.7] HC: And what role does machine learning play in this technology?

[0:03:46.2] TV: Well, really, you know, like so many areas of medicine, machine learning is foundational to making the software go. The plaque analysis, you know, the very core of our company is our plaque analysis, and there machine learning plays very little role in a sense, in that we take these images and we have a proprietary database of thousands of cross sections of arteries where we have histology samples that match.

And so in that sense, we’ve created proprietary algorithms that were trained by histology, using traditional image processing techniques to recognize different types of plaque based on histology. So, that part is more of a traditional image processing analysis but what we then have done as a company is said, “Okay, what can we do beyond just identifying and quantifying plaque and how can we use this plaque analysis to better care for patients?”

And one such way is we’re in the advanced stage of developing a plaque-based FFRCT, or fractional flow reserve CT analysis, and what this does is it uses machine learning. It integrates all of the plaque outputs that I just described, which are already included in our FDA-cleared software, and then, using machine learning, it can predict across the entirety of the coronary arteries what the invasive FFR, or fractional flow reserve, would be.

And just for our readers, fractional flow reserve is a critical decision tool that is used in the cath lab every day to decide who should undergo revascularization or in most cases, stent implantation and who should not, and so if you can get that information without having to go to the cath lab, without having to put you know, a catheter into an artery and do an invasive procedure, that tremendously improves decision making.

And, you know, currently in the field, there’s simply not an FDA-approved plaque-based FFRCT technology out there. It’s based on the understanding that it is, again, the amount and type of plaque in the wall of the artery that is directly predictive of changes in pressure, or fractional flow reserve if you will, across the course of that artery, and this is based on long-standing, you know, basic science studies.

As well as the understanding of how plaque influences the lining of those arteries and their ability to dilate, what we call vasodilatory capacity related to endothelial function, and so we use machine learning to take all of those outputs from our software and to integrate them across the entire coronary tree. These have been trained on thousands of invasive fractional flow reserve studies where we’ve taken patients who have gone to the cath lab.

We have their fractional flow reserve, and in many cases, we have fractional flow reserve data for the entire pullback across the entire artery. We have their CT scanning, and so we can, in a deterministic way, train an algorithm to recognize that relationship between plaque and fractional flow reserve in a very accurate way, and train and validate a plaque-based FFRCT model.

And so, this is a primary way in which we use machine learning. Then, you know, we’re doing some advanced development work, also looking at the predictive accuracy of plaque risk using machine learning, again based on those plaque analyses, the plaque inputs to a model, but those are, I think, more advanced development technologies right now that we’re working on.
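
To make the general approach concrete, here is a minimal sketch of a plaque-to-FFR regression in Python with scikit-learn. The per-segment plaque features, model choice, and synthetic invasive FFR values are hypothetical illustrations of the supervised setup described above, not Elucid's actual algorithm.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical per-segment plaque features derived from CT plaque analysis
# (columns: lipid-rich necrotic core, calcified, fibrous, % stenosis).
rng = np.random.default_rng(0)
n_segments = 2000
X = rng.random((n_segments, 4))

# Synthetic "invasive FFR" reference values for illustration only.
ffr_invasive = 1.0 - 0.4 * X[:, 3] - 0.2 * X[:, 0] + 0.05 * rng.standard_normal(n_segments)

# Train a regression model to predict invasive FFR from plaque features.
X_train, X_test, y_train, y_test = train_test_split(X, ffr_invasive, random_state=0)
model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)

print("MAE vs. invasive FFR:", mean_absolute_error(y_test, model.predict(X_test)))
```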

[0:07:03.6] HC: So, the data that you train these algorithms on, as you mentioned, you’ve got CT, histology, and your fractional flow reserve data. What needs to be annotated in that? Is it on the histology side, annotating the different tissue types, or is it something else?

[0:07:17.6] TV: Well, for FFRCT, we realized that, from a machine learning perspective, the inputs to the model are our histology-based plaque outputs. In developing the plaque analysis software, we included the data from these thousands of cross-sections, which are annotated by pathologists.

And so, you know, with software that’s been shown to be highly accurate against pathologists, we can take the output of our plaque model and then use those annotations as inputs into the machine learning FFRCT models. So, that’s exactly what we do: use those plaque annotations from our FDA-cleared histology-based plaque analysis software as one of the key inputs.

Now, we do this down the entire coronary artery and build a model that adds to that based on a host of other variables from the CT scan itself and potentially even from the patient themselves.

[0:08:15.4] HC: When the pathologists do the annotation that’s involved here, do they always agree on exactly how to annotate? I’ve seen some applications where you have very well-trained pathologists who are experts in their field, but there’s still room for subjectivity.

[0:08:31.2] TV: Well, like anything in life, you wish things were perfect but, you know, you’re right. I mean, when someone looks at a slide, exactly what they label as lipid-rich necrotic core or other types of plaque, there can be some subjectivity, and so as part of our cleared product, we had multiple pathologists annotate slides and where there was disagreement, those were resolved by consensus with a third pathologist.

So, it’s important. I think you raise a really critical point: in the field of medical imaging, using supervised machine learning and annotated data that is of very high quality and also generalizable to the clinical use case of your technology is vitally important. So, that certainly was taken into account as we developed our plaque analysis software.
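
As a rough illustration of the consensus process described above, here is a small Python sketch assuming hypothetical plaque labels from two pathologists and a third adjudicating reader. The labels and the use of Cohen's kappa to quantify agreement are illustrative choices, not a description of Elucid's annotation pipeline.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-region plaque labels from two pathologists; a third
# pathologist adjudicates only the regions where they disagree.
pathologist_a = ["LRNC", "calcified", "fibrous", "LRNC", "fibrous"]
pathologist_b = ["LRNC", "fibrous",   "fibrous", "LRNC", "calcified"]
pathologist_c = {1: "calcified", 4: "fibrous"}  # adjudications, keyed by region index

# Quantify inter-rater agreement before adjudication.
print("Cohen's kappa:", cohen_kappa_score(pathologist_a, pathologist_b))

# Consensus label: agreement stands; disagreements fall back to the third reader.
consensus = [
    a if a == b else pathologist_c[i]
    for i, (a, b) in enumerate(zip(pathologist_a, pathologist_b))
]
print(consensus)
```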

[0:09:17.6] HC: Once you’ve trained machine learning models that are, to the best of your ability, very accurate on your training data, how do you go and validate them? How do you make sure that they’re going to work across a diversity of different scanners and hospitals and patients that they might need to?

[0:09:33.1] TV: That is crucial. I mean, if you’re going to you know, help patients, if you’re going to use your technology to improve care, you really have to have confidence in its accuracy and so, first off, I think it’s important to take those considerations that you just mentioned, different scanners, different degrees of disease severity, different demographics, patients of different body sizes which can impact image quality, variety of image quality.

It’s important to ensure that you have those variables adequately captured in your training data or it simply wouldn’t be generalizable. So, we have developed thousands of cases now using every different available CT scanning platform, across patients with no disease versus advanced calcified disease, and different degrees of stenosis across the entirety of the coronary tree, so looking at all the different segments where you might find disease clinically.

So, it’s important to have all of that in your training data and of course, for validation, you know, it’s really important that you go out and test on cases that you haven’t seen. They’re firewalled, and they’re typically done in a multi-center study, where you ensure that you’re testing your device on the patient population you’re most likely to see in clinical use, where you have the variability in the scanner technology, disease severity, et cetera, to ensure that you have hit the mark.

And so, that is something that, you know, we are currently in the late stages of: our multi-center validation study for our FFRCT software, and those are all things that we have taken into account in designing the study. Obviously, you have to use the strictest and highest methodology with regards to good clinical practices if you’re running a high-quality clinical study, and so, you know, things like ensuring that there’s strict blinding of your data, and that you use the very best practices with an external core lab, who does all of your clinical interpretations of CTs and co-registration of invasive FFR locations.

Ensuring that you have strict electronic data capture, ensuring that you’re getting the very highest quality data in a validation study: those are the ways you do that. And that’s a prerequisite, really, to doing any high-quality validation study, and we’re really optimistic about the study that I just mentioned, which is, again, the multi-center validation study of our plaque-based FFRCT product that we’re in the late stages of.
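
One way to keep validation cases firewalled from training data, in the spirit described above, is to hold out whole sites or scanners. Here is a minimal Python sketch using a group-wise split from scikit-learn; the site labels, features, and reference values are hypothetical, and this is not Elucid's validation design.

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# Hypothetical case metadata: each case belongs to a site (and scanner model).
rng = np.random.default_rng(0)
n_cases = 500
sites = rng.integers(0, 12, size=n_cases)   # 12 hypothetical centers
X = rng.random((n_cases, 8))                # imaging-derived features
y = rng.random(n_cases)                     # e.g., an invasive reference measurement

# Hold out whole sites so validation cases are "firewalled" from training sites.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(X, y, groups=sites))

assert set(sites[train_idx]).isdisjoint(set(sites[test_idx]))
print("train sites:", sorted(set(sites[train_idx])))
print("held-out sites:", sorted(set(sites[test_idx])))
```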

[0:12:02.5] HC: So, we’ve talked about some of the sources of diversity in the imagery. Are there other challenges that your team encounters in working with and training machine learning models on CT images?

[0:12:13.8] TV: I think the real – you know, obviously from a deep learning perspective, we want to ensure that we have the best deep learning scientists, that we’re using the very best models, but we’re not locked into one approach, and so being, I think, intellectually curious. You know, I think it’s important to recognize that there are a number of different AI model architectures that you can use, and so it’s something that we’ve been –

And I say that because, if you think about the volume of data that is contained in a CT scan, you can really get down to the pixel and subpixel levels. You know, I mentioned the plaque data, but there’s also the opportunity of using three-dimensional CT inputs into models, and so, you know, one of the things we have done is we’ve learned along the way. We’ve published a single-center study where we showed that, at the lesion level, you could use very simple plaque inputs to be pretty accurate for predicting FFR.

But in fact, we built on that and so the model that we originally developed, we’ve now, I think, made it better by including some additional plaque inputs. You know, ensuring that we do a good job at the very basics of how we process CTs, ensuring that the segmentation of the lumen wall is as accurate as possible, and that’s something that the software has gotten better at over time.

And so I think, you know, ensuring that you’re intellectually curious about which AI models you’re utilizing, and that you’re including, in a deterministic way, the appropriate clinical and imaging variables, is probably the best way to go about this, and I think that, like anyone who is doing machine learning, the key is really ensuring good quality input data.

And when I say “good quality,” you have to be careful; it should still be representative of, you know, what you’re going to see in the clinical space. You can’t just go out and pick the very highest image quality to train your models or you’re going to end up with a very overfitted model, you know, that doesn’t generalize to the clinical use case. So, we’ve tried to be very deliberate in considering all of those things.
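
A simple way to avoid cherry-picking only the highest-quality scans is to sample the training set stratified by an image-quality grade so that it mirrors the clinical distribution. The sketch below assumes hypothetical quality grades and proportions; it illustrates the idea rather than any specific pipeline.

```python
import pandas as pd

# Hypothetical case table with an image-quality grade assigned at intake.
cases = pd.DataFrame({
    "case_id": range(1000),
    "quality": pd.Categorical(
        ["excellent"] * 250 + ["good"] * 450 + ["adequate"] * 225 + ["poor"] * 75
    ),
})

# Sample a training set that preserves the clinical quality distribution,
# rather than selecting only the highest-quality scans.
train = cases.groupby("quality", observed=True).sample(frac=0.8, random_state=0)
print(train["quality"].value_counts(normalize=True).round(2))
```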

[0:14:07.4] HC: Medical applications are very complex. How do you ensure that the technology your team develops will fit in with the clinical workflow and arrive at the right kind of assistance for doctors and patients?

[0:14:18.4] TV: Yeah, so, that’s something that’s very near and dear to my heart as a clinician and, you know, as a practicing cardiac imager and someone who spent years and years running a CT lab and reading CT scans. We know that we’re not going to be able to help patients if our technology does not improve efficiency and make the lives of patients easier, and especially the workflow easier for clinicians who are at the scanners, who are at the bedside taking care of patients. And so, if you think about it, it’s one thing to go out and create a great product, right? And say, “Hey, we’re super accurate at evaluating FFRCT,” but you’ve got to ensure that it’s easy to use and it gets the results back in a timely manner. And so the beauty of what we are creating is that, as with other products in the market now, the workflow of doing a scan and saying, “You know what? This is a scan where we need FFRCT,” as an example, “We need advanced plaque analysis or both,” is familiar.

The workflow has already been established, where sites are comfortable, with appropriate security and encryption measures in place, sending us data, and we can analyze that with a turnaround time that’s under two hours, to where they can get these results back in the hands of the decision-makers, and in the hands of the patients and their providers.

Now, I mentioned under two hours; obviously, we have plans to make that market-leading turnaround time even shorter going forward, and this is where the era of, you know, cloud computing and GPU technology is really, I think, going to revolutionize that turnaround time. So, we’re very cognizant of the importance of ensuring that the workflow makes things better, not worse, for everyone involved.

[0:15:57.6] HC: Thinking more broadly about what you’re doing at Elucid, how do you measure the impact of your technology?

[0:16:04.1] TV: Well, I think it’s measured in several layers. I think, first off, you know, you mentioned the validation: is it accurate? You know, there’s an appropriate level of healthy mistrust. As a clinician scientist, I was always taught, you know, show me the data, right? Show me the data, and so I think we want to be able to show people convincing data that what we are measuring, which is powered by AI and ML, has a high degree of confidence, can be trusted, and is shown to be accurate.

That’s I think one of the first things is building that trust. Now, from an impact standpoint in the clinical field, we know, for example, with FFRCT that if you’re doing this well, you can safely save patients trips to the cath lab, unnecessary trips to the cath lab.

If you look in the United States today, more than half of patients who go to the cath lab following stress testing have no significant coronary disease and did not need to go to the cath lab, and CT combined with FFRCT can reduce that number by 50 to 70%, if not higher. And so one of the things that we do is get feedback from our customers on how we’re doing, okay?

For patients who are using our technology, you know, patients who do go to the cath lab, how is the accuracy of our technology? We will use FFRCT, for example, and patients will go to the cath lab, they’ll have invasive FFR measured, and that lets us know how we’re doing, and we will continue to be a learning organization, but then of course, there are these additional studies as well.

You know, looking at things like efficiency of care and value-based care, and so by using a technology like this, what is the value you bring to organizations through better selecting patients for the cath lab and avoiding unnecessary procedures? Those are things that you can really measure. You can measure normal cath rates, you can measure the proportion of patients who go to the cath lab who need revascularization, who receive stenting, as well as the patients where you have concordance in your FFRCT value.

The other is plaque, and we know, again, when patients today are getting coronary CTs, the measure of plaque is currently semiquantitative, it’s kind of visual. It’s things like calcium scoring or just looking across the coronaries and describing in qualitative terms how much plaque you have, and we really believe in quantifying that. The impact of that is something that’s also important to measure, and so that’s where I think additional studies are needed, and we’re excited to do those additional studies, where you introduce these plaque values and you measure how patients and their providers are reacting to those.

Meaning, are they using it to change medical therapy and how they’re counseling patients? And so that’s another way, I think, to measure this. Now, we can measure this in obviously prospective studies, but also, I think, in real-world studies, because we have people who are already using this in clinical care, and so we want to know as a company how people are reacting to and utilizing this data we’re providing.
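
The kinds of real-world metrics mentioned here, such as normal cath rate, revascularization rate, and concordance between FFRCT and invasive FFR at the commonly used 0.80 threshold, could be tallied from a simple registry table. The sketch below uses made-up numbers and column names purely for illustration.

```python
import pandas as pd

# Hypothetical registry of patients referred to the cath lab after CT + FFRCT.
df = pd.DataFrame({
    "ffrct":          [0.72, 0.85, 0.78, 0.90, 0.65, 0.81],
    "ffr_invasive":   [0.70, 0.88, 0.83, 0.92, 0.68, 0.79],
    "obstructive":    [True, False, True, False, True, True],   # significant disease at cath
    "revascularized": [True, False, False, False, True, True],
})

normal_cath_rate = 1 - df["obstructive"].mean()
revasc_rate = df["revascularized"].mean()

# Concordance: do FFRCT and invasive FFR agree on the 0.80 ischemia threshold?
concordant = (df["ffrct"] <= 0.80) == (df["ffr_invasive"] <= 0.80)

print(f"normal cath rate: {normal_cath_rate:.0%}")
print(f"revascularization rate: {revasc_rate:.0%}")
print(f"FFRCT/FFR concordance: {concordant.mean():.0%}")
```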

[0:18:52.7] HC: Is there any advice you could offer to other leaders of AI-powered startups?

[0:18:56.7] TV: I think the key thing is that AI, at the end of the day, is a tool, right? It is a method to be more efficient, it is a method to build better models, to build better algorithms, and so ultimately, I think for any leaders of AI-powered startups or people who have great ideas, right? It’s the idea, right? And so keep the end state in mind of what you want to provide, and then AI is usually a way to do that.

It’s not the answer itself, and so I think it’s important to understand what AI really does for us, and also, I think it’s important to understand where you can get it wrong, right? We know that there can be overfitting and bias in AI, and it really starts with an intelligent design, and when I say intelligent design, I mean understanding, you know, what are the inputs to my model going to be, and what are the biases in those models?

Am I going to do just kind of black-box unsupervised learning? Well, if you’re going to do that, there may be value in that in certain applications, but you’ve got to realize there’s going to be a huge amount of data required to do that. At the end of the day, there are always some questions about, you know, the interpretability of such an approach, and you need to ensure generalizability. So, I think, just like any good clinical study, designing your AI technology is probably the most important thing.

You know, spend the time upfront to get it right, to ensure you’ve got the diversity that you need in your datasets and your algorithms, you know, to avoid bias and to ensure generalizability, and of course, you know, if you’re going to annotate data for supervised learning, it’s really important that you do that well. Again, we’ve all heard the concept of garbage in, garbage out, and it’s very true, and in medical imaging, I think it is especially true because there is a background level of noise in imaging and some degree of subjectivity. So, being, I think, really deliberate in that is going to pay off for these leaders.
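
One practical way to act on this advice is to audit the composition of a training set against the clinical population before training. The sketch below assumes a hypothetical case manifest and an illustrative expected sex distribution; it is an example of the kind of bias check described, not a prescribed method.

```python
import pandas as pd

# Hypothetical training-set manifest; audit its composition before training
# to catch obvious sources of bias (sex, age band, scanner vendor).
manifest = pd.DataFrame({
    "sex":      ["F", "M", "M", "F", "M", "M", "F", "M"],
    "age_band": ["40-55", "55-70", "70+", "55-70", "40-55", "55-70", "70+", "55-70"],
    "scanner":  ["vendor_a", "vendor_a", "vendor_b", "vendor_c",
                 "vendor_a", "vendor_b", "vendor_a", "vendor_a"],
})

# Expected clinical mix (illustrative numbers), e.g., from registry data.
expected_sex = pd.Series({"F": 0.45, "M": 0.55})

observed_sex = manifest["sex"].value_counts(normalize=True)
print("observed vs. expected sex mix:")
print(pd.concat([observed_sex, expected_sex], axis=1, keys=["observed", "expected"]))
print("\ncases per scanner vendor:")
print(manifest["scanner"].value_counts())
```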

[0:20:45.6] HC: And finally, where do you see the impact of Elucid in three to five years?

[0:20:49.8] TV: Well, I really think, you know, if you think about the field, I said we’re doing it wrong in the area of prevention. I mean, what we’re doing today is we’re taking patients and we’re saying, “Okay, how old are you? Do you smoke? What was your blood pressure today?” We ignore the prior severity of blood pressure, we don’t take into account where their lipids were over the course of their lifespan.

We ignore environmental and genetic interactions, and so where does Elucid fit? I think we’re moving to a field of prevention where we are utilizing plaque. The safety of coronary CT angiography is extremely well proven now. The radiation doses are extremely low, they’re as low as mammography if not lower, and so we now have the potential to enter an era, in three to five years, where the evidence supports it.

We can utilize plaque-based imaging to better care for patients, and Elucid can be part of that by measuring plaque in a quantitative way, as the only software that can do this validated against true histology, but also by being able to use that plaque to provide things like plaque risk and FFRCT. And so I think that’s where we want to be in three to five years, as well as expanding what we can offer to patients and their providers.

[0:22:06.3] HC: This has been great. Todd, you and your team at Elucid are doing some really interesting work for cardiovascular health. I expect that the insights you shared will be valuable to other AI companies. Where can people find out more about you online?

[0:22:18.4] TV: Please check us out, we’re at elucid.com, and look for us at medical imaging meetings and anything that has to do with heart attacks, strokes, or plaque.

[0:22:26.5] HC: Perfect. Thanks for joining me today.

[0:22:28.5] TV: Thanks so much for having me, Heather.

[0:22:30.4] HC: All right everyone, thanks for listening. I’m Heather Couture and I hope you join me again next time for Impact AI.

[END OF INTERVIEW]

[0:22:40.8] HC: Thank you for listening to Impact AI. If you enjoyed this episode, please subscribe and share with a friend, and if you’d like to learn more about computer vision applications for people and planetary health, you can sign up for my newsletter at pixelscientia.com/newsletter.

[END]