- Episode Topic
Artificial intelligence (AI) is computer software capable of handling new situations and finding its own answers, as opposed to relying on pre-programmed solutions. It is now being introduced into X-ray image processing, where it has the potential to dramatically change medicine and the work of X-ray techs.
- Lessons You’ll Learn
AI is changing how X-ray techs work with digital X-rays. Today's X-ray systems integrate advanced computer algorithms to enhance image interpretation. It is important to understand that AI will not replace the X-ray tech; rather, it works alongside them as an auxiliary tool that improves accuracy and speed, freeing them to spend more time on human value-add tasks such as patient care. AI software assists X-ray techs by highlighting potential anomalies on a radiograph, significantly reducing the time needed to read and interpret images. Learning how to use AI will be essential for all X-ray techs in the coming years.
- About Our Guest
We interviewed thought-provoking RadTech enthusiast Cécile Hourquet from Gleamer, a company that aims to support imaging users with a suite of artificial intelligence solutions covering a wide range of clinical applications.
- Topics Covered
This deep dive will explore all aspects of Gleamer's AI X-ray technology. We'll talk about how it works, why it's important, and what it means for the future of healthcare and X-ray technologists. Plus, we'll give you a sneak peek into some of Gleamer's exciting new developments, such as its built-in triage feature, which can recognize the roughly 70% of radiographs that are typically normal, thereby speeding up patient discharge. In ambiguous cases, the software can recommend a repeat radiograph, enhancing the efficiency and accuracy of the diagnostic process.
Our Guest: Cécile Hourquet, Gleamer AI
Today, we'll be delving into the exciting world of radiology and artificial intelligence with Cécile Hourquet, Marketing Manager at Gleamer AI, a company at the forefront of integrating artificial intelligence (AI) into radiology.
Before joining Gleamer AI, Cécile had a fascinating career improving everyday working conditions for lab techs through digitalization. She was drawn to Gleamer AI by the company's compelling mission and the vast potential of its product to transform healthcare for patients and medical staff.
[00:00:00] Cecile Hourquet
If we miss it at a moment that can be crucial for you, it can go deeper, and you can lose so many chances with just a six-month delay. Where the AI can help is, first of all, with locations that are not that easy, like behind the shoulder, where sometimes, with fatigue or whatever, you will not see all the nodules you could spot. But there is also the fact that sometimes you don't have only one nodule, and we see that with multiple findings, no matter which AI tool we are talking about. The AI helps a lot because you have satisfaction-of-search bias.
[00:00:40] Jennifer Callahan
Welcome to the Skeleton Crew. I'm your host, Jen Callahan, a technologist with ten-plus years' experience. In each episode, we explore the fast-paced, ever-changing, sometimes completely crazy field of radiology. We speak with technologists from all different modalities about their careers and education, the educators and leaders who are shaping the field today, and the business executives whose innovations are paving the future of radiology. This episode is brought to you by XrayTechnicianSchools.com. If you're considering a career in X-ray, visit XrayTechnicianSchools.com to explore schools and get honest information on career paths, salaries, and degree options. Hello, everybody. Welcome to the podcast, the Skeleton Crew.
[00:01:32] Jennifer Callahan
Thanks so much for being with us. Today our guest is Cécile Hourquet, joining us all the way from Paris, France. I'm truly honored that she's taken her time, and that we were able to coordinate our schedules with six hours between us. Cécile works for a company called Gleamer AI, which is at the forefront of integrating artificial intelligence into the world of radiology to aid in the accuracy of reading different radiographs. So she's going to talk to us today about everything that has gone into making such a wonderful company and all the advancements they're making in their technology. Thank you so much, Cécile, for being with us today.
[00:02:12] Cecile Hourquet
Oh, thank you, Jen. It’s a pleasure for me, too. I’m really happy to have the opportunity to, like, talk about Gleamer and what we do.
[00:02:19] Jennifer Callahan
Yeah, so let’s start there. Basically, if you could just give us a little background of yourself and then how you kind of found yourself working for such a forefront type of company that’s making so many advancements.
[00:02:32] Cecile Hourquet
First of all, maybe I'll share what I do at Gleamer. I'm the marketing manager, so my job is to talk about the product and know how to explain what we do, but also to help make a better product. So I'm also involved in development, pushing for what users are expecting and seeing what we can do going forward. Before joining Gleamer, I was doing the same thing but for lab techs: looking at how digitalization could improve their everyday work and their conditions in general. Then Gleamer showed up and offered me this job, and I found the mission so great. The people are really passionate about it, and the product itself is so nice, changing so many things for healthcare patients but also for workers.
[00:03:18] Jennifer Callahan
So let's talk a little bit, then, exactly about Gleamer and the artificial intelligence. Some people hear "artificial intelligence" and get a little freaked out, you know: is a computer just going to be reading my X-ray, my radiograph? Is there no human interaction here? But I was looking over the website last night, going over the information we'd be discussing today, and I really have to say that the stories associated with your company are astounding. So could you walk us through the development of the AI, from the beginning to where you are now?
[00:03:51] Cecile Hourquet
So, where do you start? It's a big journey, actually. Basically, the idea came first from our co-founders, Christian and Alexis, who wanted to build something in healthcare. They were really passionate about how AI can help, and if you want to understand how it came about, you have to understand the situation in France, how it works for us. We have a huge shortage of radiologists, so in some hospitals it's only physicians who are reading the X-rays, and they are good, but they don't have the experience they are supposed to have in MSK. So the founders saw that people were not being seen by radiologists and were not getting the same level of care depending on where they went; in France, obviously, Paris is better off than other regions with worse shortages. And they thought: why not help with AI? So they built this tool in collaboration with the third founder who arrived, Nor-Eddine, a radiologist specializing in MSK. He helped build this wonderful product, and they started with the idea of fracture detection. Little by little, the algorithm grew. What is really funny is the story they tell about when they started the algorithm: they knew it would help, since AI is really strong at finding things in images with machine learning, and because humans are human.
[00:05:15] Cecile Hourquet
Sometimes we are tired, or we are just overloaded with work and need to go too fast, or sometimes a finding is just really subtle, and the AI can assist with that. There is this funny story: they were thinking it would help by maybe increasing sensitivity 12%, like 12% fewer missed fractures. And when the clinical study found it was actually 30%, even they couldn't believe their AI was so powerful, helping even expert radiologists. So that's how it is. And you asked me whether it's meant to replace anyone. I totally understand; there is a full revolution going on with AI. Sometimes I joke that when people ask me what I do, it's as if I were working for Apple 20 years ago, when everybody knew about computers but didn't use them every day. I think in 20 years this will be totally normal, and we will have discovered how to work with it and what the benefits are. In our case it's different, because we are a companion algorithm. That means the algorithm can still make mistakes; it's there to speed up the reading time. It's there to spotlight things and say: I think there is something here. But the human is still there to decide whether it's correct or not.
[00:06:31] Jennifer Callahan
So basically, say I'm a radiologist sitting here and an X-ray gets passed through to me from a person in the ER; they fell and their hip hurts. I'm still going to do the regular reading on it, but to my plain eye looking at my screen, I do not see a fracture at all. I know from experience working in radiology that sometimes things are too subtle, as you said, or there might be swelling, or maybe fluid is distorting the image, so that a human looking at it might misread it at the initial viewing. So you're saying this software is a companion to the actual person reading it: they might do their reading and, with these different factors worked in, not believe there is a fracture, but then the second pass with Gleamer's AI software finds a small spot showing that, say, the neck of the hip is actually fractured. Just to explain to everyone what the software is doing: as Cécile is saying, things like these can be missed on radiographs, not because the radiologist is doing a poor job or isn't trained properly, but because of these different factors, like, as I said, swelling and patient motion. Sometimes patients are in a lot of pain and moving around, and the image isn't the best. And sometimes, I know from experience working in an ER as a rad tech, you're just getting the best picture you can because the patient's pain level is too high. So do you want to talk a little more about how the companionship between the radiologist and the software progresses from there?
[00:08:24] Cecile Hourquet
You described it very well. Basically, we take the DICOM, the algorithm analyzes it and sends back the results, and people can see first the DICOM without the findings, and then the annotated image is added through the PACS. So it's image processing: the algorithm learns patterns at the pixel level, if we can say it like that. As you say, there may be an effusion, and the algorithm has learned that when this pattern is present, it means that finding. To give you an example, you talked about the hip: some people can be confused and think it's arthrosis when in reality it's a fracture. It's exactly in this type of case that the algorithm can help, when they think there is something but can't really see it properly. It's like a companion; you feel more reassured. Sometimes it helps you spot something you didn't. For example, we have FDA approval for kids: a child may say, my pain is here, when in reality it's at the end of the metatarsal, things like that. It really helps save time by spotting things that are hard with kids, or with older people who have a tendency to have trouble describing where the pain is. It's exactly that.
[00:09:37] Cecile Hourquet
It's about saving time and reinforcing what people can see, but also about the negative side. In terms of throughput, and you started to get at this subject, which is really interesting, the algorithm also has a triage aspect. Normally around 70% of your patients are normal; luckily, I hope. Obviously if you work in the ER maybe it's different, but that is what we call the natural prevalence. So it's interesting that these 70% of patients can be flagged as negative, and you can save time knowing that and discharge them. When you consider that a patient staying in the department also has a cost for the health system, it's really important to speed things up so you can better treat the patients who need care right now. Then you have the second aspect: the patients who are unfortunately positive, who had a bad day. It's also about treating them better and not missing them, which is important. And for doubtful cases, it also helps you say, okay, for this person maybe I can redo a radiograph, and again save time in the process.
[00:10:49] Jennifer Callahan
Right. Let's touch on, so you have the BoneView trauma AI software, but what I found particularly interesting was ChestView AI. For everyone listening, ChestView AI and BoneView are similar in that they both use an algorithm to look at these images. But I found ChestView personally really interesting because, in some of the case studies on Gleamer's website, it can detect cancer far earlier than, I hate to say it, a human possibly could, because something can be so minute and so small that a normal human eye could miss it, and it wouldn't be picked up until it grew larger. So, Cécile, do you want to go over one of those case studies for us and talk about it more in depth?
[00:11:37] Cecile Hourquet
Well, it's difficult because we don't have the image here, so you will have to trust me. Yeah, it's not like fractures; here we are dealing with cancer, where if we miss it at a moment that can be crucial for you, it can go deeper, and you can lose so many chances with just a six-month delay. Where the AI can help is, first of all, with locations that are not that easy, like behind the shoulder, where sometimes, with fatigue or whatever, you will not see all the nodules you could spot. But there is also the fact that sometimes you don't have only one nodule, and we see that with multiple findings, no matter which tool we are talking about. The AI helps a lot because you have satisfaction-of-search bias: you think you found one, you check that one, and in reality there was another one that was the malignant one. And unfortunately for us, it sometimes happens that when we install the tool, people need just one nodule that would have been missed to really trust the algorithm. This case study is exactly that: someone was missed, the person came back later, and it was too late, or already far advanced. It happens to us, say, three times a year that people ask us: can you check this old file I have? Because I wonder if we would have done things differently. And it's actually pretty rare that the algorithm doesn't spot it. Sometimes we don't know why it was missed; maybe they were tired. Humans are human; mistakes can happen every day, and that is exactly what we try to avoid. It's what we call the safety net.
[00:13:17] Jennifer Callahan
Actually, to talk about that: is there a false-positive rate for the AI software? Does it ever highlight something it thinks might be abnormal when it's not?
[00:13:30] Cecile Hourquet
Yeah, it can happen. We did a huge clinical study that won the Margulis Award last year, which was tremendous for us, and we showed that the algorithm can get things wrong. It happens. Why? Because, as you said, you can have artifacts, for example; it's still image processing. That's why the human will always be at the center of the decision. Some algorithms try to, I would say, optimize the reading, but in reality they only do it for 30% of cases, where maybe 70% are normal, because the AI can see plenty of things. After that, it depends on the quality. It's actually our job to make it as precise as possible, but you have to consider, first of all, the number of findings you are trying to screen; it has to be targeted, because the more findings you target, the more false positives you will have. And you definitely need to check the clinical studies. This is where we have to work our hardest, and I think we need to be honest about the fact that, yes, AI can produce false negatives and false positives. It's our job to avoid them, to make it better and better. But an image will always be an image; if it's taken differently, it will look different, right?
[00:14:43] Jennifer Callahan
So, as I was saying, the company is based out of France. Is this software currently being utilized by many clinicians in France at this point?
[00:14:53] Cecile Hourquet
Oh, it's actually in use in a lot of countries, including the US. We have our FDA approval, which we're really proud of, for BoneView Trauma, and for all the other systems, yes, we are CE marked. Today we have around 650 sites, institutions really, using the solution in 24 countries in total. Mainly Europe, with some in Australia and the US.
[00:15:17] Jennifer Callahan
Wow, that’s awesome.
[00:15:19] Cecile Hourquet
Yeah, we are really proud of that. We are among the top five producers.
[00:15:23] Jennifer Callahan
Okay. So are there other companies out there with software like this that would be competitors of yours?
[00:15:29] Cecile Hourquet
Of course, like everywhere. You have to understand that with AI there is a lot of competition, so we are really proud to have such numbers; it's also, I would say, recognition of the hard work we put in to have a good algorithm. There are also a lot of different specialties out there, like mammography and others.
[00:15:49] Jennifer Callahan
So I have to say I feel kind of silly that I never even thought this type of software existed. I hate to say it: I never even thought of it. So when I was told I would be interviewing you, as I said, I went to the website and thought, wow, this type of software is amazing. So the software currently does the bone trauma and the chest view. Am I missing anything that has been developed and maybe isn't as popular as those two programs? Or is there anything Gleamer is currently working on?
[00:16:19] Cecile Hourquet
We have two algorithms which were released at the beginning of the year. One is the automation of MSK measurements. I know the way we do measurements in Europe versus the US is totally different: in the US you will mainly do maybe the Cobb angle and things like that, whereas in France, for example, we do them all for the orthopedists, and it's actually the radiographers who make the MSK measurements. So it's totally different from the US. For them it's a big thing, because they can optimize it within the workflow, so the experience gets better. And we also have the automation of bone age assessment, using the Greulich and Pyle method, right?
[00:17:03] Jennifer Callahan
And for anyone listening who isn't familiar with bone age: generally it's a study, mostly for children, of how they are growing in relation to their age. You do a radiograph of the left hand, and it's read in terms of bone age. A lot of times it has to do with children who are prematurely going through puberty, or who seem to be a little small or are way under the height they should be for their age.
[00:17:34] Cecile Hourquet
You've got it. The most powerful use is to see if there is a delay, for example, of puberty, but also to know what height they will reach.
[00:17:43] Cecile Hourquet
So, for example, for kids who are doing sports, they want to have a guess: if they play basketball, will they be tall or not, things like that. Yeah.
[00:17:54] Jennifer Callahan
And they're looking at, I guess, the growth plates in the hand. Yeah.
[00:17:57] Cecile Hourquet
Yeah. So it's an automatic detection. Basically, there's a book; it's a really old process that comes from the US, where you have this atlas and you have to check how the hand looks compared to the atlas. I know radiologists don't like that; it's really long, and they never know where the book is. So they are really happy that we automated it.
[00:18:15] Jennifer Callahan
So do you have, I guess, feedback from the different sites the software has been incorporated into? How has it changed the workflow at those sites?
[00:18:26] Cecile Hourquet
Well, it depends whether it's a hospital, a clinic, a private center, or even teleradiology; they don't have the same goals, I would say. For example, in a hospital you want to discharge, you want to do triage in a better way, right? So you want to save time and save space. We also get a lot of feedback about how it improves the relationship between radiologists and physicians, because every physician can see the results and feels more comfortable. They feel they can handle more things, and there is less back and forth, you know, when radiologists are overloaded and don't answer. They feel: okay, I already have a companion helping me make an informed guess, and I can already ask for an extra exam, a CT scan for example, to check things the algorithm found. So in general, I would say the patient journey is faster. Reading time has been proven to be shortened for all types of radiologists: around 35% of reading time saved for trauma and 26% for chest. For one radiograph that's nothing, but added up over the whole workflow you do every day, it leaves more time for the people reading to care about the patient, to go see them, and to do extra things. There are also benefits for the patients: as I said at the beginning, we help detect 36% more pneumothoraces, for example, and with such an urgent pattern, you want to be as fast as you can for the patient.
[00:19:54] Jennifer Callahan
Oh, sure. So as you've gone through this and integrated into different institutions, have you come across any challenges with the software and the people using it, any feedback you've gotten?
[00:20:07] Cecile Hourquet
Yes. I hope I won't have issues saying this, but I would say the first challenge for us is that we have to prove ourselves to the system, and that's totally normal, actually. We have to prove that we are safe. For example, we need to be HIPAA compliant because it's patient data. Basically, we anonymize the patient data on the image, the DICOM, send it to the cloud, and send the results back. Sometimes we can do it on site, on premise as we say, where nothing leaves the institution. First of all, we have to prove that we are not going to reuse the data, which isn't even interesting for us if you dig deeper. So first we need to enter into the environment of a hospital or a center. And then you have people who are, I don't know if we should say skeptical or afraid; it depends. It can be people who just don't think it will help them: they are experts, they don't see the point. And as I told you, they just need to see a few cases they would have missed. Experts sometimes tell us they feel the algorithm is as good as they are, but they also really care about the residents on their team and all the people who don't have their level, and they feel more confident, because those people also come to them less. So we always say: you have to try it and you will see. You will see that it's not here to replace you. Sometimes you will say the AI is wrong, but it will just help you every day.
[00:21:36] Jennifer Callahan
Does your company allow that? Would you do a trial run for them to try it out, to see if it's something for them? Yeah.
[00:21:42] Cecile Hourquet
Yeah, they can try it. We know that with everything that is not physical, you have to understand what it is, how it works, and how it can help you. So it's totally normal to do this type of thing.
[00:21:55] Jennifer Callahan
All right, so let's lean into the radiographers, people, I guess, such as myself; as I said, I wasn't even aware this software was out there, which I feel really kind of silly about. Do you think it's something radiographers should be made aware of when they're within institutions or systems using technology like this?
[00:22:14] Cecile Hourquet
It's true that if you look at the AI available, radiographers have been left behind a little bit, because your daily job doesn't necessarily need it. There is AI that detects, for example, the left and the right for you, or that tells you the image quality. But I would say what can be interesting in any case is that when you have a patient in front of you, you wait one minute, because it's not more than one minute, and you have the results while the patient is still with you. Just being able to go to the patient and ask, is the pain here? can be so helpful; we all save time. And also, the radiographer might think: okay, the patient is still here, so maybe I need to take a second acquisition to be sure, and that way the radiologist will have everything. What I like about AI is that it gives more responsibility, or puts it to better use; there are studies showing that AI teaches people and makes them even better at detecting and making informed decisions. Maybe you need to ask your boss, if you're interested: oh, if I had it, we would all save time, and maybe I'd have more responsibility, which would be nice for me. It's about how we can give a little bit more. For example, in the UK, at the NHS, some radiographers do part of the reporting. So it can be interesting to move toward a job that is also about being with the patient.
[00:23:38] Jennifer Callahan
Right. I mean, it's definitely helpful. I could see it being helpful for radiographers in, say, an emergency room situation, because a patient might have pain at their knee, but maybe it's actually radiating down from their leg. A lot of times, I mean, I've experienced it, you might try to keep your collimation open. I know that as radiographers we're taught to cone it down, keep it nice and tight, because of radiation, but sometimes if you know their pain is somewhere more in the middle of the leg, you might keep it open, and then all of a sudden you might see a fracture there. So it reduces time, like you said, because you're going to pick patients up, bring them back, and do multiple studies. So having that information available is definitely helpful. Would you say this technology is user-friendly for the radiologists using it?
[00:24:25] Cecile Hourquet
As I told you, it's fully integrated into the PACS, so you don't need another tool. You just have access and you can check, and it takes maybe one minute, or if your system is really, really slow, three minutes maximum. So it may depend on the manufacturer you use. But we really focus on design; it's really one of our differentiating factors. We have a head of design on board, for example, working on making it clear even if you're colorblind. We really try to make something where, when you open it, it pops: it's yellow when it's positive, blue when it's negative, nothing that will tire you out. We really try to do something easy to understand and straightforward. That's also how we save time: we push the boundaries of design so you really have the information in one look.
[00:25:13] Jennifer Callahan
All right. So in the institutions or health systems using this, it's used on every single radiograph that comes across; any X-ray they're reading automatically goes through it, so they don't have to stop and enter the patient's information and the accession number and then send it somewhere different. It's just fully automated. Yeah, that's great. I mean, it's a time saver right there, definitely, and probably a selling point for the software, because an extra step like the one I just described might only take 30 seconds, but if you already have a large workflow you're looking at, that extra 30 seconds is actually a big deal, I hate to say it.
[00:25:53] Cecile Hourquet
Yeah, yeah. That's what we heard a lot from the physicians. For example, they told us: the fact that you are just inside my workflow makes the difference, because I have no time. I don't want to log in to check a platform; I don't want to have to send anything. This is really where we had to find the best way to integrate without being an extra tool. That's exactly what nobody wants anymore: extra app after extra app.
[00:26:18] Jennifer Callahan
Yes. As we were saying, there are a few other companies within the United States, and possibly over in Europe as well, that are developing AI software like this. What would you say makes Gleamer stand apart from them?
[00:26:32] Cecile Hourquet
As I told you, FDA approval is hard to get, and for fracture detection we are the only one approved for both kids and adults. That type of thing makes a difference. But I would say that where we really differentiate ourselves is definitely the clinical studies. Today we have nine papers, all of them in really nice peer-reviewed journals; we had two papers in Radiology. Proving what we do, with really strict study design, is where we always work really hard. As I told you just before, we won the Margulis Award, and we were working with Boston University, where the professors were really strict. I think the design of the studies also plays a part; we really work on that. You think we used an enriched 50/50 data set? Okay, let's do it with data from 70 centers over three months, at natural prevalence, and we find the same results. So we always bring back evidence showing that what we say is true. It's also good for us, because we improve at the same time; you get to know your algorithm better. We are well known for that, I would say: for the design, as I told you before, for the scientific validation, and for the quality, because rigorous scientific validation would not be a success if it only proved the algorithm wasn't really efficient. But it's also because we say we want the level of an expert; we want the AI to be medical-grade. And that's exactly why we did it: to prove it.
[00:27:57] Jennifer Callahan
So we've basically been talking mostly about this being used on chest X-rays and different bone views, in terms of X-rays. Do you think the company might branch out into CT scans and MRIs in the future?
[00:28:13] Cecile Hourquet
Yes, we will. I can say it officially: we just signed some fundraising, which we need, of course, to take this step. We are almost done, I would say, with the X-ray family; it took a lot of expertise, and we are really happy with what we did. But we are going to continue with mammography first, because it was important to us. Unfortunately, I think women's health is sometimes left behind a little bit, even if there is good AI in that space, and it's something I'm really thrilled to go into. And we are going to do oncology for CT; that will be the first step.
[00:28:47] Jennifer Callahan
Okay. I'm sure you'll have to get FDA approval for mammography, I assume, as usual, and I would think the approval for that would be pretty stringent, just because of the prevalence of breast cancer being detected at this point. So that's great. I mean, there have already been so many strides with the technology, the imaging, and the digitalization; to have another tool like that, an additional advancement in the field, is fantastic.
[00:29:16] Cecile Hourquet
I hope so. It's a big challenge, because as I told you, we don't release any algorithm that isn't at a good level. So it's a challenge for us, but I'm pretty confident. Well, this was great.
[00:29:28] Jennifer Callahan
Thank you so much, Cécile, for taking your time to be with me today. I actually hope we speak again in the future when the other programs are released, to talk about what's going on with Gleamer as time goes on, because you guys sound like a fantastic company.
[00:29:41] Cecile Hourquet
Thank you so much. You know, the company is full of really inspirational individuals. It's a crazy adventure for us, and we are really proud of where we are. I hope that next time you will have heard much more about Gleamer, and that all the radiographers out there will be like, oh, I know a lot about AI now. Yeah, I'm pretty confident in the future. Let's have another meeting in one year, maybe.
[00:30:05] Jennifer Callahan
Yeah, that'd be awesome. Thanks for being with myself and Cécile today as we discussed artificial intelligence in the world of radiology; it was definitely a great conversation. Thank you so much, Cécile, for taking the time today.
[00:30:25] Jennifer Callahan
You've been listening to the Skeleton Crew, brought to you by XrayTechnicianSchools.com. Join us on the next episode to explore the present and future of the rad tech career and the field of radiology.