Conversation with the Experts — Session #2 - 2022 ...
Video Transcription
»» Welcome back. We hope you had a great lunch and you're having a wonderful day. We'd like to welcome you back to Sessions 3 and 4 for conversations with our experts. So we have a completely different set of topics today. We're going to be talking about some cool future enhancements from NCDR; we'll be talking with John and Miriam about that. Creating a QI process to ensure quality data. Using a QI project for overall composite score improvement. And using data to improve quality and performance. And then the last half of our session today is just going to roll into the full hour; we're not going to stop in between. So in Session 4, we'll be talking about leveraging your dashboard for process improvement and having quality data for the accreditation process. We have an expert on the panel for that as well. So welcome back. I'd like to once again introduce the panel. But before I do, I want to do kind of a little PSA here. There's going to be a networking event tonight. It's called the Live Welcome Networking Event: Network Under the California Sunset, from 5 to 7. So I guess that means the California sunset happens between 5 and 7, I'm not sure. It's going to be on the Plaza pool deck. So make sure you put that on your calendars. If you need a reminder, it's in your app under Full Schedule and then Browse Mealtimes. But please make sure you go. Network with your friends, your colleagues, and enjoy the food and refreshments that we have for you tonight. So I'd like to actually introduce the panel. Right here to my immediate right, from ACC, is Mrs. Denise Pond. I never call her Mrs.; I don't know why I said that. From UHC Memorial Colorado, we have Donna Foster. From UHC Memorial Colorado, we also have Lisa Myers. Wave there, Lisa. From ACC, we have Kristen Young. From ACC, we have Miriam Nunez. And from Q-Centrix, we have Shelly Pifer. And from El Camino Health, Kay Jaramillo. I practiced Jaramillo like 25 times in the mirror because it's spelled with a J, and she said don't call me Jaramillo. So I sit at home now going Jaramillo, Jaramillo. We're happy to have you. And from ACC, we have John Durroute. And on the end there, from University of Maryland Baltimore, we have Selena Yedla. So I'm going to quit taking up their time now and turn it over to John Durroute and Miriam Nunez, and they will actually be talking to you about some exciting updates we have. Thank you very much, David, and welcome back, everyone who was able to join us for the first two sessions. So regarding future enhancements, going back to the participant survey, there was a recurrent theme in the feedback that we received in the most recent cycle. And that was that there's a little too much time spent going to different resources to find answers to questions. So I want to acknowledge that we do educate in several different formats, the benefit of that being that you reach different people, and people learn differently. So there's a benefit to using those different formats. But on the flip side, sometimes the platform technology hasn't made it entirely feasible to keep everything centralized in one nice, neat package. But do know that we're working on this. So to that end, from feedback through the survey and just from talking to you all, we understand the value of our frequently asked questions in each registry as far as helping to clarify the intent of data elements and sort of clean up some of the gray area that inevitably exists from time to time.
So we are going to be taking some steps to incorporate the data element-specific FAQs into the data dictionary under each applicable data element in each registry. So what does that mean for you? Simply put, it will make it so you have one less place to look for answers to coding questions. There will be the data dictionary as it always is, and then any time there's an FAQ that speaks to that data element, it will be listed right there in the data dictionary. Yeah, it's going to be very good. Yes, yes. And so will there still be an FAQ database? Yes, there will, but it likely will have those FAQs removed, because now they're going to be under the data element in the data dictionary, so there's not really any reason for the duplication, you know, having them in both places. It will still be there, though, because there are some FAQs that don't actually apply to specific data elements but speak to metrics or other kinds of frequently asked questions that are not data element-specific. So it will be maintained in that way. Will the data dictionary be posted more frequently with updates? Yes, absolutely. That will be sort of on our end. By nature of implementing this, we're going to have to be updating the data dictionary much more frequently and obviously making sure that you're aware of that with our announcements and all of our usual communication pathways. So when will this be happening? Soon. We don't have an exact date yet, so I don't want to commit to anything, but it is in the works and it will be rolled out relatively soon, and we'll be sure to communicate as soon as we know a little bit more. Miriam. Hello, everyone. It's great to be part of this panel today. I want to go ahead and share some information about the Contact Us feature. We do have some enhancements coming up. But first, for those that don't know what the Contact Us feature is, it is the newest NCDR communication method that can be used for you to submit your inquiries. It's very simple: all you have to do is log in to the registry. On the left-hand side, you'll see where it has the Contact Us link. Once you click on that, all you have to do is enter a subject line and put in your inquiry, and it will send you an email with the case number. Once you receive that case number, it can be used to refer to if you need to follow up on the case or if you have any questions about the case. But we have heard some feedback from you in regards to some updates that you would like to see and also what more you would like to be part of this new communication method. And I just want to let you know that we have heard you. We actually are working on those enhancements, and if everything goes as planned, maybe within the next few months, you will be able to actually go into these cases and filter the cases. You'll be able to add comments, and there's not a limit to comments; you can go in and add as many comments as you like. If there's an open case, you can actually close the case. So for example, if you decide that you no longer need assistance with a case, say you've resolved or figured out the issue, you can just go in and close your own case. Also, most importantly, you will be able to add attachments. So any attachments, if you want to send screenshots and so on, you can go ahead and attach them to the case as well.
We want to encourage everyone to go ahead and use the Contact Us feature, as it is going to be the future of NCDR communication and it's a great tool. It's very user-friendly, and again, you know, if you have any other questions in regards to the Contact Us feature and so on, you can go ahead and reach out to most of us here at ACC; we're knowledgeable about it. So those are the updates that I have for the Contact Us feature. Do you have any questions? »» No. We do know that the IRR process is a very important process to a lot of you. So Shelly, can you tell us how you do that with your company? »» Oh, sure. Sure. I can do that. So at Q-Centrix, we've committed ourselves to a very intense IRR process so that we can ensure quality data for all of our partners. And so we decided long ago to sort of track ourselves the same way and sort of imitate what NCDR does with their IRR process, as far as using it not in a punitive nature at all, but in a very educational manner. So we actually have two main parts to our IRR process. We have one for our new partners, and then we have another one for existing partners. So basically what happens is, when a new partner comes on board, we assign a team of abstractors to that partner, and each abstractor then must complete five cases in a row with a 97% or better match rate. And even if they get to case number five and they're below 97%, they have to start all over again. And we don't do that in a punitive nature. We do it so that we have the opportunity to really spend time with each abstractor and educate them, especially where we see them sort of trending in various data elements that might be too difficult for them to understand at the time. So we spend a little extra time with them and move them along in that process. We re-abstract every single case for these new teams, and we have clinical data leads that do that. And they will send the case back to the original abstractor along with some feedback on the mismatched data elements, where the information was found, and why it was a mismatch. And that just leads us to opportunities for education. So once the abstractor has performed five cases in a row at 97% or better, then we move that abstractor into just doing their cases on their own and not having to send them to a clinical data lead. And then we have another process for our existing partners, and that is called our Monthly Ongoing IRR. So our company uses an algorithm, and it randomly pulls these cases for re-abstraction. And again, our clinical data leads re-abstract the cases from beginning to end and send the information for any mismatched data elements back to the original abstractor. It doesn't mean that they have to have a 97% or better, although that's our goal. So if we find that there is someone trending below 97%, we take those cases and we look at, okay, is there one area that they're having difficulty in, or is it just all over the place? So, you know, if it's all over the place, then we might have to place that particular person on what we call a focused review, whereby they then begin the green-lighting process all over again. But if it's dealing with a specific data element or a section of the abstraction, then we really dive into that with them. I find that, you know, this really supports the overall process. And whether you're a single hospital or a hospital system or a company like ours, you know, we always strive to have consistent quality data. Because you all know: what you put in is what you get out.
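As an illustrative aside, the green-lighting rule described here, five consecutive re-abstracted cases each at a 97% or better element match rate, can be sketched roughly as follows. This is a minimal, hypothetical sketch, not Q-Centrix's or NCDR's actual tooling; the function names, threshold constants, and data layout are assumptions for illustration only.

```python
# Hypothetical sketch of a "green-lighting" IRR rule: a new abstractor works
# independently only after five consecutive re-abstracted cases each reach a
# 97% or better data-element match rate; one miss resets the streak.

MATCH_THRESHOLD = 0.97   # 97% or better match rate per case (assumed constant)
REQUIRED_STREAK = 5      # five cases in a row (assumed constant)

def case_match_rate(original: dict, reabstracted: dict) -> float:
    """Share of data elements coded identically by the abstractor and the data lead."""
    elements = original.keys() & reabstracted.keys()
    if not elements:
        return 0.0
    matches = sum(1 for e in elements if original[e] == reabstracted[e])
    return matches / len(elements)

def is_green_lighted(case_match_rates: list[float]) -> bool:
    """True once the most recent REQUIRED_STREAK cases all meet the threshold;
    a single case below the threshold resets the streak, per the description above."""
    streak = 0
    for rate in case_match_rates:  # oldest to newest
        streak = streak + 1 if rate >= MATCH_THRESHOLD else 0
    return streak >= REQUIRED_STREAK

# Example: the fourth case misses 97%, so the streak restarts from zero.
rates = [0.99, 0.98, 1.00, 0.95, 0.99, 0.98, 0.97, 1.00, 0.99]
print(is_green_lighted(rates))  # True: the last five cases are all >= 0.97
```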
So you know, we want to ensure that our work is at the highest quality level possible. So Donna, since you're the only CPMI abstractor at your hospital, how do you incorporate the IRR process, and how do you share that information with your hospital system? »» Sure. Well, historically, yes, you are correct. I have been the only chest pain MI abstractor since 2012, I believe. And so, you know, you would have liked to get somebody within the department to actually take a few charts, look at them, and validate what I've done. But they're inundated with their own work. And so I had to be kind of creative with how I did it. So what I did is I developed a spreadsheet that has every single patient on it, and I divide it quarterly. And the last tab that I create is my fallout tab. And so each one of those patients is on there with the valid or pertinent data points that I think need to be seen and what is actually measured. And like I said, when I go through the DQR process and look at the dashboard when it comes back, I highlight those things that I found were falling out. And I actually project what I think is going to fall out as I'm abstracting it. So when I get that back and it's not coming up as a fallout, I question, what did I not understand? Why isn't it coming up? And so I'll go ahead and look back at the companion guide and look at the exceptions and the exclusions. So let's say it was a risk score that wasn't on there, and I projected it to fall out for that. Well, when I look at it, oh, it's an unstable angina patient; that's why it didn't fall out. So I do that process. Once I'm done with it, though, I send it on over to Lisa. We load it up into our Teams and she's able to go through it and look at it. And it brings up some quite interesting conversations between the two of us as she challenges my knowledge of the definitions. And sometimes, like John was saying, there are some gray areas. And so that prompts me to send in an FAQ to get clarification. And sometimes Lisa's actually right. But about a year ago, our hospital system merged: the northern region merged with the southern region in the quality department. And so now there are actually four of us who do chest pain MI, and that has been great. We've now formed meetings every month where we get together, and we downloaded the IRR audit tool out of the NCDR resources tab. And we'll select a patient that we thought might be difficult; at least that's what I do. I take a patient where I thought there was a lot going on, and I just enjoy the validation of knowing I did it right. So if there's any question in my mind that I didn't do it right, then that would be a patient I would select. And all of us would abstract it prior to getting together. And you know, it's been great. We go over it. We agree 99.9% of the time. Every once in a while there is something that comes up. You know, all of you who have been doing this for a while realize you kind of get in this little box of what you think that definition is. You've read it a hundred times and you think you know it. And it brings up a lot of insightful conversation. And this particular patient, I'll give an example, was seen in the emergency room. And the MD or the APP did not document hyperlipidemia. However, we noticed that the patient actually had a lab value that was done a couple of months prior to their visit. And those lab values actually lay within the definition that was stated.
We didn't all agree that we could do that. You know, as a data specialist abstractor, you don't want to be the one to make a clinical diagnosis. You know, it's just not something you want to do. It has to be stated, in your mind, in black and white. And so we sent in an FAQ, and thank you, we got it back very quickly; I think within a few hours we got our response. And it validated that if in fact the values from the blood work fall within the parameters stated in the definition, then yes, you could in fact code yes to hyperlipidemia or dyslipidemia. So you know, it brings on a lot of very insightful conversations, and we learn a lot from each other. So I've been very thankful to finally have that process. Can I ask all the panelists? So I'm curious, can you give me an example of how you use QI projects to improve measures and metrics? And Lisa, I know you've got a good example. »» Absolutely. We started noticing in our metrics that some of our doctors were not documenting all the medications that were proper for guideline therapy. So as we started looking at this, we tried to figure out how we could get this metric up and make it better. Between the cardiologists, the APPs, all the intensivists, and everyone, we saw that this was an issue. So what did we do? We got together. We talked to everybody about it. And then we figured out the easiest way to deal with it was making a cheat sheet, so to speak. So we made badge buddies. And on our badge buddies, we put everything that's required for discharge. So now you have a badge buddy that they can wear. And I made it specific for physicians, and I actually gave them to the nurses as well. They're discharging the patient. A nurse is perfectly capable of saying, whoa, dude, you didn't order this, what do we want to do? And calling them before the patient goes home and we have a discharge. So with the badge buddies, we noticed that our numbers were getting much better. We were seeing that the doctors were actually documenting all the meds, or actually making a note in the chart if they didn't give a med. So we saw our metrics go up considerably, and it helped with everything across the board. Not to mention, it goes back to what we were saying in Sessions 1 and 2: it's what's best for the patient. And you know, in the end, that's what this all comes down to, what's best for the patient. Selena? Yeah, I can speak to what we do at our hospital. As you all know, there's a growing recognition of the risk of contrast-associated kidney injury for patients undergoing coronary interventions in the cath lab, as well as strategies that can be employed to limit the damage for those most at risk. In the past, interventional cardiologists did not have such good awareness; you know, they were more concentrated on treating the coronaries or the heart. But today, I think all of the cardiologists have a very good knowledge of this risk to the kidneys and the poor outcomes. So we are performing preventive strategies. As we know, AKI increases length of stay, increases 30-day readmissions, and leads to poor clinical outcomes. So we really concentrated on minimizing our contrast administration and on intravenous fluid hydration. Again, in more challenging, acute settings, we don't have much control over it.
You know, if a patient is coming in with an acute MI, we can't say we have to have an hour to hydrate the patient when the patient just needs to go to the cath lab right away. But for our other stable or elective cases, you know, we are very conscious of our choice of contrast media. We minimize the contrast media, especially for the ones that need to have CTs done prior to going to the cath lab, you know, again under the clinical suspicion of an aortic dissection. The ED physicians and the interventional cardiologists are constantly having these conversations until the patient is taken to the cath lab, quantifying the maximum safe volume that's been proposed. We also try to hold non-essential nephrotoxic drugs 24 hours prior to the procedures. And at University of Maryland Baltimore Medical Center, we discuss our outcome metrics, and they revealed our risk-adjusted AKI for quarter three of 2020 was 5.72%. We also saw that our IV hydration rate was only 11%. So we really took that into consideration: we didn't have a standardized process, and most of our providers were not, you know, taking ownership of pre-procedural hydration orders. So then we collaborated with the IT team and the clinical informatics team, and we developed a pre-procedural CIN-avoidance IV hydration order set with a hard stop to order hydration or indicate the reason it was not ordered. So this really helped us. And after the order set was implemented in our EMR, we went back and did a six-week retrospective data collection, and we saw that post-implementation, our AKI rate had gone down to 1.9% and our pre-procedural IV hydration orders increased from 11% to 55%. So that really worked out, and it was a good QI project that we worked on. Great. So at El Camino Hospital, again in the Heart and Vascular Institute, we use QI all the time. It's our purpose and it's our focus. And really, in our institution, the process starts with making sure that the data is reliable. That's really thanks to our amazing abstractors. It's also because of our concurrent review process that we have with our medical director. We have it every other Friday, and when it's payday Friday, it's really awesome. But that meeting is so good because that's where we can identify the cases and some of the clinical pearls that we really need to share with our other providers at our quarterly multidisciplinary meeting. And it's just amazing, because I'm lucky to have a medical director who really loves to teach. At our hospital, we don't have residents and we don't have fellows. So our medical director relies on the nurses and then our data specialists to create the slide decks, pull out the cines, and basically pull the case conference together. And put on that series of education, which is so important when we're trying to change practice. For example, when NCDR had the campaign for reducing bleeds in PCI, it was this physician champion who gave the talks, you know, about bleeding, that really influenced other physicians to change their practice: to adopt a radial-first approach, to go back to the basics of using weight-based heparin and reserving the IIb/IIIa inhibitors for other things. And, oh, the basics of just femoral access: using ultrasound guidance, using fluoroscopy to visualize the top of the femoral head, and using a micropuncture needle. Just his commitment to those talks really helped engage all the providers.
We went from having, you know, one radialist to all of our interventionalists going radial. And so in addition to having a physician champion, another QI we had was around door-to-EKG times in the ER. During the pandemic, our ER spent quite a bit of time figuring out the triage and the pathway workflows for suspected COVID and non-COVID. And by showing that clinical detail line in the chest pain MI report, our clinical manager took it to task and really worked with the EKG department and his whole team to make sure that every single one of our chest pain patients walking in with possible ACS was getting that door-to-EKG in less than 10 minutes. I think during COVID, like in April of 2020, it was around 30%, as that graph showed. And since then, they've just improved every month, and now it's sustaining at 75% of all ED chest pain walk-in patients getting an EKG within 10 minutes. And so, yeah, it's really important to have a physician champion. So I guess my takeaway is, it's about reliable data and then really working on the relationships with the departments to make QI happen. So true, so true. Well, Selena and Kay and Lisa, you've all given really good examples of QI projects, and I really appreciate hearing all of it. I kinda wanna switch gears on you, though, and maybe put you all to a little bit of a test. So I'm gonna describe a metric that we improved with our partners, in the form of a question. This happens to be an example of a class I ACC recommendation. At last year's Quality Summit, Dr. Jerry Blackwell, I believe was his name, did a wonderful presentation on this topic, and he spoke about its benefits: that it's been shown to reduce symptoms and improve patients' overall quality of life and sense of well-being, and furthermore, it's a revenue center. Following PCI or MI, it has the least amount of side effects and a higher risk reduction than all guideline-directed medical therapy, and yet it is far less utilized than guideline-directed medical therapy. So does anybody out there wanna shout out what it is? Yes! You got it! So we've worked with lots and lots of partners to improve that metric. And furthermore, they had focused so much on the discharge medications and other metrics that lead into your composite scores, but having improved cardiac rehab really did help improve overall composite scores. So you have your new CathPCI quality of care composite and your overall composite scores for performance in CPMI, and that really led to improvement in those scores as well. But I think the point that I wanna get across is that it really did improve the patient's overall sense of well-being, and furthermore, it's also proven to reduce hospital readmissions. So yay for cardiac rehab referrals. So, with that said, Donna, is there anything more you wanna add? No? Okay. So we have a couple questions in the queue, and then we'll move on to leveraging your dashboard for process improvement. One thing that a couple people have asked about is the badge buddy, Lisa, wanting you to share that information. And I don't know, could someone from ACC just do a real brief explanation of our participant resource sharing pages, so people know that Lisa can share that to the participant resource sharing pages? Absolutely, I'd be happy to share that with anybody.
You've just gotta have a department that's willing to make it small, but if I could get this to the ACC, I could even send it to them in the Excel form so that you could send it off to your print shop and they'll make them. But what we did is we got them laminated, and like I said, it just has what's required at discharge for all of our ACS-type patients. And then what I did on the backside of it for the nursing staff, just kind of an FYI, is, if any of you out there are getting accredited, I threw EHAC on it. So you can remember to bring that in and just talk about that, okay. So those of you getting accredited get that really well. For the physicians, I threw on there some stuff to just help with documentation. And what I will do is get this out to the ACC when I get back to Colorado, and I'll make sure that you all have access to copies of it. And just so everybody knows, we do have a participant resource sharing page for exactly this type of thing. So if you're not familiar with that, make sure you go to the website and locate it; there are a lot of resources that your fellow participants have actually submitted, and you can download those and use their work. So make sure you are aware of that. Let's take this moment then to move on. Kristen, you wanna start us off with leveraging your dashboard? Yeah, this is actually one of my favorite parts to speak about. Again, being a former abstractor and now working at the NCDR, I feel like I have a little niche for this. But anyway, we're gonna take it back to process improvement, which I hold near and dear. Whenever you're coming in to your Monday or Tuesday dashboard refresh and you're getting ready to go to that meeting of the week where you're gonna look at, I don't know, your STEMI patients' door-to-balloon times, just for example. And you're coming in and you're like, this is gonna be great. I'm so excited. I'm ready for this. I'm so bright-eyed and bushy-tailed today. And then you look at your dashboard and you're like, ah! I kid, I kid. But what you need to do from there is, first, get your zen, calm down, and take a couple of steps. We're nurses; we troubleshoot for a living. We get the air out of our IV pumps whenever they're beeping, all that stuff, the bells and whistles. So troubleshoot your data. The first thing that you should do is ask yourself: was this a data entry error? I'm not sure if you're like me, but my mind taps faster than my fingers. So I could imagine just going along, being so used to that data collection form and data collection tool, just making a keystroke error; it happens. But if you're working at a large facility that has more than one abstractor entering data for one registry, are you all on the same page? And I think it might have been Shelly, it might have been Donna, that touched on this about the education of the data elements. So is everybody abstracting according to the coding instructions, or are there some areas that you need re-education or more education on? And that's kind of where we come into play, especially with those gray areas. But we also get those questions about, hey, so-and-so said this on my team, would you help us settle this dispute? So we're also like the ringleaders, but the, what's the word I'm looking for? Pregnancy brain, geez. And I'm not pregnant, post-pregnancy brain.
So yeah, just start with those two steps of basic troubleshooting of your data before you get all into a tizzy. Because essentially what you're doing is going into that meeting, and you really need to be prepared. So if it is a data entry error or an education error, you have to be ready to speak to that. But if it's not, then you have to be ready to speak to all the other possible variables that it could have been, right? We're all getting out of, well, some of us are still in, the COVID era and taking all the precautions. But I mean, that affected everybody in the hospitals. And we've heard that from you on many levels, especially around staffing. Maybe you didn't have the staff to do that timely ECG, or maybe there wasn't enough staff to even get the patient to the cath lab in time, et cetera, et cetera, all of those things. But you gotta be able to speak to what's happening in your hospital. Because whenever you go into those meetings, the last thing that they wanna hear is, well, how did you code all these patients, right? They're high level, right? They want you to feed them what they need to know in regards to what could have possibly happened. And then from there, they will take that information and say, okay, well, was this just a one-time problem? Like you have a bad day, you have a bad hair day, whatever. You can have a bad day today. Was that just a one-off, and we'll just monitor this and see what happens next time? Or is it trending, like on that performance trend graph you have for your hospital? Is this a longitudinal problem where now you might wanna start thinking about process improvement, because you really don't wanna see the same outcome happen over and over again every quarter? I feel like I'm rambling on, but am I missing anything? Did I? Yes, you are, Kristen. Oh, thanks. It's not just that the abstractors could have made a data entry error. Did your physicians fill out all the information that was needed for each patient? Were all their comorbidities coded? Did they use the smart phrases that you've worked hours and hours to put in place for them? So that's another area that you really do need to check on. And I know when you go to see each physician and you say, hey, you didn't check this box, they're like, what? No, that's your responsibility. No, it's their responsibility. That's part of capturing all the data in the EHR that they need to capture. But I'm gonna talk a little bit about reimbursement and how you can use your metrics for reimbursement. So there are facilities that contract with third-party payers who will pay more the better your metrics are. So if you can get your metrics up high consistently, they will reimburse you at a higher level. The other thing you can do to bring more money into your facility is to bring more patients into your beds. When you go to sites like Hospital to Home, something like that, I don't remember the exact name of it, but they go in there and they check out your hospital's rating, what registries you're involved in, and they see what your metrics look like. Even if you have a year that's off, they'll be able to see the next year that, yes, that facility is really working hard and they brought their metric results up; they're a four-star now. And that is what's going to bring more patients to your facilities and bring more revenue into your hospitals.
And then just to go back to talking about making corrections to results, I think Kristen Schriek really spoke volumes before: inevitably you do your diligent, best work abstracting the data and submitting it, and you anticipate a certain result. You check the dashboard and it's not the result you were anticipating or expecting. So was it because of a keystroke error? Was it because of a misinterpretation of the data element that you found out about after the fact? Was it missing documentation that's actually now available and supports coding a different way? So whatever it is, is there a way to correct the results? Absolutely, and I think Kristen actually touched on this in one of our earlier sessions. But yes, you're strongly encouraged to correct the results. Absolutely, it should be corrected. You want everything to be reflected according to what actually happened. You know, that's part of the business. So, you know, you can go ahead and submit. Now, if you are expecting the results to then show up on the dashboard with the corrected data, then, you know, because of the nature of the rolling four quarters, it may not make sense to go back further than the current rolling four-quarter time frame, because the aggregation only aggregates the rolling four quarters. So, if you're expecting to see that result on the dashboard, you may just want to focus on those current rolling four quarters when you're making corrections to your data. Well, I do want to add something to that, John. If you are one of the folks that participates in either the LAAO or the TVT registry, those are CMS-mandated registries. So, if you have missed entering a patient in the year prior to the rolling four quarters, you're going to want to get that patient in, because that data will be put into our data warehouse. And in the event that CMS comes to audit, you will have proof that that patient has an NCDR number and that they have been put in the registry per their requirements. So, that is the only time it's beneficial for you to enter data that is not going to be reflected on the dashboard. Okay? Yes, I know. So, accreditation and data. I know, I think Lisa's accredited; Lisa's the one from the accredited facility. Do you find that outcome metrics are helpful in the accreditation process? And how did QI projects aid in that process? They're absolutely valuable when you're doing your accreditation process. Your metrics are going to help guide you in finding a PI project. They're also going to help you find how you're going to fix your processes and make your hospital's cardiac program better. You're not going to pass an ACC accreditation if your metrics are in the toilet. So, you really do need to look at the metrics, especially with chest pain MI. Chest pain MI rolls into your ACC accreditation, so they're kind of like married. So, you want to make sure that things are looking good. You want to check your data. You want to question your data. That's where you're going to find a lot of your PI projects as well. You're going to look at things and you're going to be like, well, why can't my doctors document a risk score? I mean, come on, how easy is it? All you got to do is put a TIMI in the chart. So, this is where you're going to find these things, and this is where you're going to start working on that PI project and where you're going to put the pieces together.
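As a brief illustrative aside on the rolling four quarters John and Denise describe above: whether a corrected record shows up in the dashboard aggregation comes down to whether its quarter falls inside the current four-quarter window. Here is a minimal sketch, assuming a simple (year, quarter) representation; the function names and window logic are assumptions for illustration, not the NCDR's actual implementation.

```python
# Hypothetical sketch of a rolling four-quarter window check: only records
# whose discharge quarter falls in the four most recently closed quarters
# are reflected in the dashboard aggregation.

def quarter_index(year: int, quarter: int) -> int:
    """Map a (year, quarter) pair to a single sequential index."""
    return year * 4 + (quarter - 1)

def in_rolling_window(record_year: int, record_quarter: int,
                      current_year: int, current_quarter: int,
                      window: int = 4) -> bool:
    """True if the record's quarter is one of the `window` most recently
    closed quarters ending at (current_year, current_quarter)."""
    rec = quarter_index(record_year, record_quarter)
    cur = quarter_index(current_year, current_quarter)
    return cur - window < rec <= cur

# Example: with Q2 2022 as the latest closed quarter, a correction to a
# Q3 2021 record still falls in the window, but a Q1 2021 record does not
# (though, per Denise, it still belongs in the warehouse for CMS-mandated
# registries such as LAAO and TVT).
print(in_rolling_window(2021, 3, 2022, 2))  # True
print(in_rolling_window(2021, 1, 2022, 2))  # False
```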
You kind of need that data, because that data is going to help guide it. Without it, you're just answering questions; with it, you're looking at the whole picture, the whole system, your whole cardiac process as well as your whole program. Lisa, I really wasn't involved in the accreditation process with you. Can you tell us what types of information weren't available through the registries and how you supplemented for that? I'm going to use our STEMI program as kind of an example. Everybody who's in this room knows how important the STEMI program is at your hospital, and you also know that real-time data is very important for that program. So what does that mean? That means that none of us, and I know I'm speaking openly here probably, but nobody waits the 30 days until something's in the NCDR. You're not going to wait that long to find out if you fell out somewhere. We all have our dashboards. We all have our stuff. So that's the one area with the NCDR that kind of makes you go back and think, because we make up our own stuff, our own dashboards, so that we can get that real-time data. For us, looking at the real-time data became very important because we were looking at our cancellation rate. We noticed a slight discrepancy in our cancellation rate via an outside agency app that we use, Pulsara, versus what the NCDR was saying for us. Well, come to find out, NCDR data is 30 days later and it's as the patient is coded. If you're an abstractor and you're not coding every patient that was a STEMI call and you're only coding the patients that came in as a STEMI, you're going to have a great STEMI outcome. You're going to be 100%, man. No cancellations. But if you're looking at everybody in real time, a lot of these patients get canceled, and when we looked at our Pulsara data, our numbers weren't that good, because we're looking at every patient that was called as a STEMI alert, including the patient who may have had a seizure or just may have had a cough or all that. So those all played in. So that was the one area that we ended up, you know, having to look at. What did we do for that? Well, now our STEMI abstractor for both registries, especially for our ACC registry, gets a list of every single cardiac alert that's called, monthly. So she's adding them into that ACC site. And what's happening is we're starting to see the correlation. And, you know, the good news is our numbers are going down. So, you know, all our projects are working. But that's one of the ways that we've been able to, you know, work with the metrics that are in the ACC versus the NCDR and with the real-time metrics. David, do you have anything to add? I do not have anything to add, but I do have some questions. So it looks like people want to know, actually, I think this is for Shelly: with the sister hospitals that have merged abstractor groups, how long did it take to get all abstractors on the quote-unquote same page at 97 percent matching? So I guess I'd have to answer that in that it's per partner that we do the IRR process. This is what we're talking about, the IRR process metric. They want to know how long it took for the team to get to that 97% metric? I believe that's the question, yes. Yeah. Okay. You know, I guess it's a little bit difficult to answer, only because each team is different.
So I would say on average, it's about a one- to two-week process, because we expect our abstractors to do those cases pretty fast. Right. So we're not looking for them to do one case a week. We're looking for them to do five cases and send them to their clinical data lead. And that clinical data lead has a 24-hour turnaround time and sends them back. So 24 to 48 hours; we have a little bit of leniency there, but they're expected to do them within a 24-to-48-hour period. And then by the time the data lead gets them back, it's about a week, sometimes two if it turns out to be a partner that might have two different EMRs. We have some partners that use two and three different EMRs. So, you know, just navigating those EMRs can take a team a little bit of time to build up to the point where they know, okay, for this particular data element, I need to go into this EMR, and for this data element, I need to go to this EMR. So it's just getting that sort of process honed in. And we really do rely a lot on our data leads to teach the teams how to hone in on that. So I hope I've answered that question. I think you did. Absolutely. And they said, do the abstractors have access to all the medical records? There are times when all the records may be scanned into the media tab or other hospital records are available in ePCI. Do abstractors look at all of these? All of them. And then I think this is for Lisa: door-to-EKG times for, quote, unquote, all chest pain patients, what are you using to create that report? What are we using to create that report, Donna? We use a little bit of everything, and we do some of it manually as well. Truthfully, we use what's in the NCDR. And then we also do a manual review on all our STEMI patients and patients that come in. I actually even pull a report out of Epic on all patients that enter the ED and have a door-to-EKG time. And then we filter through that looking for ACS symptoms, and we kind of look at that. And then I send it off to our EKG partners, who go through some of these patients where our outliers are. Fantastic. Selena, your AKI prevention results were impressive, 5.72% to 1.9%. Well done. Thank you. Yeah. Did I hear that you accomplished that in six weeks? And have you considered writing a paper? No. Finally, one more comment: has University of Maryland fully adopted these practices? And are there any barriers or pushback from the physicians or CCL staff? No. I must say I'm really fortunate working with our team. They're very patient-oriented and focused on better outcomes for them. So our nurse practitioners work very closely with our interventional cardiologists and the hospitalists to make sure that, you know, when the patients are moved through the facility, they are monitored to make sure the IV hydration is given even post-procedure. So I've never had pushback from our physicians. As I said, I'm very fortunate to work with interventional cardiologists that are always looking to see how we can do better with our patient outcomes. And they're more than willing to, you know, go with any process change or come up with any new ideas or strategies that can help improve our patient outcomes. Very good. Thank you. Donna, you said earlier that you had a question that somebody really wanted to talk to you about. Do you want to repeat that question and answer it? I think someone had asked how I went in and found that missing data in the DQR. And so really, you just click on the hyperlink that says Pass.
And when you're in there, there's another link where it says Failure. If you click on that, you can change it to Missing, and then it will bring up all your missing data in there. And then you can go and figure out what's actually missing. Of course, you expect some of that stuff to be missing. Like, I don't know how many of you actually get an AMR, or rather an EMS, trip sheet on a patient who was a STEMI patient at the first facility and then gets transferred to me for the primary PCI. I'm always going to have those trip sheet areas blank, because I rarely, if ever, like never, get that information. So you expect that. But then you're going to find all of those other missing data elements that you fat-fingered through because you were going too quickly. Denise, being in the registries for as many years as you have, can you give one piece of wisdom that you would pass on to our participant community, in terms of what you find the most valuable thing that you've learned in interacting with them? I think one of the most valuable things that I've learned in interacting with all of you is the value of really reading all of our materials, the data dictionary, the definitions, and your suggestion to incorporate the FAQs, which we're doing, which will make our data dictionary more robust. I mean, we've already incorporated a lot of FAQs into the updated EPDI registry data dictionary, and we have had a lot of positive feedback about that. And just over the years, how we've increased our educational offerings, and the feedback that we've gotten from all of you about how useful it is to house all the case scenarios, have them recorded, and then have educational sessions on guidelines and interpreting those. And so if you're new, I would encourage you to just look into our resources section, select education, and just see what we have to offer. And then we're always happy to help you. I always enjoy talking to you. Thanks, Denise. For the participants, when you have a new employee or somebody that's transferred into your department, how do you remedy the barriers that you face in regards to the different ways people learn, whether it's by video, whether it's by reading the data dictionary, whether it's by attending the RSM calls? Have you found barriers that you've been able to overcome in terms of helping to train your staff? Any one of you. Well, I think in my department, wow, our turnover rate for abstractors is so low, we haven't had to onboard somebody in quite some time, except for recently. And because it's one person and I'm in a smaller facility, it's really about teamwork. It's working one-on-one with that person and giving them the resources that they need, doing practice sessions with them. It's like see one, do one, until they can do it on their own. We're actually in the process of that; literally, our data abstractor started on Monday. I want to just echo the fact that in our hospital, all of our data abstractors have been in my department for probably a good ten years on average. I don't know of anybody that we've had to replace recently, so it's hard to answer that question. I know I would sit with them and go over each data definition. And while you're sitting with them, it cues you into how they are actually learning. And a lot of them learn in multiple different ways, not just by reading. Yeah, it's really important. I just want to point out to everybody, too, to make sure that even if you haven't trained somebody in a long time, we do have resources that you may not even know to go to.
With our Start Here link, there are steps, some of which we've displayed in our walk-in slides, for a new person to get started. And even if you're not a new person, if you didn't get training when you started the role, it's beneficial to go into the materials that we have. And, of course, what we really want everyone to do is get used to going to our QII Learning Center, where we're trying to explore and create more opportunities for people to learn and get CE. It's not so new anymore, but we are definitely trying to push everyone there to help facilitate training and orientation, that type of thing, for everyone. To your point, David, that QII Learning Center is great. And we sort of banked off of that by developing the QC Institute, and we have dedicated trainers there that train our abstractors. And then they go through a process utilizing MindFlash modules. So we do many more case scenarios, and they'll read the case scenarios and then have to answer questions and that sort of thing. And then if we find that they're having difficulty, we work with them a little bit more on that. Of course, not everybody comes to us experienced, so some require a little more attention than others. But the QII Learning Center is great. And I want to add to what Shelly said. The NCDR monthly meetings are great information. And every time I attend them, I take those case studies and try to present them in our collaborative monthly meetings, just so the physicians also understand what the definition is for each data element. So when we go back to them, they incorporate those notes in their cath reports, which is really very, very useful for us. And I just want to commend the group that's up here now, because what I've heard you say is that you're doing data abstraction in real time, that you're submitting your data often, and that you are then reviewing that data on Monday or Tuesday and making sure that you've done everything you can for each one of those patients, so that as many patients as possible are meeting the numerator criteria. And it just shows that the great work you're doing at your facilities is showing up in your metrics. And, you know, I would encourage all of you to do the same thing as much as you can. Thank you, Denise. So I'd like to take this opportunity to thank Selena, Shelly, Donna, Lisa, and Kay. They've dedicated time out of their busy schedules to come and help prepare for these sessions, working behind the scenes in addition to doing their daily roles. They have volunteered their time to come and talk, as one of you, to you, in the hopes that you can learn from each other. I'm really grateful that you guys participated. You did a phenomenal job today. I'd also like to thank Denise and John and Kristen and Miriam, as well as the panel here, Fernando, Karen, and Yin. They put a lot of time into actually trying to create something meaningful for you today, and I also appreciate you guys coming and attending. This has been a great session, in my opinion. I appreciate all of your attendance here. We're at our time at 2:15 now. So with that, if you don't have any more questions that we could answer quickly, we will be releasing you.
Video Summary
In this video, the panel of experts discuss various topics related to future enhancements, quality improvement processes, data utilization, and process improvement. They talk about incorporating frequently asked questions into data dictionaries, enhancements to the Contact Us feature for communication, using QI projects to improve measures and metrics, and leveraging dashboards for process improvement. They also discuss the importance of reliable data, the value of outcome metrics in the accreditation process, and the challenges and strategies for training new staff members. The panelists provide insights, examples, and recommendations based on their experiences and expertise.

No credits were granted in the video.
Keywords
future enhancements
quality improvement processes
data utilization
process improvement
frequently asked questions
contact feature
QI projects
dashboards
reliable data