Conversation with the Experts - Session #1
Video Transcription
Good morning to all of you in the audience today. I am so thrilled to actually be back here with each and every one of you in the audience. Today is a day of firsts: the first time we've had the summit in Los Angeles, which is phenomenal, but also the first time we've ever had a hybrid conference. We are so thrilled to see all the faces in the audience today. We'd like to extend a warm welcome to each of you who made your way to L.A. to be with us, but also a warm welcome to all of you who are virtually with us today. So we just wanted to give you all a warm welcome. I just wanted to make a very quick announcement about the Wi-Fi and the mobile app that we'll be using today. The Wi-Fi is under the Quality Summit, and the passcode for that is quality22. So if you have a chance, and if you've not already done it, please do that at this time. So you are in Conversations with the Experts, and this is session one, and there's not going to be a break between session one and two. We're going to just piggyback right into session two, and it'll be the full hour. The intent today is to have conversations with our experts, and we will take questions as they come up in our queues. We have a panel on the front row who are actually fielding your questions, regardless of whether they're virtual or in person. So welcome. Our topics that we're going to discuss in this very first session are timeliness of NCDR communication and expectations, information that we've harvested from your feedback from our biennial survey, and engagement from abstractors to department heads. As we move to the second half of the session, we're going to talk about some dashboard and metric discussions, and the importance of frequent adjudication and cleaning of data. At this time, I'd like to turn to the expert panel and do some introductions, and then I'll turn it over to them. First off, I'm David Bonner. I have been with the college for 16 years now, and it's just the best place ever, and I enjoy every minute I get to spend with people like you. So right here to my immediate right, from the ACC, we have Denise Pond. Next to Denise, from UHC Memorial in Colorado, Donna Foster. Also behind Donna, from UHC Memorial in Colorado, we have Lisa Myers. And then back down to the front, we have, from ACC, Kristen Young. And behind Kristen is Miriam Nunez from ACC. And then back to the front row, we have Shelly Peiffer, and she is representing Q-Centrics. Right behind Shelly, we have, from El Camino Health, Kay Jaramillo. And then down in front, we have, from ACC, John Gerrout. And from the University of Maryland in Baltimore, we have Selena Yedla. So I'd like to, number one, just thank this expert panel. They've just been amazing to work with. I know they have a lot of great information to share with you. At this time, I'm going to turn it over to John, who will start us off with conversations about some of the feedback that we've gotten. So enjoy your conference. Thank you, David, and thank you to all of you for joining us both virtually and in person. This is wonderful. So I'd like to begin this first session talking about timeliness of NCDR communication and expectations. I need to begin talking about this in the context of that participant survey that David mentioned. So you may know about this, you may have partaken or participated in it before, but every two years NCDR conducts a participant survey where we reach out and we ask you to share your overall experience with NCDR.
It's a great opportunity for you to share your thoughts on a number of different questions that you get asked. For us, the information is extremely valuable. We take it very seriously. It helps us to identify what's working, what we should continue to keep doing, and then also areas where we can improve. So some of the topics that we extrapolated from the survey in this most recent cycle had to do with timeliness, timeliness with respect to the NCDR avenues for communication, the process for answering questions, when to expect quarterly benchmarks for your published data, as well as timeliness of new version education, communication of IT delays, and just generally, you know, contacting NCDR. So I'm pleased to say that the college is addressing some of these through innovation. NCDR has created a number of cross-departmental teams for the purpose of generating solutions, innovative solutions to tackle challenges, improve our overall offerings, and ultimately add value. My team has been tasked with looking at creative ways of improving timeliness, and so looking at the feedback and input that we got from the survey and, you know, also some direct customer feedback in different ways, we've identified some pain points and some problems that we really believe are worth solving. So through that, we've generated a number of ideas, and we're currently evaluating those for feasibility. And then in addition to the innovation project, we have some process upgrades and some enhancements that have recently been implemented and some that are in the works to be implemented soon that we'll touch on later in the session, so please stay tuned for that. So back to communication. What are the NCDR avenues of communication, and what can you do on your part to ensure that you're keeping up to date on all the important events and developments in any given registry that you're working in? Well, first and foremost, the Announcements page is our primary means of communication. Basically, anytime there's any sort of important development that we want you to know about, we're going to share it on the Announcements page, so we really strongly encourage you to visit that at least once a week to make sure that you are up on all the latest. In addition to that, there are the bi-monthly or quarterly registry calls. They are primarily education-focused; however, there is time that is utilized in each of those calls to go ahead and discuss updates and registry-specific announcements. So you don't want to miss those, but if you do, there are meeting minutes so you can catch up on any information that was shared. Lastly, we have a News and Views publication that we would encourage you to elect to receive. It offers a more high-level view of what's going on in NCDR, registry-specific information, and it kind of puts into context all of the different developments, the day-to-day and then the high-level, so I think that's really good. So I talked about new version education, or touched upon it as a topic. One development that we'd like to highlight is some recent improvements that we've made with new version education. We had heard through that survey, from participants, from you, that you're feeling overwhelmed at times when new versions are launched. It's just a lot of new information, a lot of change, and it all happens at once on that launch day. So we've made a concerted effort to provide education prior to the implementation of a version upgrade.
We want to share with you as much knowledge as we have and can before that pivotal date when the version is actually launched, and we've had success with that with two recent major launches, the TVT and EPDI registries, so we're going to continue that process and that model moving forward. We do ask you to just please understand that we don't always have the answers to everything right from the get-go. So there are questions that will come up that we kind of have to resource internally before we can educate on them, and that's just sort of the nature of the registry business. Yeah. John, that has been not only helpful to the participants, but it's been really helpful to us as well to get ahead on the education before a product is launched. But I want to talk a little bit about communication of IT delays. We know that there have been issues where the resolution of IT delays has caused some concern amongst the participants, and we want to let you know that we're on top of it. We're actually working on a process to develop a means to deal with those delays. Currently we are trying to contact participants at the affected facilities to let them know how their IT issue is progressing, and we'll either send you an email or give you a phone call, and that's something new we've recently started doing. But in addition to that, every Monday morning there are several teams within NCDR that have actually been evaluating the dashboard aggregation for each and every registry, so that when there is a problem, we know about it before you do, which is key. We hate when you guys call us and say, what is wrong with that dashboard today? So we want to be on top of that. So we are working to resolve those issues by Tuesday, and if by Tuesday those issues aren't resolved, we are going to provide an announcement on the registry that is affected. So we want you to know that we definitely have heard you. Just something to kind of remember, just a little insight into IT issues: they're complex. You know the kind of metrics that the ACC and NCDR put out. They go way above and beyond what most organizations provide. We provide risk-adjusted metrics, and those are the kind of metrics that are very complex, and when there's an issue, it takes a long time to resolve those issues. Once we determine the root cause, then we have to actually do more testing, and that's by the same groups of people, because we want to ensure when we provide that back to you that we're providing you a metric that works as expected. Now regarding feedback that we received on how long it takes for the quarterly benchmark reports to be provided, I actually started working at the ACC 11 years ago, and it used to take up to eight weeks for you guys to get your reports, and I have to say I'm so pleased that the reports are being offered now in about four weeks, and if you actually check your dashboard at about the three-week mark, you may notice that the published flag has been set prior to us notifying you that the report is ready. This is in part due to an IT reorganization to streamline their workflow, as well as providing those detail lines to you in the dashboard and moving away from those PDF reports that we used to have. Yeah, Denise, those are all good points, and that's all about communication with you guys, but I'm going to talk about the communication or feedback we got from you about communicating with us. So as you all know, there's a wide range of avenues to get in touch with the NCDR.
You can either ask the FAQ question, email us at ncdr@acc.org, or use the 1-800 number. You could also send us a letter via mail, or maybe you want to fax it, and I'll say that joke went over way better than it did when I practiced it yesterday, thank you, but we won't get those last two. If you're sending it by mail or fax, you need to stop doing that. But we are excited, because you kind of get that feeling that whenever you send a question through these different avenues, are we really getting that question? The answer is yes, but moving forward, if you haven't heard this already, we're going to streamline all of that into the one Contact Us feature, and I'm not going to talk too much about the specifics of that, because Miriam here is going to talk about the details of some of the nuances that are coming to create this streamlined communication in session three, so be sure to come back and join us for that. The other thing I just want to touch on is the NCDR process for answering questions itself. Whenever you do send that question to us, we triage it. The majority of us here are probably nurses, and we know that triage process; we do the same thing with questions, because you sitting in this audience range from the one person who was just thrown into this role all the way to the person who's trying to figure out the statistical equations of a risk-standardized metric. So just know that the questions are answered first come, first served. They start with the BSO team, then they come to the clinical quality advisor team, and some of them can be as simple as a yes or no, code this, and some of them are a little bit more complex and require more of a data dig, especially whenever it comes to our metrics. I just want you to know that you have been heard, and I'm going to ask the participants, does that all sound good? Kay, does this all sound good? Yes, of course, it sounds lovely. Communication is key to staying engaged with NCDR, and especially right now, with the COVID era and everything, I do want to ask maybe Shell or Donna, with so many of our abstractors working from home, how do you keep abstractors engaged and present? Me? Go ahead. Okay. Well, I've worked for Q-Centrics since 2014, and so being that we're 100% remote, I guess you could say I've been involved with the COVID era since 2014, but there is a greater connection when folks are face-to-face, and so way back then, you know, I realized we needed a way to figure out how we can best pull folks together so that we can share information, and so we developed a SharePoint to share information from NCDR, meaning the FAQs, their general announcements, and some of the case scenarios from the Q2 Learning Center, which are awesome. If you haven't been on that, please do. It's great. But then as time went on and technology grew, we also created a resource training center, so for every registry that NCDR offers, our folks have a site that they can go to specific to that registry, and on its homepage are all of the NCDR announcements, and then we have folders that offer education tips, so we'll take a complex PCI data element or a complex CPMI data element, we'll break it down, and we'll offer some education on how best to figure out how to abstract that particular data element, and then we'll offer some case scenarios for that, just to test your knowledge, and then we also have a folder on there for all of the specification manuals.
We house our monthly newsletter, which delves into every registry, and one of the interesting things we started to do a few years ago was to use an algorithm to figure out the top three missed data elements in our company for each registry, and so we take those top three missed elements, and we talk about them in our newsletter and offer further education on them. We also put out alerts and notifications via email, and then we house those on the resource training center. We utilize a lot of video conferencing in our field. The company, of course, being 100% remote, found that Teams works very well, so we mostly utilize Teams and some Zoom, but then most recently, I think one of the best accomplishments that the company has gone through is we've created an actual training institute called the QC Institute, so every new data abstractor coming through Q-Centrics is a part of this training institute, and we have dedicated trainers, and they don't leave that institute until they have proven that they are definitely able to abstract a case, and then they move on to a team. So, Donna, if I remember correctly, you're in Colorado, so how do you manage that? Thank you. Yeah, I really like that training process. I wish we had that. That's great. I want to go back to actually prior to COVID, because I did work in the hospital, and prior to COVID, we had what was called the Rocky Mountain Regional Meeting. All the cardiac registry data abstractors would get together up in Denver, and we would talk about the registries, the NCDR, the STS. We'd talk about the conferences, the ACC Quality Summit and the AQO, and we'd talk about all the pertinent information that was given out during those sessions, especially in regards to new versions that came out. And so that was really great, because not all of us could actually attend those conferences, as you very well know, and it was great to have that information. And we talked about FAQs that probably didn't get published, but were asked, and we wanted to share them with everybody so that they could benefit from them. We talked about case scenarios, and it was just a lot of great networking going on. And then in 2019, our hospital had to kind of, not downsize, but rather make room, is what I'm trying to say, for more people. And so they realized that the data abstractors didn't need to be there, and so we got moved home. And probably six to seven months later is when COVID hit, and it just... I was kind of lost, because all that communication that we had was now gone. So I learned to become involved in the data registries by looking at the webpage, looking at their home sites, seeing what the information was that was given out. I tried to touch on that weekly. I would bring up the Learning Center and fill in some gaps from that. I tried to look at the FAQs when I saw an email come across and see that, you know, there were new FAQs out there. I wanted to become familiar with them. And then I took on the role of the Rocky Mountain Regional Leader and I thought, you know, I want to get us all back together. So I created a newsletter and put all that information out there on that newsletter that we normally would have gotten during that conference or that meeting that we attended every year. And so those were some of the things that I came up with. And then, you know, I kind of lost contact with Lisa for a little bit as she got moved to the bedside and had to fill in during the COVID time.
But then we started talking again and getting Teams meetings going, and I got involved in the chest pain meetings that Lisa would hold. And it was nice to hear from her, you know, key points and how we were doing well and where we needed to improve, and to hear from the doctors and, you know, their frustrations. And it helped me to understand what they needed more from me. And in the conversations that Lisa and I would have, you know, often she would ask me, Donna, do you need more coffee? Because I think you need to relook at that patient. So having a good relationship with your co-workers is always a great idea. So Selena, Kay, can I ask you, how do you engage department heads and administrators to report the NCDR outcomes? Yeah, so as you all know, a collaborative promotes and shares optimal processes of care and outcomes and implements quality improvement initiatives. So based on our data, outcomes, and evidence-based practices and guidelines, and recognizing the importance of quality improvement, our collaborative team focused on encouraging our teams to examine our data and benchmarking reports and participate in quality improvement. As we know, robust data collection and regular monitoring help with earlier detection of any unintended consequences. So probably you all already know that better data actually enables our practitioners and staff to be more productive and increases the effectiveness of policies and changes in protocols. So one of the main responsibilities of the registry manager is being able to understand this data, and specifically the data elements that impact our outcomes, and equally the ability to effectively convey it to our stakeholders. So our collaborative team is a cardiac interventional collaborative committee. It's an open forum and we meet in person, but during COVID time we were meeting via WebEx. Now we're back again in person as well as virtually. We meet on a monthly basis to exchange our ideas, our resources, and strategies. And we all know that delivering such an encompassing service requires a diversity of skills. So for this reason we advocated for a collaborative team bringing in multidisciplinary teams: our interventional cardiologists, our cardiology team, our nurse practitioners from both the ED and the cath lab, the bed flow team, pharmacy, and, since most of our patients access the health care system via EMS, we made sure that our EMS teams were part of our collaborative team. And as I always say, there's no substitute for being together in person in one room working on a specific problem and breaking down the barriers to the process to deliver the best care by putting every piece together, thus eliminating any communication gaps. During our meetings we also bring in any new business, any interesting cath procedures, interesting cases just to, you know, educate our EMS team, our progressive care teams, and our ED teams, and any system-level updates. We have very interesting conversations every month, and I'm really glad we have them. Okay, so what's your process like? Yes, really, it's a very similar process in terms of engaging our department heads. At the hospital where I work, El Camino in Mountain View, which is just about an hour south of San Francisco, we have a Heart and Vascular Institute, and that's where I work out of.
That's really the department where all the NCDR registry activity takes place. We have three data abstractors, and we also have nurses and a medical director who partner with those abstractors. We're a completely separate department from the quality department; however, we do report out the outcomes at our quality council meetings, which meet every month, and what happens there is our chief quality officer then takes the outcomes data up to the board of directors. So in that sense, that's how we use the structure that's in place in my organization to report our outcomes up the ladder and keep them engaged. I am also in a Magnet-designated facility, so it's super important for the nurses to be able to speak to their evidence-based practice, and I take part in that by sharing the data with the unit-based partnership council leaders and the department heads of the emergency department and the cath lab. It's really great because at that point it gives the nurses and the entire team a chance to voice their perspective about how to impact the metric from, you know, boots on the ground, like what we can realistically change in the workflow to make things better for the patient. So yes, that's kind of how we do it at my facility; it's a smaller community hospital. But you know, I think I want to ask Lisa, because you have been accredited as a chest pain center, or as a chest pain MI facility. I think there are a lot of differences between the cath PCI registry and the chest pain MI registry. How do you approach that with, you know, senior leadership? Thanks, Kay. One of the big things with that is it can be frustrating, I'll just tell you. You have to have your definitions right when you go to senior leadership. Think of senior leadership as people who, and this is going to come out bad no matter how I say it, just don't have that understanding of the cath PCI or the NCDR. Half the time they don't even know what it is, and they don't even know what you do. So when you go and approach them, you have to be very specific with your definition. You have to let them know that the cath PCI is a procedural type of registry. We're looking at the procedure that's happening with the patient in the cath lab. What are we doing in the cath lab? Are we doing the right procedure? Do we have the AUC, all that, done? Well then, when you start talking to them about the chest pain MI, we all know that the chest pain MI is based off a process. Now we have to start looking at the process. So this is when your leaders are going to look at you going, well, how does that affect things? They say, why is the, you know, door-to-balloon time this over here, but it's that over there? You have to explain that one is a door-to-balloon time and the other is a door-to-PCI time. We're looking at the door to PCI; we're looking at that process, where the other is looking at the whole entire procedure, pardon me. Well, one is looking at the first medical contact: when did EMS, or somebody who has a medical background, actually look at this patient? And that's going to give you a different time frame. Once you start explaining this and breaking it down into very simplistic measures and very easy-to-understand statements, you're going to have the leadership on your side. They just need to understand what you are measuring and how you're measuring it, and what processes you're putting into place and what procedures you're actually doing. Now, your procedures are going to lead to your processes.
Your processes are going to lead to your procedures. So they actually do intertwine. So you have to look at all that. So that's my advice for when you go to your leadership and try to talk to them about that. So, David. OK. Thank you so much. We do have a question that can be answered pretty quickly. I think it's important for everybody to hear, and it's in regard to the opening slide show. I think they're referring to the walk-in slides. Did it say the time to submit your data for the weekly aggregation was Fridays at noon, or is it midnight at 12 a.m.? It's Friday at 11:59:59 p.m. That was changed recently, and that's a very good question. So if your data is not in by 11:59:59 p.m. on Fridays, you won't have it ready for you on Monday. Thank you guys very much, and I think we will just transition into some information about dashboards, if this is a good time for you. Yes. Wonderful, David. I wanted to start by taking a poll of the audience. How many of you know that the detail lines previously provided in the PDF format are now available on the dashboard for you weekly? OK. There are quite a few that don't know, and so I'm glad I asked that question. For those of you that don't know, I want to encourage you to call us when we get back from Quality Summit, because we love to talk to you and we would love to help you find the detail lines in the dashboard. The detail lines are aggregated weekly, and they're not just a summary of volumes and patient factors; some of them have metrics and patient drill-downs associated with them. For instance, if you are in the chest pain MI registry, there is a group of accreditation metrics in the detail lines, and there are also ones provided specifically for state requirements. You don't have to be in that state, and you don't have to be a member of accreditation, in order to utilize those. Those are there for everybody, and they provide information such as door-to-ECG time, which is very important for the hospitals to know. Some participants also provide their physicians and their administrators with reports. You can go into the detail lines and create your own specialized reports by exporting them to Excel, sorting them, and filtering them, and they're very helpful for tracking information that isn't provided in the executive summary metrics. Now, speaking about weekly aggregation as it relates to quarterly aggregation, those are two separate things. Your weekly aggregation occurs when you've selected an unpublished quarter, and it's looking at a rolling four quarters, and as we mentioned just a few minutes ago, if your data is submitted by Friday at 11:59:59 p.m., then your data will be available for you on Monday, unless there is some snafu with the dashboard, and then we report that out to you by Tuesday. There are certain metrics that aren't available to you when you have a yellow submission status, and that would be your AUC metrics, but all the rest of the metrics and your detail line information are available to you as long as you have a yellow or green submission.
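The detail-line workflow described above, export to Excel, then sort and filter into a specialized report, can be sketched in a few lines of pandas. This is only an illustration of the export, sort, and filter idea, not an NCDR tool; the file name and column names below are placeholders, since the actual detail-line layout differs by registry.

```python
# Illustrative only: build a custom report from an exported detail-line file.
# "detail_lines.xlsx" and the column names are placeholders; the real export
# columns depend on the registry and dashboard version.
import pandas as pd

df = pd.read_excel("detail_lines.xlsx")

# Example: keep only the STEMI population, sort by door-to-ECG time, and save
# a physician-friendly spreadsheet.
report = (
    df[df["Patient Population"] == "STEMI"]
    .sort_values("Door to ECG (min)", ascending=False)
    .loc[:, ["Arrival Date", "Door to ECG (min)", "Door to PCI (min)"]]
)
report.to_excel("door_to_ecg_report.xlsx", index=False)
```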
Now let's compare that with quarterly aggregation. Quarterly aggregation occurs only when you've selected a published quarter, and that's because these metrics, such as your risk-adjusted or risk-standardized metrics, or metrics such as composite metric 40, which is in the EPDI registry, have a lot of very detailed information in them, very complex algorithms, and they require a lot of data in order to provide you with very accurate information. So those are only aggregated on a quarterly basis and only available to you when you select a published quarter. I just want to take it back to the detail lines real quick, because I just want to really celebrate them, since I didn't see too many hands raised about them. If you're a seasoned NCDR abstractor, data analyst, whatever, you used to have to wait for those detail lines to come out in that published PDF, which, Denise said earlier, used to take up to, what, eight weeks, I think you said, but now those are even aggregating on a weekly basis. So I just wanted to point that out. It's a really useful tool, especially if your hospital is interested in, you know, your insurance payers, your demographics, your number of procedures, and things like that, because that's where you go for that level of detail outside of your metrics. Expanding on the unpublished quarters for weekly aggregation, just because I'm a visual learner, and maybe you are too, I just wanted to talk about what exactly that means. So just as an example, let's say we were in the 2022 quarter four ending time frame on the dashboard. The unpublished quarters for that rolling four-quarter time frame are going to be 2022 quarter four, quarter three, quarter two, back to 2022 quarter one. So anything that you submitted by that Friday at 11:59:59 p.m., did I say one too many 59s? You're going to see that rolling four quarters on Monday for the unpublished quarter. Now, we do say submit often. You can resubmit older quarters, and yes, you can resubmit quarters outside of that rolling four-quarter time frame, but I just want to make the point that you're not going to see that reflected in the rolling four-quarter report. So if it was back to, like, 2019, yeah, submit it, because guess what, we're gonna keep it and it's gonna be there. You're just not going to see anything updated on that 2022 quarter four ending time frame dashboard. And just a reminder: sitting in these metrics and detail lines is a whole slew of information at your fingertips, and you really have the ability to take this and make something of it, because these are essentially real-life scenarios that are happening in your institution, and you're comparing that to metrics. Now, where do these metrics come from? We do not pull them out of our hat, I promise. They come from science, they come from clinical practice guidelines, I mean, they come from the AUC committees that do the writing. So this is vetted, science- and evidence-based information that we turn into metrics with the data that you collect. So you can use this as a means of measurement. At the NCDR we say, you know, the 50th percentile can either make or break a process improvement project for you, but we all know that our cardiologists can be very, very competitive, and they want to see you at a hundred percent.
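The rolling four-quarter window walked through above (a 2022 quarter four ending time frame covering 2022 Q1 through Q4) is simple calendar arithmetic. A minimal sketch, purely for illustration:

```python
# Purely illustrative: list the rolling four quarters for a given ending
# quarter, e.g. the unpublished 2022 Q4 dashboard time frame described above.
def rolling_four_quarters(year: int, quarter: int) -> list[str]:
    window = []
    for _ in range(4):
        window.append(f"{year} Q{quarter}")
        quarter -= 1
        if quarter == 0:          # step back across a year boundary
            quarter, year = 4, year - 1
    return window

print(rolling_four_quarters(2022, 4))
# ['2022 Q4', '2022 Q3', '2022 Q2', '2022 Q1']
```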
So while we say the 50th percentile, you know what, it's whatever you make of it, and if they want to be better than the 90th, you know what, strive for that. That's a great thing to strive for. But again, it's a means of measurement against the benchmark, a means of deciding if you need to start thinking about process improvement at your facility. There are a lot of things that you could do with these metrics and even the detail lines. Can you hear me? Okay. And so, you know, what is the most direct path to understanding the metrics, and why is it important to use the executive summary measures and metrics companion guide? So simply put, you have to understand what a given metric is measuring. If you're like me, you probably want to feel confident when you're going into a room and you're going to be speaking to internal staff and stakeholders; you want to feel really good and confident that you know exactly what you're talking about. Each and every metric that a registry reports out on the eReports dashboard is included in the executive summary measures and metrics companion guide. So, you know, it's outlined with a description. That's the what: what is being measured. Then you have numerator and denominator criteria, the denominator being the population of patients or procedures that is being evaluated in this metric. The numerator can be something that you want patients to be in. Take the example of medications prescribed at discharge, like guideline medications: you want to have your patients all in the numerator for that metric. Or it could be something that you don't want, say some sort of adverse event or something like that. So the numerator and denominator criteria will kind of give you that idea of what we're really looking at. And then you have denominator exclusions and exceptions. So, you know, within the patient population that's described in the denominator, maybe not every single procedure or patient is appropriate to be included, and there's, you know, rationale behind removing that patient or procedure from the denominator to get an overall better, more precise metric result. So lastly, I just want to point out the clinical rationale and the relevant citations. I think this is often overlooked. That answers the why: why is NCDR, why is the registry, interested in reporting out this metric? Typically, there are citations there that point to some guideline recommendations and a host of other clinical rationale that can really give you the insight into the why. Selena, your thoughts? That's very interesting. It's a lot of information for us to get together. So Donna, how often do you submit your data, and what are the advantages of doing so? Yeah, yes. Personally, I like to do it weekly if I can. And if you can imagine, our hospital sees, last year, I believe, when I looked online the other day, close to 115,000 patients coming through our ER every year. And we submit not only for the STEMI and the non-STEMI patients, but we also submit for the low-risk and the unstable angina patients. And that's a plethora of information that we bring into that registry. Thinking about submitting all of that only after 90 days is an insurmountable task for me to even think about doing and having it with any quality. I do it weekly, and I learned this long ago when I first started data abstraction. It was in 2007. And I did it part-time and I did my clinical work part-time.
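The companion-guide breakdown above (description, denominator population, denominator exclusions and exceptions, numerator criteria) boils down to simple rate arithmetic. A minimal sketch, assuming made-up case fields, since the real criteria for any given metric live in the executive summary measures and metrics companion guide:

```python
# Illustrative metric arithmetic only; real numerator/denominator criteria come
# from the executive summary measures and metrics companion guide.
def metric_rate(cases: list[dict]) -> float | None:
    # Denominator: eligible cases minus documented exclusions/exceptions.
    denominator = [c for c in cases if c["eligible"] and not c["excluded"]]
    if not denominator:
        return None  # metric is not reportable with no eligible cases
    # Numerator: denominator cases meeting the criterion, e.g. a guideline
    # medication prescribed at discharge.
    numerator = [c for c in denominator if c["met_criterion"]]
    return 100 * len(numerator) / len(denominator)

example = [
    {"eligible": True, "excluded": False, "met_criterion": True},
    {"eligible": True, "excluded": True,  "met_criterion": False},  # documented contraindication
    {"eligible": True, "excluded": False, "met_criterion": False},  # fallout
]
print(metric_rate(example))  # 50.0
```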
And in 2009, I took on the ICD registry, which is now the EP device implant registry. And then in 2012, I moved to doing it full-time. And then I was asked to take on the chest pain MI, or rather the ACTION registry, as it was then. With that said, I learned that each registry that I was doing wanted me to report the ejection fraction. And what I found, the hard way, is that not all data definitions are created the same. So the ICD registry wants you to report, if you're given a range, the low end of the ejection fraction. And the ACTION registry at the time wanted you to report the mid-range. And as I was doing my cleaning, I can't remember really what brought it to my attention, but I realized that, oops, Donna did it all on the low end. And being type A like I am, I went back an entire year and changed every patient that I could so it would be accurate. But submitting often is a fantastic idea, because if you think about it, you've just read these patients' stories. They're fresh in your head. And so you know that when you go in, like, my process is: I gather the information on Thursdays. I get all my data, I get it all put in, and I try to finish it off by Friday. Then I submit it into the DQR. And then the following week, I'll look at it and see where we fell out. I then put this all on a spreadsheet that I send over to my coworker in crime here and let her kind of validate it and look through it as well. But it allows, when I'm looking at that data, that dashboard, and I'm like, there's a beta blocker fallout? I know that we didn't have a beta blocker fallout. So it makes it quick and easy, because you remember it. You've just stuck it in there. But it also then allows you to find that missing data. Now, for any of you who do direct data abstraction into the NCDR, you realize that when you're done with a patient, you hit that quality check, and it will tell you if it's gone through, if there are any errors or warnings. But it doesn't always tell you everything that's missing on that part of it. And after a while, you know, I was clinical. I didn't have any background, really, in doing analytics and spreadsheets. And this was all new to me. And so it took me a while to figure it out. And what I finally figured out was that the DQR was far more than just a pass-or-fail thing, that you could actually click on that pass and then filter it from failure to missing. And voila, you have all this missing data. So I was able to clean up my data and get it all put in there very quickly. And when I do this weekly, you know, it happens in a far shorter period of time. So I will echo it: submit often, submit early. It helps to clean that data and get it good. Shell, can I ask you, how do you use the NCDR metrics to report out to your Q-Centrics partners? Well, thanks a lot for that. Well, as John mentioned a little bit ago, you know, it's very important to know your metrics. They have very different patient populations between the registries and between the metrics themselves. Your numerators and denominators have specific exclusions and exceptions. So you really need to know them, and then read your audience, grasp their level of knowledge. And, you know, at that point, it provides a little bit of an opportunity for educating your audience as well. Most of our partners at Q-Centrics are very engaged in their data. A few, not so much.
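The DQR tip above, filtering the results from failures down to missing fields, works the same way on any exported results file. A placeholder sketch; the file name, column names, and status values are assumptions, not the real DQR layout:

```python
# Placeholder sketch of filtering a DQR-style export down to missing fields.
# "dqr_results.csv" and its columns are assumptions, not the real DQR format.
import csv

with open("dqr_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

missing = [r for r in rows if r["Status"].strip().lower() == "missing"]
for r in missing:
    print(r["Episode ID"], r["Data Element"])
print(f"{len(missing)} missing fields to clean up before the next submission")
```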
But, you know, there's one common thread throughout all of them, and that is that they really don't have the staff to perform the abstraction and then someone to help make sense of it all and point out the significance of the data. So using the NCDR dashboard, I put together a scorecard and it's really a place where our partners can go or I call it a one-stop shop, if you will, and it outlines all of the NCDR metrics for that specific registry. The NCDR executive summary metrics, I should say. If it was all of their metrics, it would be pages long. But the executive summary metrics are all on one page. And then next to those, I have the 50th, the 75th, and the 90th percentiles. And then just to the right of that, I enter, or I used to, now I'm a clinical education manager, so I've got other people doing this, but we enter the partner's data for those four quarters at least. And if they are trending in a negative fashion, we might go back even further just to point out that this trend has gone on for a bit of time. So we then summarize their results. And with a new partner, I always begin by comparing them to the 50th percentile as Kristen mentioned a little bit ago. And then once they've achieved a trend, a positive trend, for at least four quarters, we might discuss bumping them up to the 75th percentile. And then finally, the 90th, or if they want, the 100th. That's their choice. And then that gives them goals to attain. And this often ends up, at times, confusing some of our partners because some of their metrics are off. And so what I mean by that is they're below the 50th percentile. And I do caution them that in many, many cases, and you want to be aware of this, if you have a small patient population for any particular metric, let's say there are only two patients that fit into a metric and one is a fallout, well then you're probably at 50% there, if I did my math right. So they don't quite like seeing that 50%. So you really do want to make them aware that hey, there's only two patients in this, and that does make a difference. Scorecarding our partners really does give them goals to attain. And it's nice when they have that sense of achievement, but I always point out to them that the end goal is really that we've improved their patient care because after all, it's their data, not ours. And so it's very important to make them understand that yeah, your numbers look great, but guess what? Your patients have better overall care. And you can actually, there's an example of one of our scorecards in the poster presentations, so you can look for that. It's in the CATH PCI registry track. So Lisa, so would you give us an example of how knowing the metrics and the difference between the registry metrics, for example, guideline-directed medical therapy discharge medications, how does that address the total care of the patient? And furthermore, how does that knowledge build trust in the relationships between yourself and those that you report to? That's a very good question. It's got a lot of parts to this question. You need to know your metrics. You need to know what they want, and you need to understand each of the different registries. And it all goes back to the other question that I had, is when you report these metrics out to your senior leadership and being the physicians and everyone else as well. The metrics are based on best patient care. So what's best for the patient? 
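The scorecard approach just described, comparing each executive summary metric to the 50th (and later 75th or 90th) percentile and flagging small denominators like the two-patient, one-fallout example, can be sketched as below. The percentile value, threshold, and numbers are invented for illustration, and the sketch assumes a higher-is-better metric:

```python
# Illustration only: compare a facility's metric result to a benchmark
# percentile and flag small denominators, as in the scorecard described above.
SMALL_N = 10  # arbitrary threshold for "interpret with caution"

def scorecard_row(name: str, numerator: int, denominator: int, pct50: float) -> str:
    if denominator == 0:
        return f"{name}: no eligible patients this period"
    rate = 100 * numerator / denominator
    caution = " (small N, interpret with caution)" if denominator < SMALL_N else ""
    goal = "below" if rate < pct50 else "at or above"
    return (f"{name}: {rate:.0f}% of {denominator} patients{caution}, "
            f"{goal} the 50th percentile of {pct50}%")

# Invented numbers: only two eligible patients, one fallout -> 50%.
print(scorecard_row("Aspirin prescribed at discharge", 1, 2, pct50=97.0))
```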
So when you're looking at the guideline-directed medications, that means that if you don't give an aspirin, that's not good care. So you notice that one aspirin, even if you have a high N, is gonna take you down into the red. So you have to explain this to your physicians, that they have to document these things. Everything needs to be documented, especially when you're looking at the guideline directives. Our patients depend on us giving the best care. The ACC and the NCDR set up all the metrics so that they're showing what's the best care for the patient. If you present that to all your physicians, your leadership, nursing, everybody who touches the patient, that's how they have to think about this. So when they're looking at a patient who may not be a STEMI but might be a low-risk patient over here, they still need the same medication that you're gonna give this high-risk patient. If they can't take that medication due to creatinine being high and renal failure or something like that, that needs to be documented. And as Donna has often pointed out to me, just because somebody says they're allergic to one of the statins, there are many others, Lisa, that they can try. So it's that thing that you have to bring back to the physicians, and that's where you have to gain their trust. Because if you can call your doctor or call your nurse practitioner or whoever and say, hey, look, Mr. Smith's going home on something but he states he's allergic to this, can we maybe try him on this? And then you also have to start looking at your dosages. In one registry, they're fine. In the other registry, they may not be fine. So this is where knowing your registry, and knowing that you have to have a good collegial relationship with your physicians and your APPs, comes in, so that you can make that phone call and say, hey, dude, we gotta talk. And so when you do that, it builds that trust. They're gonna believe in you. And then when you send out the numbers, they're gonna understand it. And then when you send out your aggregated numbers to them later, you'll have one of your doctors who's looking at their numbers, because, if you don't realize it, in the cath PCI you can pull up your provider numbers and send them to them. And if anybody is competitive, it's cardiology. And if you show it to them, they're gonna be like, well, how come he got like 100% and I only got 50? You didn't give an aspirin. So when you start doing that, it kind of brings that competitiveness out, but they're also gonna start reaching out to you and asking the question, what am I doing wrong, and how can I do this, what is best for the patient? And always keep that in mind, that everything that we're doing here in the NCDR is what's best for our patients. So David, do we have any questions? Great information. Thank you guys very much. We do have some questions. I just wanna make one clarification, though: anytime we at the NCDR and ACC talk in terms of timeframes, 11:59:59 p.m., whether it's a call for data, whether it is any kind of meeting or webinar we have, that's always Eastern time. So just make sure you know that when we're posting it, we're posting everything in Eastern time. So we do have a question from Hazel. She wants to know, how do you get your doctors engaged in monitoring AUCs? Most are not interested in discussing these metrics as much as the rest. I'll tell you, that's the bane of my existence. And we are a tertiary type of hospital, so we get a lot of people who drive in from five, six hours away.
So my doctors are, they're like, well, they're already here, let's just cath them. And you're like, but you didn't do any testing. Trying to get them engaged, I'm not gonna sit here and lie and blow smoke around; it's gonna be hard to get them engaged, because the doctor is gonna do what's best for the patient. A lot of your doctors are gonna look at these people. They have the clinical judgment, and they're gonna do what they feel is necessary for the patient at that time. It's kind of one of the areas that I'm learning, and my director's learning, and some of our senior leadership is learning. We're gonna have to deal with that slowly and individually, because we cannot expect somebody who's just driven six hours to come in and hear us say, that's great, we think you do have this problem, we're gonna send you home and bring you back later for a stress test. They're gonna take them to the cath lab and they're gonna look. I don't know if any of you guys have any other options that you do. I mean, I used to be an abstractor, I used to be one of you guys, and I do have to say, to get that engagement, I targeted, and this is gonna sound ageist, but I'm not, the younger ones, because they're fresh, right? They're eager, they're just learning, and they are ready to get on those committees and be the steering committee chair and make change. I think at the time where I was in the hospital, it was, I don't wanna say the older ones, but the seasoned cardiologists, they're almost out. They're kind of going into retirement. So they weren't as engaged. So that was my trick. Yeah, I think to that point, one of the things that we did was we added the AUC to our scorecard. So besides the executive summary metrics, we also have the AUC metrics at the bottom of the scorecard. And we do delve into every single case that comes back inappropriate. And we also then go back to the facility and engage that partner and the cardiologist, interventionalist, director of the cath lab, whomever, on why that case came back as inappropriate. And many times, as you said, it's because they didn't have non-invasive testing done or they weren't on max medical therapy. So it's very important for you to know your AUC. And I would encourage all of you to go into the documents found on the NCDR website and look at those AUC, not just for your acute chest pain patients, but also, more importantly, for your stable patients. Because, in the many years since 2005 that I've been doing this, I have found that it's the stable patient that really does come back as inappropriate most often. So those are the ones you really want to narrow down and talk to your facility about and say, hey, look, these metrics are here for a reason. We are looking at the safety of the patient because, let's face it, you put a patient on the table, you stick their groin or you stick their radial artery, and there's risk, right? So you want to do everything you can before you put that patient at risk. And that means stressing them, putting them on max medical therapy, as well as taking into account their other comorbidities. So I do find that the AUC metrics are very important when we're talking with our partners. Thank you, guys. We are fast approaching the top of the hour, but I think we can answer one more question real quick if you keep it kind of brief. Christy wants to know, they're always interested in benchmarking factors such as annual abstraction volumes per full-time employee.
And she wants to know if anybody is willing to share their average annual abstraction per FTE in under a minute. And the answer is no. While they're thinking, I did want to add something, David. When you're talking about AUC and the guidelines, not every patient is going to fit into that neat box. You have to remember that. There will be patients that don't meet the guidelines. There will be patients that cannot be mapped to the AUC. And the key thing with that is to make sure your physicians, your NPs, whoever the clinician is, are documenting accurately why they made the decision they made to implant a device or to do a procedure outside of the guidelines and outside of the AUC. And if you read some of these papers, those statements will be within those papers. It's up to the discretion of the physician. So you have to remember that as well. Just like Lisa said, it's really that the safety of the patient comes first, and what is the best avenue for that patient. Thank you, Denise. Thanks to all of you for attending this session today. I just want to see a raise of hands: did anybody learn anything valuable, take away anything valuable, or learn things that you didn't know? So good. We're kind of hitting the vein of what you guys are wanting, hopefully, in terms of information that we're sharing. So that's important to know. We really appreciate your attendance. We also are gonna have a session this afternoon from 1:15 to 2:15, and it's gonna be different information and different topics. So make sure you come back.
Video Summary
The video is a panel discussion about the NCDR (National Cardiovascular Data Registry) and its metrics. The panel includes experts from different organizations who discuss various topics related to data submission, communication, engagement, and reporting. They emphasize the importance of understanding the metrics and guidelines in order to improve patient care. The panel also mentions the availability of detail lines on the dashboard and the significance of submitting data frequently to identify and address any issues promptly. The panel members share their experiences and insights on engaging doctors, department heads, and administrators in monitoring metrics, reporting outcomes, and implementing quality improvement initiatives. Overall, the discussion highlights the value of data-driven decision-making and the collaborative efforts to improve healthcare outcomes.
Keywords
NCDR
National Cardiovascular Data Registry
metrics
data submission
communication
engagement
reporting
patient care
dashboard
quality improvement