Use of NCDR Data in Physician Contracts
Video Transcription
Good morning, everyone. Welcome to this morning's session on the use of NCDR data in physician contracts. My name is Callie Kalna. I am with the NCDR, and our speakers today will be Amy Lancey and Dr. Charles Lambert. Please use your app to ask any questions; we will have a Q&A at the end of the session. And I'm going to pass this off now to Amy to kick us off.

All right, good morning. My name is Amy Lancey, and I am the heart and vascular program manager for CommonSpirit Health's Mountain Region. Prior to taking this role back in September of last year, I had been the program coordinator for St. Anthony Hospital in Colorado since 2013, responsible for data abstraction for the Chest Pain MI, CathPCI, LAAO, and AFib registries, as well as the STS adult cardiac cases and Get With The Guidelines heart failure. Currently, I oversee the Mountain Region of CommonSpirit's cardiovascular programs and work with the coordinators at each of those facilities to ensure they're functioning at their highest level. My background in nursing includes four years in a medical ICU/PCU, and I am an affiliate faculty member at Regis University in Denver, working in the nursing skills and simulation labs. I have no disclosures.

CommonSpirit Health has more than 2,200 care sites in 24 states, coast to coast. We have over 35,000 providers across the system of hospitals and clinics, and over 45,000 nurses. The Mountain Region itself has over 21,000 caregivers, 240-plus physician or provider practices and clinics, and 20 hospitals: 12 in Colorado, three in Kansas, and five in Utah. You might have heard the name Centura Health; that merged into CommonSpirit last year. So before we could really start looking at the key stakeholders and the metrics within NCDR, we first had to review how our data was being collected and how it had been collected. This gives the steps in the ideal process.
So you start with the clinical work at the patient's bedside. Data is abstracted from the EMR and then validated by the coordinators and the physician experts, and that's a continuous cycle. We submit the cleaned-up data to the registries, and that data becomes available when it's published. From there, it is disseminated to different parties: CMS and private payers; public domains for marketing and physician recruitment; and, internally within CommonSpirit, our quality committees and councils for QI and PI projects, as well as our CommonSpirit physicians for research and the quality incentive payments we'll talk about today.

This slide helped us take a deeper look into the data abstraction process along with the quality improvement process, which should be a continuous cycle. Data is captured in the EHR during the clinical encounter and then abstracted; we currently outsource our abstraction to a third-party company. Our coordinators send a list of patients for abstraction, depending on the registry, on a weekly basis. The third party abstracts those cases and informs us of any fallouts they see during abstraction. Our coordinators then review and adjudicate with our medical director, and cases are sent back to the abstraction company for correction if the issue was just misdocumentation. From there, we utilize the benchmark reporting to set our standards. That data is then analyzed by our clinical councils, and you can see we have several different clinical councils. We utilize the performance improvement and review to achieve center of excellence designation, which is something specific to CommonSpirit for various registries, and hopefully to increase revenue.

About eight years ago, our cardiology group signed an agreement with three of our hospitals in the Denver metro area to be the sole cardiology group practicing at those hospitals.
Since then, a joint agreement was created, and it was determined the cardiology group would be paid out on a quarterly basis on pay-for-performance metrics. Unfortunately, those metrics had not been updated since 2019. So when our new CV service line medical director and our CV service line director came on board last year, we were tasked with redeveloping the joint agreement and the incentive metrics. First, we had to determine who our key stakeholders were for our joint operating committee: the cardiology CMO, the cardiovascular service line director, the cardiovascular service line medical director, and myself. Additional providers are included at our quarterly meetings. As a joint committee, we reviewed the metrics from the NCDR dashboards and decided to select metrics where we had consistently lower performance but that would have attainable process changes. The list of metrics was reviewed by the cardiology group, and they picked the ones they felt were the most attainable to achieve for 2024. We decided to select no more than five metrics to start this new agreement with, because we wanted to show CommonSpirit was not only going to be a good business partner by paying them out on time, but also by selecting mutually agreed-upon metrics where we would see impactful outcomes for our patients. With that, we selected two metrics from CathPCI: metric 4449, PCI for STEMI within 90 minutes, and metric 5000, composite major adverse events for select PCI patients. These were both selected because we have a pretty simple process in place for reviewing and adjudicating any fallouts in a timely manner. Then we selected two from our AFib registry: metric 2342, proportion of patients with a completed AFib effect on quality of life questionnaire, and metric 2792, patients undergoing an AFib ablation who experienced any adverse event intra- or post-procedure and prior to discharge.
The first AFib metric was selected because our cardiology team felt it was a simple process change. We were not doing it, so they determined the patient would be given the quality-of-life questionnaire in the pre-op area before their ablation, and they would fill it out. The cath lab team was educated to treat the questionnaire just like a consent form: if it was not completed, they were not to bring the patient back to the lab. From there, the completed questionnaires are put at the nurses' station of the pre-op area; our two coordinators, who are here, pick them up on a weekly basis, and we scan them into a shared Google Drive for our abstractors to go back in and capture. There were a few additional metrics we had to put in for purchasing compliance to align with CommonSpirit. We have one purchasing compliance metric for cardiac rhythm management, where they must be utilizing the dual vendors 90% of the time, and the same percentage for stents and balloons. Once the metrics were selected by the cardiology group and reviewed by the committee of key stakeholders, they were sent to the valuation firm to determine the weight of each metric. And since we were trying to start off on a good, clean, easy foot for the first year of the new agreement, it was determined all five metrics would be weighted at 20%. We have a spreadsheet the CV service line director uses to determine the tiered payments; there's a tier for each of the percentiles, 50th, 75th, and 90th. You can also see on the screen the annual incentive and what it would be on a quarterly basis, so they know what their max payout could be. The one thing with metric 5000: the analysis assumed the cardiology group would only be compensated if their performance was better than historical performance.
And so you can see for quarter one of this year, they did not meet that criterion because they were not performing above their historical performance, so they received a $0 payout there. With the proportion of AFib patients with the quality-of-life questionnaire, that metric enrolled cases starting February 1st, so there is a little bit of a lag, but they received the payment based on the metrics of the patients that were submitted. And for our purchasing metrics, this is really all or nothing: if they're within that 90% compliance, they get the full amount; if they're not, they get $0.

From there, to help us track and trend the metrics, we created this incentive metric dashboard. Prior to the submission deadline, I pull the information from the dashboards and work with the coordinators at each facility to ensure the outliers currently being reported are true fallouts and not just missed abstraction. If that is the case, then we are able to go in and make the change, and we'll check it the following Monday when it re-aggregates. Once it's published, this data is updated with any changes and shared with our provider group at the quarterly meeting. I also include patient identifier information so they are able to go back and review which patients caused the fallout. For our review process, the dashboard is reviewed quarterly at the joint operating committee meeting, and a discussion is had. Once the data is entered into the worksheet by the CV service line director, the payment is issued to the cardiology group once the data has been published. So moving forward, we found there were some significant concerns coming out of metric 5000. There was a feeling that it may not have been the best metric for an incentive payment, even though it was selected by them.
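To make the payout mechanics concrete, here is a small sketch of the quarterly incentive logic described above: equal 20% weights across five metrics, percentile-tiered payments, a historical-performance gate on the adverse-event metric, and an all-or-nothing purchasing-compliance gate. All dollar amounts, tier fractions, and rates here are hypothetical, not CommonSpirit's actual figures; only the structure comes from the talk.

```python
def tiered_payout(percentile: float, quarterly_max: float) -> float:
    """Pay a fraction of the metric's quarterly max based on percentile tier.

    The 50th/75th/90th tiers match the talk; the payout fractions for the
    lower tiers are invented for illustration.
    """
    if percentile >= 90:
        return quarterly_max          # 90th-percentile tier: full payout
    if percentile >= 75:
        return quarterly_max * 0.75   # hypothetical 75th-tier fraction
    if percentile >= 50:
        return quarterly_max * 0.50   # hypothetical 50th-tier fraction
    return 0.0

def compliance_payout(compliance_rate: float, quarterly_max: float) -> float:
    """Purchasing metrics are all-or-nothing at 90% dual-vendor compliance."""
    return quarterly_max if compliance_rate >= 0.90 else 0.0

def adverse_event_payout(current_rate: float, historical_rate: float,
                         quarterly_max: float, percentile: float) -> float:
    """Metric-5000-style gate: pay only when the adverse-event rate beats
    the historical baseline (lower is better)."""
    if current_rate >= historical_rate:
        return 0.0
    return tiered_payout(percentile, quarterly_max)

# Example: a hypothetical $100,000 annual incentive split evenly across
# five metrics gives each metric a quarterly max of 100_000 * 0.20 / 4.
quarterly_max = 100_000 * 0.20 / 4
print(tiered_payout(78, quarterly_max))                          # 75th tier
print(compliance_payout(0.88, quarterly_max))                    # below 90%: $0
print(adverse_event_payout(0.031, 0.028, quarterly_max, 80))     # worse than baseline: $0
```

The gate on the adverse-event metric is exactly why the group saw a $0 payout in quarter one: the tier is irrelevant unless performance first beats the historical baseline.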
So we are currently reviewing the patients in that metric and determining how we're going to move forward with it. We'll have an annual review this fall to review the current metrics and determine whether we need to retire any of them or change the weights. If we do retire any of the metrics, we will select new metrics for 2025. Once that is complete, we will submit them to the valuation firm to review, approve, and provide us with the new weighting. So mine is short and sweet, and I hope that helps you all learn how to review your data to utilize in physician contracts. Thank you.

All right, do you want me to take off? Thank you. My God, Amy is here. I've got nothing to disclose. So my name is Chuck Lambert. I'm a cardiologist. If you went to the opening plenary session, you heard Dr. Bott talk about when she was a beginning fellow. I was halfway through my career when she was a beginning fellow, so I was doing this kind of stuff before the NCDR even existed, back when, if you had an anterior heart attack, your mortality was 20%, and all we did was put you in a bed and give you nitroglycerin and oxygen, if any of you remember those days. My position is as a medical director for AdventHealth's West Florida division. A little bit of an introduction: AdventHealth is part of the Adventist health system. If you've been around the US and seen various labels on hospitals and providers, there's an Adventist health system, there's AdventHealth, and there are places like Loma Linda, which are part of the Adventist health system but not labeled as AdventHealth. It's all part of, or connected to, the Seventh-day Adventist Church. AdventHealth exists in these states. This slide is a little bit dated; we have about 60 hospitals now. It was rebranded from the Florida Hospital system several years ago when we changed CEOs for the corporation. I'm from the West Florida division.
Down there, these are the various divisions; it's in orange at the bottom. The West Florida division spans from Ocala, just south of the University of Florida, all the way down, essentially, to the Everglades at Sebring, where there are three hospitals. There are 14 hospitals now in the West Florida division, and, I think, 36 hospitals within Florida that are part of the AdventHealth system. The biggest cardiovascular provider as a system in the United States is actually the Orlando contingent. They cheat, because there are eight hospitals operating under one provider number, but it's a huge, huge volume of cardiovascular patients within our system. If you went to the plenary session, you know Dr. Bott talked about AI, so I decided to start this talk by asking ChatGPT: tell me about using quality data in physician contracting. And I urge you to do it, because it's amazing the answer you get. It's about four pages long and better than I could have written. These are just the highlights of that, and I'm not going to talk about all of it, but it's amazingly accurate, even the emerging trends. The reason we do the contracting connected to NCDR is, of course, to engage physicians, physician alignment, and I'll talk a little bit more about why we use all the vernacular for process improvement and doctor contracting, but it's basically to get the doctors to focus on quality. In the background, as Amy said, increased volume. It should be targeted at value, which is really the ratio of outcomes to cost. Some hospital systems have been doing that for a while, Intermountain Healthcare, for example. You have to have clean data, and data that you can apply. I've listed here different models for physician alignment.
Amy talked about a pretty classical one where you have a group of physicians, you have a contract associated with them, you put some NCDR metrics in there, maybe some process metrics, some other things, come up with a fair market value (FMV) for those different metrics, weight them, and they get a financial reward if they do what you ask them to do. It's more difficult in other types of physician alignment models, and you're probably familiar with most of these. If you're an ACO, it's much more difficult than in an employment model. A CIN, a clinically integrated network, is very loose physician integration. Various PHOs, and I'm sure you've seen these, are also pretty loose, just because of the heterogeneity of the physician population you have to deal with. Employment seems like it would be the best. The most complicated employment model I've ever been involved with was a university employment model. Once, we had a committee at the University of Florida tasked with redoing the physician alignment and employment contract structure for all physicians, about 6,000 of them within the Florida system; that thing came out two inches thick after three years, but it was an employment model. What Amy was talking about is more classic and pristine; the worst are academic, the best are sort of one-to-one. Although I'll tell you, today I was on the phone with two doctors, okay? One was, and this is 100% true, a young interventional cardiologist who's part of one of our groups. We have employed physician groups, not at all of our hospitals, and we'll get into this in a minute, but most of them. So an employed cardiologist who's been there for four years, young guy, makes a ton of money, contract's up for renewal, right? Has a contract just like Amy's folks. He calls up, I guess, about six months or so ago, and tells the president of the physicians group he wants more money and no quality metrics in his contract, right?
And he says, listen, all the doctors in Orlando, they don't have them in their contracts. And yes, they do have them in their contracts. The bottom line is we're not renewing that guy's contract, for two reasons. Number one, that statement: he doesn't want to use any of his own specialty society's metrics; he thinks he's above that. And number two, a couple of key NCDR metrics, like major adverse cardiac events for all-comer PCIs, are out of range with the rest of his practice folks. And there are other personal things. I've got two of my people sitting right up here, and in fact, we'll ask them questions in just a minute; Tracy there spoke at this meeting last year. But his contract's not going to be renewed, and it's all pretty much based upon NCDR metrics. The other phone call was from a young man I'm recruiting, who has all those metrics that we hold dear in his contract and accepts them as the Bible. So one's going to be out, and one's going to be in, based upon what we're talking about here today. Co-management, which is next to last here, I'm going to talk a little bit more about; it's a more complex model. Some of you have probably been involved with co-management models. If you have a bunch of employed doctors, it's fairly easy. If you have lots of hospitals with employed and non-employed renegade physicians out there, but you still want to shape their practice patterns, co-management is fairly widely used. It's pretty easy if it's in one hospital. I'm going to show you how we implemented it, just for demonstration, in seven hospitals using NCDR metrics, the outcomes of that, and how we transitioned it recently. There are hybrid models of all sorts. And then, of course, more recently, JV models, where you're joint-venturing, let's say, an ASC. Those are usually tied to quality metrics as well. So an employment model for a contract, as Amy outlined, should be simple, right?
So at least in our physician groups, those gray boxes are pretty much standard elements that go into a physician contract. Inpatient data, which may be NCDR data or other data; we all have data sources that hospitals use to report to the government. AdventHealth uses Premier data, which is really an administrative data set with some clinical data in it. Your CMOs, usually under med staff procedures, will gauge the performance of their physicians using that type of data, sometimes NCDR or STS or other clinical databases. But the administrative data come off billing sheets; they go to Medicare; that's where CMS grades you. For me, totally useless. We use NCDR data almost entirely in physician contracts. Procedural metrics and operational metrics we usually tie to percentile rankings, which, if you really have a problem with fair market value assessment of your contracts, is much cleaner than tying them even to historical values. If your historical values are really high and you're giving money away, as perceived by lawyers, you can get into issues with Stark and other things. Co-management is really a model that is widely used, developed years ago. Physician interests, and I'm not going to read all these, are to maintain autonomy. We have physicians in some of our hospitals who will never get in bed with a hospital, no matter what, because they've been burned in the past. They want autonomy, even though more and more cardiologists and other CV specialists are being employed these days: 60%, 65%, depending upon where you are. And hospital interests, which you're all familiar with, are to improve quality, really improve value, and, of course, grow your service line. In our implementation, when we first did this as part of the Adventist health system, we had a previous co-management model in some of our hospitals before that.
We wanted to come up with a service line model that would engage employed and non-employed physicians of all ilks, and you name it, we have it, across seven different hospitals, involving roughly 100 doctors: cardiac surgeons and cardiologists of all subspecialties. We put it together primarily as a performance improvement vehicle, not a contracting agency as Amy was talking about, although we've been down that route; that's really a corporate thing within AdventHealth, and it's not really amenable to contracts per se, because it's been corporatized. Doing that in seven different hospitals in seven different parts of Florida took about a year and a half. To get all those physicians, who are not in the same practice, together in a contractual entity, we set up an LLC, in the middle of this slide, called the Physician Co-Management LLC. That requires, obviously, a bunch of legal machinations: setting up the corporation, getting a joinder agreement together, getting all these physicians to form a board and sign a contract. And contracts have to go to each hospital. The basis for payment, on the left side of the slide, is a base management fee, which is a time-based payment to get, let's say, your cath lab director at Sebring Hospital to come to meetings and do the work you want done, just as Amy's model showed you. It's an hourly payment, a time-card-based thing. The incentive fee usually can't be more than 1.5% of the net income for your service line. So whether you're a billion-dollar service line or a half-a-million-dollar service line, that will get through an FMV valuation without too many red flags going up. And then, at the end of the day, just as in Amy's model for an employed physician group, you assess your metrics going forward. We used, and continue to use, a percentile ranking where we can.
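The two-part fee structure just described, a time-card-based management fee plus an incentive fee capped at roughly 1.5% of service-line net income, can be sketched as follows. The hourly rate, hours, and income figures are invented for illustration; only the shape of the calculation comes from the talk.

```python
def comanagement_fees(hours_logged: float, hourly_rate: float,
                      metric_score: float, service_line_net_income: float) -> float:
    """Total co-management payment: base fee plus capped incentive fee.

    metric_score is the fraction of quality metrics achieved, in [0, 1].
    The 1.5%-of-net-income ceiling on the incentive fee is the FMV guardrail
    mentioned in the talk; everything else here is hypothetical.
    """
    base_fee = hours_logged * hourly_rate              # time-card-based management fee
    incentive_cap = 0.015 * service_line_net_income    # FMV ceiling on incentive
    incentive_fee = metric_score * incentive_cap       # scale by metrics achieved
    return base_fee + incentive_fee

# Example: 40 hours at a hypothetical $200/hr, 80% of metrics met,
# on a $10M-net-income service line.
print(comanagement_fees(40, 200, 0.80, 10_000_000))
```

The point of the cap is that the incentive scales with the size of the service line, which is why the same structure can clear FMV review for both very large and very small programs.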
And payment comes from that model. To embellish the whole co-management perspective a little bit more: when we started this, we set up groups for patient experience, education, research, business development, et cetera. Really, what we're talking about here is just the quality improvement portion. So you can't see it, you can't read it, but I put up a spreadsheet because Amy put up a spreadsheet; I saw her slides before she saw mine, right? So here are the metrics. This is way back in time, when we started this whole thing, and I'm going to show you how it evolved. You can see these things on the left side of the screen, and they're the usual metrics that we all choose. Some of them are STS metrics, some are NCDR, some are our own data. As I mentioned before, we changed CEOs. Our new CEO for the whole company is a finance guy, not a classic hospital administrator, so his targets for improvement across the whole US are CMS five-star, Leapfrog, the usual kind of stuff. These AHS metrics, Adventist Health System metrics, are out of that bailiwick of performance improvement data. And you can see PCI in-hospital risk-adjusted mortality, discharge medications, the 90-minute time frame for STEMI, PCI risk-adjusted AKI. I saw a session here on that; it never freaking goes away. Nobody can seem to give anybody any IV fluids, no matter how much you beat them. And then these things on the bottom. Another aside for Adventist Health System: we were the largest client of Cerner until about two, I guess two and a half, years ago. Then we decided, not me, we decided to become the largest client for Epic. So over this two-year period, and I've co-chaired this thing, all of our hospitals have gone through that tremendously difficult transition. And a lot of these Cerner things down here, like PowerPlans, which are order sets, are called something different in Epic.
And standardizing, I don't even want to get into this, maybe in the question part: standardized reporting. Forget artificial intelligence; just getting the doctor to put the TIMI flow in a box, I guess that's real intelligence, I don't know if it's artificial. Let's do that first before we ingest the data. We've got an AI platform, Viz.ai; it's ingesting a bunch of garbage right now from Cupid SR. Nevertheless, I'm off track now. And then there are reading compliance things. One of the bonuses we gave this group of 100 physicians was the ability to read all the electrocardiograms, tens of thousands a day of these things, and we set up a company to bill for them. So that's a huge incentive for these guys, but we asked them to read them within a day. A huge, huge thing to do that without fellows. So all these were the initial metrics. Each of these columns is a hospital, like FHC, Florida Hospital Carrollwood, or Florida Hospital North Pinellas, and the number of cases they had. So they get a score based upon individual hospital performance, and then there's a regional score over here in the region column: the same metrics, only adjusted for their volume. So if AdventHealth Tampa has 1,000 cases and does poorly, it drags the entire model down. So that's adjusted for volume, and they're graded on the 50th, 60th, 75th, or 90th percentile. Under our new CEO, our corporate goal for everything that we do is to be in the upper quartile, no matter what it is, so we structure our contracts that way. And this is what happened based upon money. This is way back in 2015 to 2018. And there are meetings that happen, put it in the back of your mind, every month with medical directors at every hospital, who are paid on an hourly rate like we talked about earlier. It's all fed back. We hired Tracy right there at the end of this whole period, when we transitioned to a primarily ACC-based model, which I'll get into in a second.
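The regional score just described, the same metrics rolled up across hospitals but adjusted for volume, is essentially a case-count-weighted average. A minimal sketch, with hypothetical scores and case counts, shows why one large, poorly performing site drags the whole region down:

```python
def regional_score(hospital_results: list[tuple[float, int]]) -> float:
    """Case-count-weighted average of per-hospital metric scores.

    hospital_results is a list of (score, case_count) pairs, one per hospital.
    A big hospital's score moves the regional number far more than a small one's.
    """
    total_cases = sum(cases for _, cases in hospital_results)
    if total_cases == 0:
        return 0.0
    return sum(score * cases for score, cases in hospital_results) / total_cases

# Hypothetical example: one high-volume site performing poorly outweighs
# two smaller sites performing well.
results = [
    (0.60, 1000),  # large hospital, poor performance
    (0.95, 100),   # small hospital, strong performance
    (0.90, 150),   # small hospital, strong performance
]
print(round(regional_score(results), 3))
```

With an unweighted average these three sites would look fine (about 0.82); volume-weighting pulls the regional score down toward the big site's 0.60, which is the behavior the speaker describes for AdventHealth Tampa.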
And you can see they started with no incentive, and then increased it by about a log, and then it kind of plateaued. So compared with 2015 to 2018, we did get performance improvement in most of these metrics, but we had kind of squeezed the orange as much as we could; we weren't really moving the needle anymore. This was about the time the whole ACC performance improvement program for institutions was congealing, if you will. And we went to our leadership and decided to transition our performance improvement model for the West Florida division, not the whole company, from a co-management model to more of a service line model. Now they call them all institutes. In Florida, you can throw a rock and hit an institute; sometimes it's in a big hospital, sometimes it's in a trailer. It just depends. Buyer beware. But as you can see at the red arrow, the quality incentive program under co-management transitioned into the ACC institutional performance improvement accreditation program. The way we initially structured that: at that time, we were up to 12 hospitals or so, and we decided that every hospital, no matter what size, would have three ACC accreditations. Some of them didn't even have abstractors; we would acquire them from Central South, which is Orlando, and start all over again. Tracy can answer questions about that; that's her right there. They all had to have heart failure accreditation, they all had to have cath lab accreditation, and they all had to have Chest Pain MI. At that time, we had two open heart programs within our division. We have 16 of them company-wide, and that's a different story; we've started a separate process improvement program for them, called the CRC, or Clinical Reliability Council. Those two open heart programs in the West Florida division, in addition to that, added TVT and LAAO. We also had EP accreditation.
We're adding, I think, two more open heart programs shortly, so they'll be the same. So with that whole thing, and I'm going to sum up here. This is Tracy up here; she talked here last year. The way we do this now, we're not a co-management process anymore. There's one residual of the co-management program, and that's the EKG-reading contingent of doctors. To be perfectly honest, the reason we retained that was to keep a little bit of physician incentive in the outlying areas, where they did have problems; they still have problems with reading times and things. Plus, it's a financial incentive for them. For our employed cardiologists, just as in Amy's model, their PI incentive is based on the NCDR metrics that you saw. We do have operational metrics; we get those out of the NCDR database as well. The non-employed medical directors, so these are private physicians, are incentivized by individual contracts that come out of the division office. All hospitals have three directors: an invasive director, a non-invasive director, and what we call the accreditation director. They're kind of iffy directorships, but they all have an hourly rate, and they're supposed to come to meetings; that's an issue currently that Tracy's going to figure out how to fix. Cardiothoracic surgery, and this is a cardiology meeting, has a quality incentive based upon STS data. The whole company finally has the STS itself giving us digital data. We have a group of analysts at Celebration Hospital in Orlando who put all of our open heart programs in a very, very nice Power BI database. All the NCDR data is fed into a Power BI database currently; it's not 100% complete, but it's getting there. And all the metrics, regardless of what model they're in, employed, non-employed, surgeons, non-surgeons, we haven't gone to vascular surgery yet, are based upon percentile-informed performance. So with that, I will quit. Is that good? Thank you.
We do have quite a few questions that have come in. The first one is for Amy: are physician-level outcomes reported in blinded or unblinded format? Because it's the same group and just three hospitals, I do put the physician name on the outlier.

This one is for Dr. Lambert: my hospital has had a recent need for a locum interventional cardiologist, and it's negatively impacted our metrics, adverse events. How do we ensure quality with locum physicians? Good question. You ought to embed it; dealing with these locums companies is a pain, but it ought to be embedded in the agreement with the locums company. The problem with the locums folks who come in and go out is, I mean, you're abstracting in the past. So they can come in for six months, trash your program, kill a bunch of people, and then they're gone again. We don't have to use locums. Some of the other hospitals in our company do, and their upfront contracts have the same type of metrics. The key is getting timely data. If you could get your AI ingestion machine to give them data every single day, that would be great, but that's about all you can do, other than having a strong cath lab director and having the manager of the cath lab call you every time something goes wrong, which happens to me anyway, and I'm not the cath lab director. Sometimes you just have to go in there. With this guy I told you I'm not renewing, the tipping point was the cath lab manager calling me; I went in there and watched what he was doing, and that was it. So locums folks are harder to do that with, but the short answer is put it in the contract with the company.

I think this question is for both: how do you leverage performance incentives for physicians when recruiting to work in less desirable geographic areas? Go ahead, Amy. I think for us, it's still so new, and they are a pretty solid cardiology group; they've been there for a long time.
So we haven't leveraged it there yet, but I think once we get it a little more honed in, they will utilize that for their recruitment. Yeah, we have two places in our division that are hard to recruit to. One of them is toward the Everglades: there are three hospitals down in Sebring, Florida, and if you go down Highway 27, this is getting into the weeds, it's way down there. If any of you have any cardiac surgeons who want to come and who like python hunting and alligator hunting, that's where we need them. The pathology down there is ridiculous. There's a huge cardiac program headed by a really, really fine cardiologist, but it's hard to recruit down there. And the other one is, surprisingly, up in Ocala, which is right in the middle of the state, right next to the University of Florida. Great place to live if you have horses. So in my recruitment pitches, I always put the horse thing in there, because most of them have daughters who want to ride horses, right? But it's still hard, and we have a closed cath lab there, which is rare as well. The woman I report to, the senior executive officer for strategy and growth for our division, we just had one retire and another come in, spent half of her career at HCA and the other half at Tenet. When we talk about recruiting and I tell her, hey, it's going to be hard to recruit a surgeon for down there, she says, listen, Chuck, you throw enough money at them and they will come. That's one way to do it, and unfortunately, that's kind of what you have to do. Our, I want to call them HR department, the negotiators within our physicians group, automatically adjust for those two areas. So the dollar value per RVU has to go up, medical directorships are incentivized, and for now, that's what we have to do in Florida. I know it is different in some other parts; we have hospitals all over the place, and they don't have to do that, but a lot of it is geographic. Dr.
Lambert, how do physicians see their own data and how often? So when I started doing this, I was in a different system, running a system along the Space Coast for the University of Florida, and that is a no-self-pay rocket science community. Every patient is an actual rocket scientist over there. It's all along the Kennedy Space Center. The doctors over there were, you know, fat and happy, but they constantly hated each other. I mean, they were not employed. They were all looking at everybody else's data, and I learned then, unlike Amy's group, which apparently is very well-behaved, never to show these guys up on a screen with anything they could figure out. Tracy and I have been talking about this off and on. So I would take their performance data, and we reported it quarterly, which kind of comes along with your abstraction reporting schedule, put it in an envelope, seal it, and hand it to them. And then they would all come to the meeting, because they were all jockeying for position, and they would look up at the anonymous data, and they would figure out who they were. I mean, it was that kind of dysfunctionality. So quarterly, unless they want it, you know, like this guy whose contract is not being renewed, Tracy just gave him his data, because he's looking for jobs, right? But it's quarterly, it's de-identified in public meetings, and obviously we give them their own data. And that goes for the surgeons as well. Amy, thoughts on using metrics that are publicly reported, such as discharge medications for CathPCI, for physician contracts? That was discussed. They do pretty well on their discharge meds, and so they wanted to focus on other metrics that were lower performing than the meds. I'm pretty sure this next question, about recruiting to undesirable locations, has been answered by you, Dr. Lambert, so. Dr.
Lambert, teaching facilities, has it made it more complicated, because it's tough to get clinicians to agree on which supervising physician will own the metric outcomes when there are fallouts? Yes. Okay. Remind me to say some more. So, I've spent most of my life in teaching facilities; you know, we're standing up cardiology fellowships all over, everybody wants to be a statutory teaching hospital now, right? It's for quality, this is my sign. And that's an issue. Let's take discharge medications: who actually, if there's a culprit in the discharge medication cascade of things, you know, all hospitals are a queue, you come in there, hopefully you get out alive, a whole bunch of stuff happens in the meantime. When that discharge med rec is done, and the patient goes home, who's responsible for that, right? Unfortunately, in most hospitals, the cardiologist doesn't really do it, the fellow doesn't really do it, it's the hospitalist who does it, who sometimes doesn't even know who that patient is. There's a patient list that comes out in the morning, you know, with 60 patients, 100 patients on it, and they go through it, and it was a problem with Cerner for us, and I think it got a little worse with Epic, and we have people working on that constantly. But to assign responsibility for that, I mean, every hospital is a little bit different, you know, one size doesn't fit all. In Ocala, it's different than Tampa, it's different from Sebring for us. Some places, there's a very well-run subset of the same hospitalist group, which we use really for the whole division, and the physician who's directing that group is right on it. In others, it's overwhelming for those folks to even understand, and it gets into length of stay, and hold pending cardiology approval, and all that nonsense. But, you know, the ultimate resolution of that is to make the managing physician responsible, however you label it in your hospital.
So start discharge planning eight hours before it happens. If there are any questions, you know, the discharge planner goes to the managing physician, not the specialist, not the subspecialist, and they're responsible. And, you know, for us, with the NCDR, if there's somebody who's had an MI, and they go home without an aspirin, and they come back in, sometimes you have to educate these people: with an MI, you know, what's the mortality of that compared with the first admission? It's 50% higher, and you ask a hospitalist, a lot of them don't even know that. Once you educate them, it gets a little bit better, but I think AI is going to solve it for all of us. That was a joke. Amy, do you inform the other providers what metrics are being selected for review? So they made that choice as a group. They reviewed the metrics on their own during their own meeting, and then came back with the ones that they felt would be the most appropriate, and then as a group, we picked the four metrics from the registry. Can I add something on that? The beauty of the, I'm going to pitch, they don't pay me, but the beauty of the accreditation process is that you choose metrics, and this is the reason we're doing it. You choose metrics that you don't do well on, that are below the median, then you work on those until they improve, and of course you don't turn your back on them, and then you choose another one, and you work on that, right? So it complements a lot of the employment models, or accountable care models, with truly continuous performance improvement. You don't just look at one forever, kind of like we did in the co-management model, but it's the best model out there for performance improvement for CV services by far. Dr. Lambert, how are you dealing with outdated AUC and fallout for ICD appropriateness?
You know, we have a huge electrophysiology program, and the EP guys are very, very sharp, and they're honest, so they put in their notes and their documentation what they're doing things for. You know, if it's an 18-year-old soccer player over at the University of South Florida who just had persistent ventricular tachycardia, and they put an AICD in her, that's not in our box right there, but it's the appropriate clinical thing to do. So we tell them that these are guidelines, right? It's not a rule, it's not a law; they're guidelines. You track and trend them, and if you look at NCDs, National Coverage Determinations from CMS, those are way back here in terms of following a guideline. You know, what we do in the NCDR registries is intermediate, and then what really sharp physicians do who are chairs of HRS committees, they're ahead of us, right? So the correct answer to physicians is you use them as guidelines. You only tie their compensation to things that are solid, that they have control over, and that are not just guidelines but part of evidence-based medicine practice for what they do, right? 100%. Amy? For the AFib Quality of Life QI project, did the physicians also get a document that worked in the ACC professional data portfolio app for MOC credit? No, they have not. And then another question, Amy: does the provider review the outlier, or only the JOC committee? The provider does. So I work directly with the coordinator at each of the sites to let them know, here are the fallouts that we currently have posted on the dashboards. They'll do a chart review just to make sure it's not a missed data abstraction by the abstraction company, and if they're not finding any reasoning behind it, then they meet on a quarterly basis with the cardiology champion to review those fallouts. Dr.
Lambert, any key lessons learned on obtaining buy-in from physicians to engage in cardiovascular performance improvement projects? Well, I mean, incentives, I would say, number one. It depends, of course, on the personality you're dealing with, but incentives help. Incentives can be financial, right? Incentives can be purely professional, I want to say engagement in activities that are useful to them. Most physicians look at hospital meetings as horrible things they have to go to. If you have some physician leadership, wherever you are, hospitals, practices, that can make those meetings useful to them. Usually it's not a problem in academic institutions; it is in community hospitals. But the more you can align incentives, including professional things, journal clubs, residency clubs, whatever it may be, that's the biggest thing I think you can do. Money can only go so far; some of them really don't need that kind of incentivization, and that's good. A professional meeting that is useful, in the middle of what you have to do as a practicing cardiologist as your life buzzes by, that's what you need, and for that you need leadership at a local level. It could be you, could be a physician you work with, rarely a CMO; I say that sarcastically, but it's the truth. If you have a local cardiovascular physician leader who can make a performance improvement setting useful for them, that's the key. Amy, you mentioned that the physicians are re-evaluating the use of the CathPCI mortality metric; what are their concerns? So they feel, and I think we all hear this, that they are treating sicker patients than what the risk models are showing. So they feel that the models are not adequately capturing how sick the patients are that they're treating.
So we're digging into those deaths and really looking at them to determine, is it an age thing? Is it a cultural thing in Colorado, where maybe your grandparent is 87 years old, they're not going to make a recovery, and so families withdraw care? Because the majority of the deaths were comfort care deaths, not deaths in the lab. So we found that there were some patients they probably should not have taken to the lab, and now we're broaching that conversation with them to say, well, if you choose to take these patients to the lab, even though you feel it's a disservice to the patient not to, the data is not going to look good. So they wanted to stop the metric right away, but it was agreed upon for the year. So we're delving into that, but I'm sure that's something everybody hears. Dr. Lambert, in the current model, how are non-employed physicians engaged in quality? So I briefly shot over that. We have hospitals where we have no employed physicians, right? And they are incentivized, same meeting stuff, right? So the accreditation meetings, the directors, for example, for heart failure, or the chest pain folks from the emergency room, cath lab directors, they're required to come and get engaged in those processes by contract. So they're not employed, but they have a contract. And we do that out of our division office, not the local hospital office. We could probably do a little bit better, and we've talked about making that a little bit more forceful. Some of the more remote hospitals have very infrequent meetings; their cardiovascular division or section meetings are once a quarter. Bigger hospitals, every single month, right? But it's important to get them in there, and they're engaged by virtue of that contract. The other physicians who come to those meetings are either interested in performance improvement or interested in seeing their own outcomes.
I mean, some of them are truly interested in how they're doing, and they don't get any feedback otherwise. You give them this stuff that comes out of Premier on reappointment; that's completely useless to them. But having the PCI report card in front of them, that's useful to those guys. What percentage of salary is based on quality metrics? I don't know. So ours, the company per se, is very sensitive about paying physicians, right? And if you Google, I think it's Florida Hospital, and you go back before we were part of them, there were some settlements, like a lot of systems have had, for an aberrant payment somewhere in a whistleblower case. Like there was some dermatologist in South Carolina who was getting paid $100,000 and never came to clinic, and nobody was watching, that kind of crazy stuff. So we have this thing called PCRC, which is basically a bunch of administrators and a legal team who look at every single contract that comes out, and it's corporatized. So it's in our corporate offices. And the amount of money, for instance, I'll give you an example. I'm trying to recruit a new director of cardiothoracic surgery for our Tampa hospital, right? And we're trying to get somebody with a national reputation. I've been interviewing for a year and a half, and I've got a guy targeted. The salary that he makes in Boston compared with the Southeast United States, those are on two different planets. And the guy knows Florida, he knows this. But to try to make those match, even looking at MGMA numbers and adjusting for geography, that kind of stuff, usually what you use are medical directorships or scientific directorships. And those hourly rates, if you look nationwide, for a divisional medical directorship, range from about $223 to about $453, right? What you have to do if you're trying to get a guy like that is, he's got a base salary, and that's going to be set based upon MGMA data. You can add a medical directorship that's rational.
So let's say it's 10 hours, 20 hours, 25 hours a month, and that gives you some wiggle room up to a point. And then it goes back to geography and real estate prices and Boston versus Wesley Chapel, Florida, that kind of a thing. So that's what we use. There are companies that will pay through the nose for two years, and then everything goes back to productivity, which to young physicians is very attractive because they think everything is rosy for the rest of their lives. I won't name those companies here, but that's the way we do it. So try to make it rational. We've all got to pass FMV, fair market value, so it's not going to go above those limits that I just told you. You can do scientific things. If you have residencies or fellowships, you can pay them for that, but it has to be something actually real. I think we are out of time. That was applause for us, right? Yes. Thank you very much. Thank you all. Thank you.
Video Summary
In a session on utilizing NCDR data in physician contracts, Amy Lancey, Heart and Vascular Program Manager, detailed Common Spirit Health's approach. Lancey oversees cardiovascular programs and coordinates data abstraction across their Mountain Region. The process involves clinical work at the bedside, data abstraction from EMRs, validation by coordinators and physicians, and submission to registries. This informs various stakeholders, including CMS and private payers. The program uses performance metrics for physician compensation, choosing them collaboratively with physicians to improve lower-performing metrics.

Dr. Charles Lambert, a cardiologist at AdventHealth, discussed aligning physician contracts with quality metrics using models like co-management and the ACC's institutional performance program. The goal is to engage physicians and improve quality and value within health systems. Lambert emphasized adaptive contracting tailored to each site's needs, leveraging both financial and professional incentives to promote physician engagement.

Both speakers highlighted the importance of tailored agreements and metrics selection, using data for ongoing performance improvement, and maintaining flexibility to adapt to physician and institutional needs. They addressed the challenges of recruitment, especially in less desirable locations, and of managing metrics for locum physicians and in teaching facilities.
Keywords
NCDR data
physician contracts
Common Spirit Health
cardiovascular programs
performance metrics
co-management models
quality improvement
adaptive contracting
physician engagement