19.1 Lesson 11: Terminology
Video Transcription
Welcome to this series of instructional videos directed toward onboarding education for our NCDR international community. The content in this lesson was developed by John Gerout and myself. I'm David Bonner, and I'll be narrating this lesson.

The Frequently Asked Questions, or FAQ, link for any given registry is located on the ncdr.com website under Resources and FAQ, and it may also be used by participants to submit a question to NCDR. It's important to note that not all incoming questions directed to NCDR via the FAQ tool are posted to the FAQ database; in each registry, the FAQ link houses a database of frequently asked questions and answers that is curated to keep it manageable. Frequently asked questions can be searched by data element sequence number, data element name, or FAQ ID, or by a keyword text search.

The Full Specifications Data Dictionary is an essential resource for abstractors. In addition to providing data definitions and coding instructions for each unique element in the data set, the technical specifications outline the parameters that the data must conform to, such as a valid range, unit of measure, or use of decimals for numeric values.

The Data Extract functionality is provided to ACC Online Data Collection Tool users as a means of reviewing and self-reporting the raw data entered; raw data has not necessarily been processed through the data quality process. The Data Extract functionality allows the ACC Online Tool user to run reports in Excel spreadsheet format and provides customized reporting.

Data Migration is defined as the process of moving data from the ACC Online Data Collection Tool to a third-party vendor tool. This process should only occur after an NCDR participant has completed the contract process with a third-party vendor. The purpose of data migration is to ensure that all data entered into the ACC Tool is consolidated into your new third-party vendor tool so that all data is centralized for submission. Using the ACC Online Data Collection Tool simultaneously with a third-party vendor is strictly prohibited, and once data migration is complete, access to the ACC Online Data Collection Tool will be turned off. Reverse data migration from a third-party vendor back to the ACC Online Data Collection Tool is not possible; if an NCDR participant wishes to move back to the ACC Online Data Collection Tool, it is the participant's responsibility to maintain data files for the complete time period the third-party vendor was used.

The Call for Data Deadline is the final date and time by which data can be submitted to NCDR in order to be included in the Institutional Outcomes Reports and benchmark comparisons for a defined quarter. The deadline always occurs at 11:59:59 Eastern Time. The Call for Data Schedule is located at ncdr.com under Data and Call for Data Schedule. As a note, data submitted after the posted deadline will be eligible for display in the next published benchmark report.

The Quality Check is functionality within the ACC Data Collection Tool that provides an initial assessment of the logic and accuracy of your data entry. The Quality Check process identifies errors and outliers in the data before it is sent to the Data Quality Report (DQR) process and then harvested by NCDR. Every patient data collection form entered through the ACC Data Collection Tool must pass through the Quality Check before the DQR process will allow data submission.
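To make the distinction between a validation error and an outlier warning concrete, here is a minimal, hypothetical sketch of the kind of field-level check described above. The field name, ranges, and rule structure are illustrative assumptions only; the actual Quality Check logic and Full Specifications Data Dictionary content are not reproduced here.

# Hypothetical sketch only -- not actual NCDR Quality Check code.
# The field name, valid range, and usual range below are invented for
# illustration; real parameters come from each registry's Full
# Specifications Data Dictionary.

def quality_check(record, spec):
    """Return error and outlier findings for one patient record."""
    findings = []
    for field, rules in spec.items():
        value = record.get(field)
        if value is None:
            findings.append(f"{field}: value is missing")
            continue
        lo, hi = rules["valid_range"]
        if not lo <= value <= hi:
            findings.append(f"{field}: {value} is outside the valid range {lo}-{hi}")
            continue
        usual_lo, usual_hi = rules["usual_range"]
        if not usual_lo <= value <= usual_hi:
            findings.append(f"{field}: {value} is an outlier (valid, but outside the usual range)")
    return findings

# Illustrative specification: a weight field recorded in kilograms.
spec = {"weight_kg": {"valid_range": (20, 350), "usual_range": (40, 200)}}
print(quality_check({"weight_kg": 15}, spec))    # validation error
print(quality_check({"weight_kg": 250}, spec))   # outlier warning only
print(quality_check({"weight_kg": 80}, spec))    # no findings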
The Data Quality Report, or DQR, is the process for submitting data files to the NCDR. The data file from your institution is not complete until it has been submitted to and passed the DQR. All data submissions are evaluated for errors and completeness. This information is then summarized in an automated report, and a submission status of green, yellow, or red is returned to participants after each data submission. A green status indicates that the quarterly data submission process is complete. A yellow status indicates that the submission has passed the data assessment but failed the completeness assessment. A red status indicates that the data has validation errors. When a yellow or red status is returned, the participant is encouraged to review and correct any outstanding issues identified by the DQR and then resubmit the data to the DQR process. Passing the DQR with a green submission status ensures well-formed data and statistically significant submissions.

DQR base submission results include data captured prior to or during the patient's episode of care. Individual registry review is necessary to determine the specific coding instructions defining admission-to-discharge periods as well as registry-specific inclusion and exclusion criteria. The DQR process sorts patients into defined quarters based on hospital discharge dates. For a complete review of data deadline schedules, please review the Call for Data Schedule located at ncdr.com under Data and Call for Data Schedule. It's important to note that the base submission DQR must be completed before any follow-up submission. Follow-up DQR submission results include data captured after the patient's hospital discharge. Follow-up dates and ranges vary from registry to registry. For a complete review of follow-up schedules and deadline periods, please review the Call for Data Schedule located at ncdr.com under Data and Call for Data Schedule. Again, please note that the base submission DQR process must be completed before, and separately from, any and all follow-up data submissions.

An outlier is a warning designed to inform the abstractor of a value that falls outside the usual range but is still within the valid range for a specific element. Outlier warnings help the abstractor identify keystroke errors and decimal misplacements. Outlier warnings are generated both by the ACC Data Collection Tool through the Quality Check, for users who submit data through the ACC Online Data Collection Tool, and by the DQR process for all users.

A benchmark is a point of reference against which hospital metrics are compared, such as like-volume group aggregates and total registry aggregates built from passing submissions. Benchmark comparisons are available in the NCDR dashboards as well as in the registry-specific Institutional Outcomes Reports.

Percentage thresholds are assigned to every data element within a registry. When data is submitted to the Data Quality Report process, the harvest mechanism reviews every data element electronically to ensure the assigned element threshold has been met. If, for example, a threshold has been set to 90% on sequence number 1111 and the element was completed in 10 out of 10 patient records, the system will assign a 100% passing status for that element. If the same 90% threshold applies and the element was answered in only 8 out of 10 patient records, the system will assign a failing status for that element because the 90% threshold was not met.
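As a rough illustration of the threshold logic just described, here is a minimal sketch. Only the 90% threshold and sequence number 1111 come from the lesson's worked example; the record format and the function itself are assumptions for illustration, not the DQR's actual harvest code.

# Hypothetical illustration of the element-threshold check described above.

def element_status(records, sequence, threshold_pct):
    """Percent of records in which the element is completed, plus pass/fail."""
    completed = sum(1 for record in records if record.get(sequence) is not None)
    pct = 100.0 * completed / len(records)
    return pct, "pass" if pct >= threshold_pct else "fail"

ten_of_ten = [{"1111": 1}] * 10                  # element coded in 10 of 10 records
eight_of_ten = [{"1111": 1}] * 8 + [{}, {}]      # element coded in 8 of 10 records

print(element_status(ten_of_ten, "1111", 90))    # (100.0, 'pass')
print(element_status(eight_of_ten, "1111", 90))  # (80.0, 'fail')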
In order to be included in benchmark comparison reporting, all thresholds must be met. The good news is that the DQR report will identify all failing elements for your review.

Benchmark reports provide a quarterly or on-demand detailed analysis of hospital performance in relation to a specific facility, like-volume comparison groups, or the entire NCDR registry aggregate population. The NCDR dashboard is a centralized location providing users an analytical, streamlined tool to access your hospital's metric performance online. Features include an at-a-glance view of your hospital's executive summary metric performance, with the ability to drill down to the patient-level detail report. The dashboard also provides the ability to trend your facility's metric performance in relation to the registry-wide 50th-percentile performance, the ability to perform comparative and trend analysis against a subset of registry hospitals in the comparative tool, and additional reports such as outcomes and public reporting. Benchmarking reports give insight into care variations and quality improvement opportunities and provide the opportunity to compare hospital practice patterns.

Benchmark data is organized into a rolling four-quarter time frame, which helps define the dated content of outcomes reports and dashboard reporting. Each quarter of data is sorted by discharge date. R4Q refers to the most recent data quarter and the three quarters prior. As new data is sorted by quarter and added to the reporting cycle, the oldest data quarter drops off of reporting. This continuous cycle creates the rolling four quarters.

Registry metrics allow you to review the results of your hard work. Registry metrics and benchmarks are calculated from aggregations of green-status submissions and populate our registry dashboards with close-to-real-time data, as well as providing on-demand reporting options for certain registries. Metric benchmark reporting options are available for some registries based on a rolling four-quarter process. Again, the dashboard is a centralized location providing users an analytical, streamlined tool to access your hospital's metric performance online, including an at-a-glance view of your hospital's executive summary metric performance with the ability to drill down to the patient-level detail report. For more information about our NCDR dashboard products, please be sure to navigate to our QII Learning Center and look for our dashboard lessons.

This concludes our lesson, and thank you for your participation.
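As a supplementary note to the transcript, the rolling four-quarter (R4Q) window it describes can be sketched as follows; the quarter labels are invented for the example, and this is not NCDR reporting code.

# Hypothetical illustration of the rolling four-quarter (R4Q) window.

def r4q_window(quarters_with_data):
    """Return the most recent data quarter plus the three quarters prior."""
    return quarters_with_data[-4:]

quarters = ["2023Q1", "2023Q2", "2023Q3", "2023Q4"]
print(r4q_window(quarters))      # all four quarters are reported

quarters.append("2024Q1")        # a new data quarter enters the cycle...
print(r4q_window(quarters))      # ...and the oldest quarter (2023Q1) drops off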
Video Summary
This video provides onboarding education for the NCDR international community. The lesson covers several topics, including the Frequently Asked Questions (FAQ) resource for participants, data extract functionality, data migration to third-party vendor tools, data submission deadlines, quality checks and reports, outliers and benchmarks, and the NCDR dashboard. The video emphasizes the importance of meeting thresholds for data elements in order to be included in benchmark comparison reporting. The NCDR dashboard is highlighted as a centralized location to access hospital metric performance. The video is narrated by David Bonner, and the content was developed by John Gerout and David Bonner.
Keywords
NCDR international community
onboarding education
data extraction functionality
NCDR dashboard
hospital metric performance