Healthcare Data Quality Dictates Workflow

It would seem to be an obvious statement: data quality impacts workflow. Yet healthcare data is rife with quality issues, and a distinct lack of data governance has haunted the industry for decades. With the technology available today, the quality of healthcare data should be in a much better state. Other industries, such as retail, construction, and marketing, are already addressing these issues to the benefit of their businesses.

How many aspects of an organization’s workflow are affected by data quality is often underestimated. Healthcare facilities and vendors fail to address these basic challenges and instead create workarounds that don’t solve the real problem. When study descriptions are inaccurate or incomplete, radiologist worklists suffer. Hanging protocols fail repeatedly if study and series descriptions are inconsistent. Billing and coding workflows are hugely impacted by inaccurate or incomplete data, and organizations spend an inordinate amount of time correcting it in order to be appropriately reimbursed. Data quality also drives workflow orchestration, determining where studies get routed. The post-processing of 3D studies, studies destined for AI algorithms, revenue cycle management, and data analysis are all affected by data quality.

Healthcare Data Quality Today

Many vendors try to address workflow by working around data quality. Modality vendors send data to the PACS and leave responsibility for study and series descriptions to the PACS administrators. Each modality upgrade changes sequence descriptions, introduces new variables, and breaks any data standards that may have existed. PACS vendors develop “smart” hanging protocols that still require the radiologist or PACS administrator to configure them; because descriptions vary so widely, administrators end up creating multiple protocols for the same display, a symptom of poor data quality.

Data Quality Dimensions

There are a number of dimensions to data quality that must be considered in order to impact workflow in a positive way. Many industries depend on superior data quality because poor data would ruin their business; think of how the finance industry would fare if it had a reputation for poor data quality. These same dimensions need to be applied to healthcare to reimagine how healthcare is delivered.

Data accuracy is the most obvious factor that impacts workflow, yet various reports claim that Electronic Medical Records are frequently inaccurate. Surveyed patients have reported finding very serious mistakes in 10% of their records and serious mistakes in 42%. Inaccurate data in radiology impacts data routing workflows and reimbursement rates: if the data is inaccurate, studies get routed to the wrong place or hanging protocols don’t display correctly.
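To make the routing failure concrete, here is a minimal Python sketch of description-driven routing. The rule table, worklist names, and function are hypothetical illustrations, not any vendor’s implementation; the point is that when the only input is a free-text description, a mislabeled study silently lands on the wrong worklist.

```python
# Minimal sketch: routing driven solely by StudyDescription.
# ROUTING_RULES and the worklist names are hypothetical.

ROUTING_RULES = {
    "CT HEAD": "neuro_worklist",
    "CT CHEST": "chest_worklist",
}

def route_study(study_description: str) -> str:
    """Return the worklist a study is sent to, based only on its description."""
    return ROUTING_RULES.get(study_description.strip().upper(), "unassigned_queue")

# A head CT mislabeled at the modality ends up on the chest worklist,
# and nothing downstream ever notices:
print(route_study("CT Chest"))  # -> chest_worklist, even if the images are a head CT
```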

The completeness of data is commonly poor in healthcare, especially in radiology. An analysis performed by Enlitic of 2.3 million studies across eight facilities showed that 2.9 million series were missing DICOM descriptions. The problem is compounded when anonymization is applied, as many PACS vendors arbitrarily delete data without any consideration for whether PHI is present. Incomplete data impacts radiologists’ reading lists and creates more denied reimbursements. A CT study that is acquired with and without contrast but was only ordered as a CT without contrast is incomplete when it reaches the billing department, requiring time to resolve the discrepancy.
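As a rough illustration of how such gaps can be surfaced, the sketch below uses the open-source pydicom library to flag files whose study or series description is missing. The directory path is hypothetical, and a real completeness audit would check far more attributes; this is a sketch, not Enlitic’s method.

```python
# Quick completeness audit, sketched with pydicom (pip install pydicom).
# The path is hypothetical; a real audit would cover many more attributes.
from pathlib import Path
import pydicom

def find_incomplete_series(dicom_dir: str) -> list[Path]:
    """Return files whose StudyDescription or SeriesDescription is absent or empty."""
    incomplete = []
    for path in Path(dicom_dir).rglob("*.dcm"):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # read headers only
        if not ds.get("StudyDescription", "") or not ds.get("SeriesDescription", ""):
            incomplete.append(path)
    return incomplete

flagged = find_incomplete_series("/data/incoming_studies")  # hypothetical path
print(f"{len(flagged)} files are missing descriptions")
```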

The consistency of data requires some degree of data governance so that similar studies can be identified using the same parameters. Consistent naming of studies and series enables workflow orchestration: studies and series are routed appropriately. It also means that series from a CT Head and a CT Head Trauma are displayed as desired, often in the same way, since the acquisition parameters are the same. Poor consistency is a major obstacle to hanging protocols working correctly.
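A simplified sketch of what a consistency layer does: map the many free-text variants a modality emits to one canonical name, so that a single hanging protocol or routing rule can match them all. The synonym table below is illustrative only; a real governance model would use a standard nomenclature and be far larger.

```python
# Sketch: normalize free-text series/study descriptions to a canonical name.
# The CANONICAL table is a hypothetical, tiny stand-in for a governance model.
import re

CANONICAL = {
    "CT HEAD": ["ct head", "ct head trauma", "head ct", "ct brain w/o"],
}

def normalize(description: str) -> str:
    cleaned = re.sub(r"\s+", " ", description).strip().lower()
    for canonical, variants in CANONICAL.items():
        if cleaned in variants:
            return canonical
    return "UNMAPPED"  # flag for governance review rather than guessing

for raw in ("CT Head Trauma", "head  CT", "CT BRAIN W/O"):
    print(raw, "->", normalize(raw))  # all three map to CT HEAD
```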

In some industries, uniqueness is a desired concept; finance needs unique identifiers so that money is directed to the right account. Uniqueness has its place in healthcare from two perspectives: we want data to be unique when it comes to a single patient, but we also want to know how that patient’s data compares to other patients’. The ability to search for a 42-year-old female of Hispanic descent with a positive lung nodule requires that all patients in this cohort be described in the same way. Uniqueness, or the lack of it, requires some degree of data governance so that similar studies can be identified using the same search parameters.
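The toy example below shows why this matters: an exact-match cohort filter (the records and field names here are hypothetical) silently misses a patient whose data was coded inconsistently, even though she belongs in the cohort.

```python
# Hypothetical records: the same clinical facts, coded two different ways.
patients = [
    {"age": 42, "sex": "F", "ethnicity": "Hispanic", "finding": "lung nodule"},
    {"age": 42, "sex": "female", "ethnicity": "hispanic", "finding": "pulm. nodule"},
]

cohort = [
    p for p in patients
    if p["age"] == 42 and p["sex"] == "F"
    and p["ethnicity"] == "Hispanic" and p["finding"] == "lung nodule"
]

print(len(cohort))  # -> 1: the inconsistently coded second record is missed
```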

Artificial Intelligence Adoption Model

The artificial intelligence adoption model indicates that data is a major factor in the successful deployment of an AI algorithm, and healthcare data quality is key to this aspect of the data. Many organizations stall or fail when deploying their AI solutions, encountering data quality issues 96% of the time. Deploying an AI-powered data governance solution like Curie|ENDEX™ can correct inaccurate data, enrich data to ensure it is complete, create a data governance model that uses a standard nomenclature to build consistency, and identify the uniqueness and similarities of what was actually performed.

Data quality should not be an issue in an industry where life-and-death decisions are made daily. Enlitic offers solutions that can solve these challenges. See how we can improve data quality with our 2-minute ENDEX demo.
