
By Daryl Crockett
More and more, I find myself yelling at the TV. How can our government officials and world-class medical facilities be making such big mistakes? How can there be such a breakdown in information? As a Quality professional, I can see there are both discrete and systemic quality problems at the root of all of this. But as I thought more deeply, I was surprised by how relevant it all is to my life as a Data Quality professional.
When Thomas Eric Duncan entered Texas Health Presbyterian Hospital in Dallas for the first time, exhibiting flu-like symptoms, he was misdiagnosed due to an internal breakdown in communication among members of the hospital’s triage team, who failed to make the West African connection. We have since learned that hospital staff had a quality triage process in place at the time, but for some reason the human beings just didn’t follow their process as designed. We can all accept that human beings are not perfect and do make mistakes.
But the second time Duncan entered that same hospital, it was frighteningly obvious that Ebola was now a new resident of the Lone Star State. Presbyterian Hospital staff pulled out their standard infectious disease protocols and did their best to treat Mr. Duncan and to protect themselves and others from the deadly virus. Those protocols contained data that had been developed with due care and with consideration of the facts and threats known at the time. The data in those protocols met the requirements of the hospital’s quality system when it was created and approved.
As events unfolded, two Dallas healthcare workers were infected with the Ebola virus, and an entire Caribbean cruise ship carrying a potentially exposed Dallas healthcare worker was turned away from vacation ports of call. It became painfully obvious that while the hospital’s infectious disease protocol data had met its data quality requirements at one time, it was NOT quality information! As ECCMA Executive Director Peter R. Benson instructs, “Data is useful, but information is timely, relevant, and accurate.”
Presbyterian Hospital, the CDC, and even the World Health Organization were not working with the best information available: the information being developed in the field hospitals of West Africa run by Doctors Without Borders.
The Ebola crisis has given me a much better way to articulate this ‘Information vs. Data’ challenge, which is a root cause of many of the data quality problems I see within client organizations. In today’s world, companies gather and process great volumes of data, and bigger, faster data is driving more and more critical decisions. Business teams typically create Data Quality rules, and their IT teams program and execute ETL and data quality scripts to help the organization comply with those rules. Leadership assumes that because their data meets their internal data quality standards, they are working with “good data.” Perhaps the better question an organization should ask is, “Are we working with the right information?”
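To make that distinction concrete, here is a minimal sketch of the kind of internal rule check an IT team might script. It is purely illustrative: the vendor records, field names, and the 365-day staleness threshold are hypothetical choices of mine, not any real standard. The point is that every record can satisfy the internal rules while the rules never ask whether the information is still current.

```python
from datetime import date

# Hypothetical vendor records pulled from an internal system.
vendor_records = [
    {"vendor_id": "V-1001", "name": "Acme Supply", "country": "US",
     "last_verified": date(2013, 2, 14)},
    {"vendor_id": "V-1002", "name": "Globex Ltd", "country": "GB",
     "last_verified": date(2014, 9, 30)},
]

def passes_internal_rules(record):
    """Internal rules: required fields are populated and the country code is two letters."""
    required = ("vendor_id", "name", "country")
    return all(record.get(field) for field in required) and len(record["country"]) == 2

# Every record satisfies the internal Data Quality rules...
print(all(passes_internal_rules(r) for r in vendor_records))  # True

# ...yet one of them is stale information: the rules never check how old it is.
as_of = date(2014, 10, 20)  # an assumed "as of" date for the illustration
stale = [r["vendor_id"] for r in vendor_records
         if (as_of - r["last_verified"]).days > 365]
print(stale)  # ['V-1001']
```

Passing the rules tells you the data is well-formed; it does not tell you the data is right.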
In most cases, independent third-party validation and comparison against external information exchanges can provide a higher-quality source of information for corporate dictionaries, customer and vendor data, material specifications, and the like: information that is more timely, relevant, and accurate than what the business is using today.
Data Quality is a worthy business practice, but by itself it is insufficient. Adding periodic Data Validation performed by parties independent of the core IT team can deliver the best possible information quality and can help lower an organization’s exposure to data risk.
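As a rough illustration of what that periodic, independent validation might look like, here is another small sketch. The external_registry dictionary below merely stands in for a third-party information exchange; it is not a real service or API, and the record values are invented.

```python
# Internal copies of vendor addresses.
internal_addresses = {
    "V-1001": "12 Main St, Dallas, TX",
    "V-1002": "48 King's Road, London",
}

# Stand-in for an external, independently maintained reference source.
external_registry = {
    "V-1001": "300 Commerce Blvd, Dallas, TX",  # the vendor moved; the internal copy is stale
    "V-1002": "48 King's Road, London",
}

def validation_report(internal, external):
    """Compare internal records against the external reference and flag mismatches."""
    report = {}
    for key, value in internal.items():
        reference = external.get(key)
        if reference is None:
            report[key] = "missing from external reference"
        elif reference != value:
            report[key] = f"differs from reference: {reference!r}"
    return report

print(validation_report(internal_addresses, external_registry))
# {'V-1001': "differs from reference: '300 Commerce Blvd, Dallas, TX'"}
```

However such a check is implemented, the principle is the same: the comparison is only as valuable as the reference behind it, which is why the validation belongs with a party independent of the team that produced the data.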