Does everyone need big data? No. Very few can afford big data. It’s expensive, messy and hard to explore.
The promise of big data and artificial intelligence is enticing, to be sure. But many gloss over the most important task: compiling and prepping the data so the gremlins can explore and discover.
The task of matching, coding, translating, and normalizing data takes computing power, time, and sophistication. The truth is, many organizations spend more time prepping the data than actually analyzing and using it.
There is an alternative: smart data.
Slices of smart data support many decision-making processes and allow for greater efficiency without the mental burden of having to weed through hundreds of extraneous fields. And smart data offers tremendous value to many in healthcare when used strategically, without the headaches of a massive integration effort.
Encounter data alone provides valuable information on a patient’s story. From encounters, the reader knows a patient’s care team, the frequency and timing of interactions, and some measure of clinical load. This small slice of smart data gives a decision maker critical information and supports care coordination and care transitions.
And think about quality reporting, which generally requires more data depth, including specific labs, medications, and vital signs. Even so, this data strategy remains fairly targeted to specific disease states and values. Not every piece of data needs to be cleansed, normalized, mapped, and standardized. After integrating millions of records in which the average number of labs per person was over 50, the quality metrics concentrated on 3 specific labs for certain patients, effectively reducing the data actually needed to less than 6% of what was provided.
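To make the idea concrete, here is a minimal sketch of that filtering step. The lab codes and record shape are hypothetical, not from the source; the point is simply that a quality metric targeting a few lab types lets the rest of the data bypass cleansing and normalization entirely.

```python
# Hypothetical lab codes a quality metric might target
# (e.g., HbA1c, LDL cholesterol, eGFR) -- illustrative only.
METRIC_LABS = {"HBA1C", "LDL", "EGFR"}

def filter_for_metrics(lab_records):
    """Keep only the lab results the targeted quality metrics require."""
    return [r for r in lab_records if r["code"] in METRIC_LABS]

# Toy example: one patient with a mix of relevant and extraneous labs.
records = [
    {"patient": "p1", "code": "HBA1C", "value": 7.1},
    {"patient": "p1", "code": "CBC",   "value": None},
    {"patient": "p1", "code": "LDL",   "value": 130},
    {"patient": "p1", "code": "TSH",   "value": 2.0},
]

needed = filter_for_metrics(records)
# Only the metric-relevant labs survive; the other records never need
# to be cleansed, mapped, or standardized at all.
```

With an average of 50+ labs per patient and only 3 codes of interest, a filter like this is why the downstream workload shrinks to a small fraction of the integrated data.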
So while we may instinctively think that more data means more insights, unless you have the infrastructure, funding streams, and patience, I recommend a more strategic, purpose-driven approach so you can realize gains quickly and focus on the information that truly adds value. That’s the value of smart data over big data.