4 Reasons Why Data Quality Trumps Data Quantity

By Yaniv Mor

Data is a central part of smart business decisions today, so it can be tempting to collect anything and everything. But gathering huge amounts of information isn’t always the right strategy when mining for insights that truly matter.

The key to actionable business intelligence — the kind of data insights that provide real value in making decisions about how to run your company — is having the right kind of data, not having massive volumes of data.

In this era when tracking and collecting data is more common than ever before, here are four reasons why it’s arguably better to focus on data quality over data quantity. 

1. Too Much Data Takes Too Much Time

One-third of data pros spend up to 90 percent of their time cleaning raw data for analytics. This is a huge problem for data specialists, who are hired for their technical skills, not to serve as so-called data janitors.

Keeping massive amounts of surplus data bogs down data workers and significantly widens the “time-to-insights” window, which has a direct negative impact on business performance.

Rather than investing precious time in cleaning up data, businesses need to rein in data collection, taking the time to analyze which components are actually needed to form insights, and then adjusting systems accordingly. Data intelligence is predicated on efficiency and agility.

Wasting time cleaning up a sloppy data collection process only hinders the data insights team, hurting their ability to do their jobs properly.

Granted, data preparation on any level requires an appropriate amount of time and energy. But by simply hoarding mountains of unneeded data and then having to wade through it all, the time spent on data preparation multiplies significantly.

2. Too Much Data Is Costly

IT infrastructure and operations costs are already a huge chunk of enterprise spending, representing 60 percent to 70 percent of a typical enterprise IT budget. Collecting an endless stream of meaningless data will only cause this price tag to rise. Moreover, in 2014 it was estimated that companies spend “$50 billion a year on too much data.”

The reason for the high price of too much data is simple: It costs money to store data, maintain it, migrate it and more.

The greater the volume, the greater the cost. While it’s true that more and more cost-effective cloud storage solutions are cropping up, that’s hardly a reason to continue hoarding data. Spending less money to store junk doesn’t mean that money isn’t still being wasted, or that it couldn’t be better spent elsewhere.

3. Too Much Data Creates Data Pollution

I recently heard the perfect characterization of a data lake gone bad with too much unnecessary data: the “data swamp.” Bulk collection of data might sound smart, but businesses often fail to account for how it affects the overall quality of the larger data pool and its ability to yield valuable insights.

One of the great things about mining data properly is that once you are done, you are presented with clean, reliable insights. When you blindly collect data, pulling in anything and everything, the quality of your data pool deteriorates, leaving you with suspect data on which to base decisions.

Place of origin, time of origin, time of interaction — these are all pivotal components of data collection and data-driven decision-making. But when you are faced with a seemingly infinite pool of these data points, the “need to knows” become overwhelming and confusing, contributing directly to a murky storage environment when too much data is in the water.

4. Too Much Data Makes Integration A Nightmare

Doing business today inherently requires coping with large amounts of data, both internal and external. Add to the mix the need to do so under deadline pressure, and most data teams are already grappling with how to deploy scalable, impactful data integration solutions.

When integration involves massive amounts of data — much of it extraneous and unnecessary — the process becomes slower and more unwieldy, and creates more problems that need solving along the way. All of this dilutes the effectiveness of a data integration project.

Poorly integrated data is proving increasingly costly as well. For example, shoddy integration of healthcare data results in $342 billion in lost benefits each year as providers try to deal with disparate data sets and sources. Add in useless information that is “gathered just for the sake of gathering,” and business costs can easily spiral out of control.

Don't Hoard

Whatever benefits businesses may think they are getting from hoarding data away indefinitely, the negatives are equally daunting, if not more so.

Quality data, not just any data, is pivotal to business success. And streamlining your data approach now can save you a lot of time, money and effort in the long run.

About the Author

Yaniv Mor

Yaniv Mor is founder and CEO of Xplenty, a data integration cloud service. He is an entrepreneur, data and business intelligence (BI) professional, a husband and a father of three.