With 2.5 quintillion bytes of data created every day, and the fact that 90 percent of the world’s data has been created within the past two years, it’s hard to imagine what efficient data management will look like in the next few years, or how anyone can expect to keep up with the deluge.
In a 2013 Forbes blog post, Tim Worstall called Excel “the most dangerous software on the planet.” An inflammatory, probably hyperbolic accusation? Maybe, but Worstall lists several reasons why relying on spreadsheets alone, such as those created in Excel, can indeed be dangerous.
Other respected sources have likewise warned about organizations’ reliance on spreadsheets alone to manage critical data: “Financial institutions need to take full control of their vast number of spreadsheets and databases if they want to be fit for the new era of post-crisis risk management,” warns leading data management specialist ClusterSeven. (A focus on improved risk management has emerged in the new IATF 16949 standard as well.)
Netflix executives are apparently taking this kind of control, relying on data from millions of customers to predict who wants to watch specific media offerings. As an example of Big Data at work, this use reflects how decisions are now made amid an explosion of digital information that is both readily available and insistent in its call.
How, then, can an organization not only manage, but tame and use available data in order to render it useful rather than simply overwhelming?
Clearly, data management systems must go well beyond the use of Excel. Savvy organizations understand that taming data demands careful analysis. Data flows into spreadsheets and database tables at an astonishing rate, stacking up in personal computers, laptops, tablets, smartphones, database servers, and the cloud. In these growing mountains of data there is insight to be found – insight that can lead to gains in quality and, ultimately, in profitability. The rate and volume of data that passes through our lives every day demand systematic strategies for taming it – strategies that are critical to making any improvement in systems. Without systematic approaches to data use, managers may arbitrarily impose improvement projects that ultimately fail, largely because the data related to those projects are never collected and analyzed.
Organizations that have been in the business of improving processes and products have developed best practices for collecting the right data at each step of the improvement process, rather than letting the data simply pile up throughout the organization.
Before undertaking improvement efforts, it is important to establish the aim of data collection, and to understand what one hopes to achieve by examining the data related to a system.
Understanding of/Commitment to Product Quality and Customer Satisfaction
Best business practices begin with an understanding of, and commitment to, product quality and customer satisfaction. Good business management practices are grounded in continuous improvement and data-driven decision making. While this truism may not immediately seem to relate to the idea of data management, the two are nonetheless intimately connected. Let’s look at some best practices for taming and managing data so that it provides information, rather than simply piling up around you.
Continuous Improvement and Data-Driven Decision-making: Understanding the Current System
Before looking at data, the first step is to understand the current system. Defining processes operationally, viewing the system as a whole (for example, a sales system), and gathering data on how the system currently works will give a baseline perspective, allowing you to assess whether any changes have resulted in improvement.
Collecting Data to Define the Current System
Data plays a key role in assessing that system. Control charts, cause-and-effect diagrams, and other tools will expand your understanding of it. You may want to increase sales, and you are aware of the figures associated with revenues and expenses, but do you know exactly where sales come from, what market focus you may need to consider, and how effective specific sales strategies are? Collecting data that helps you define the current system is essential before embarking on any plan to enhance or improve it.
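The article names control charts as one baselining tool without prescribing how to build one. As a minimal sketch of the idea, the snippet below computes the center line and control limits for an individuals (I-MR) chart using the standard moving-range constant 2.66; the weekly sales figures are hypothetical, invented purely for illustration.

```python
from statistics import mean

def imr_limits(values):
    """Center line and control limits for an individuals chart,
    using the moving-range method (constant 2.66 for ranges of 2)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    center = mean(values)
    mr_bar = mean(moving_ranges)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical weekly sales counts (illustrative data only).
sales = [52, 48, 55, 50, 47, 53, 49, 51]
lcl, cl, ucl = imr_limits(sales)
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
```

Points falling outside the computed limits signal special-cause variation worth investigating before any improvement effort; points inside them describe the system's routine, common-cause behavior.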
Developing an Analytical Approach to Improving the System
The data that has been collected and the tools that have been used to fully understand a system and assess its needs will be useful in developing an analytical approach to improving that system. Tools that support the development of a potential theory of improvement will ultimately assure that the theory is on the right track and can be accurately tested. Without such tools, any improvement theory is simply conjecture. Again, tools for data analysis – control charts, Pareto diagrams, cause-and-effect diagrams and others – help to give the sand castle form, so to speak.
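Among the tools just listed, the Pareto diagram is the most mechanical to produce: rank the problem categories by frequency and accumulate the percentages to find the "vital few" causes. The sketch below does exactly that; the complaint categories and counts are hypothetical, made up for illustration.

```python
from itertools import accumulate

# Hypothetical customer-complaint tallies (illustrative data only).
complaints = {"late delivery": 38, "wrong item": 21, "damaged goods": 17,
              "billing error": 9, "other": 5}

# Rank categories from most to least frequent, then accumulate counts.
ranked = sorted(complaints.items(), key=lambda kv: kv[1], reverse=True)
total = sum(complaints.values())
cumulative = list(accumulate(count for _, count in ranked))

for (category, count), cum in zip(ranked, cumulative):
    print(f"{category:15s} {count:3d}  {100 * cum / total:5.1f}%")
```

Reading down the cumulative column shows where roughly 80 percent of the complaints concentrate, which is where an improvement theory should focus first.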
Using Data to Test Theories About Improving the System
Testing that improvement theory involves continued reliance on data – and not just any data, but data that contributes to an understanding of how the theory is working and whether it will actually generate the results that are sought. Examining the data will determine whether the theory should be abandoned or standardized as practice in the organization. Without that data, the process resorts to whim or ego.
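The article does not specify a statistical test for this abandon-or-standardize decision, but one assumption-light way to check whether an apparent improvement is real is a permutation test: if the change had no effect, shuffling the before/after labels should often reproduce a difference as large as the observed one. The defect counts below are hypothetical, invented for illustration.

```python
import random
from statistics import mean

# Hypothetical defect counts per batch, before and after a process
# change (illustrative data only).
before = [12, 15, 11, 14, 13, 16, 12, 15]
after = [9, 11, 8, 10, 12, 9, 10, 11]

observed = mean(before) - mean(after)  # apparent reduction in defects

# Permutation test: under "no effect," relabeling batches at random
# should yield reductions this large quite often.
random.seed(42)
pooled = before + after
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[:len(before)]) - mean(pooled[len(before):])
    if diff >= observed:
        count += 1
p_value = count / trials

print(f"observed reduction: {observed:.2f} defects/batch, p = {p_value:.4f}")
```

A small p-value means the improvement is unlikely to be a label-shuffling accident, supporting standardizing the change; a large one argues for abandoning or revising the theory rather than trusting whim or ego.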
Ongoing vs. One-Off Approach to Improvement
Full implementation of improvements is not a binary, it-works-or-it-doesn’t proposition. Continued reliance on data collection and analysis will not only assure that the improvements are sound, but will also reveal additional ongoing improvement that may be required.
Putting data to work so that it contributes to an outcome, rather than collecting it with no attention to its usefulness, assures results, eliminates waste, and conserves the organization’s resources. Doing this demands vigilance about the kind of data collected and the tools used to analyze it, but it gives structure to the data collection itself and renders it eminently ordered.
Sort of like planning a sand castle before filling the first bucket.