Christopher Surdak, a former senior manager for storage technologies at Accenture, is an expert in collaboration and content management, information security, regulatory compliance, and cloud computing, with more than 20 years of experience. This excerpt is from his new book, Data Crush: How the Information Tidal Wave Is Driving New Business Opportunities.
Data Efficiency: The New Watch Phrase for IT
The argument for placing old data on tape rather than disk rests on the fact that tape is less expensive, perhaps by a factor of three or four. However, this calculation completely discounts the potential value of the data if it were available for analysis. Given that data analysis is quickly becoming one of the key differentiators between successful and unsuccessful companies, I believe that the use of tape for the retention of data in offline archives will come to be recognized as financially unacceptable, if not irresponsible.
In addition, the majority of companies do not emphasize efficiency in the information storage systems that they build and manage. As is typical in the computer industry, there tends to be a great deal of focus on the performance of information storage resources, that is, how fast they move information into or out of storage so that it may be used. This focus typically overrides concern about how efficiently the information is stored when it is not in use, which is the vast majority of the time.
A range of tools and techniques is available to improve data storage efficiency. Examples include:
- Deduplication: The automated removal of multiple copies of the same file
- Data compression: The removal of redundant information within a file in order to reduce its size
- Thin provisioning of storage resources: Allocating only the capacity that is actually being used, rather than reserving more space than is needed at the time
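Of these, deduplication is the easiest to illustrate. As a minimal sketch, and not any particular product's implementation, files can be grouped by a hash of their contents; any group with more than one member is a candidate for keeping a single copy and replacing the rest with references:

```python
import hashlib
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by content hash; any group with more
    than one entry is a candidate for deduplication."""
    by_hash: dict[str, list[Path]] = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash.setdefault(digest, []).append(path)
    # Keep only the hashes that appear more than once
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Production systems typically deduplicate at the block level rather than the file level, which catches redundancy even between files that are only partially identical, but the principle is the same.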
I will forgo a deep technical discussion of each of these tools, as they are well covered by other authors. But given the rates of information growth that we are all certain to experience in the near future, it is absolutely imperative that companies use every tool at their disposal to reduce the volume of data they store and to maximize the efficiency of data storage, rather than just the accessibility of data.
Quantification: Every Aspect of Business Data Enabled and Data Governed
What should be clear from this discussion about data availability is that it is imperative that companies capture and make available to the business all of the data that may help to characterize its performance. This requires two key steps. First, meaningful metrics for every business process in the company must be identified and captured. In particular, those business outcomes and processes that most closely align to your business strategy must be quantified through operational data, so that those outcomes can be monitored and optimized. Once you have ensured that these key business metrics are being captured, you must then guarantee that these data are readily available for analysis.
I use the term quantification to describe the trend toward data enablement of all aspects of a business’s operations. Quantification is the application of thingification, contextification, and Big Data analytics to deeply quantify the ins and outs of how a business process is functioning. The purpose of quantification is to use these new tools and processes to automate not only the operation of your business processes but also the ongoing evolution of those same processes.
Certainly, companies have been collecting data on their operations for several decades. However, the merging of transactional data with unstructured, collaborative data is allowing much deeper insight into how companies really operate and how customers use a company’s products to create their own value. Further, the statistical techniques applied through Big Data lead to still deeper understanding of both your business and your customers’ behavior. Through quantification, a company can quantify this value delivery, and thereby reduce the cost of delivering that value and drive profitability for its operations.
If you set the goal of reducing the cycle time of all of your business processes by half every eighteen months, then you will need all of this process data to confirm that you are meeting this goal. Quantification is a process that many companies have already gone through, as it is a key aspect of deploying enterprise systems, such as Enterprise Resource Planning (ERP).
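The halving goal above translates into a concrete target at any point in time: after m months, the target cycle time is the starting value multiplied by 0.5 raised to the power m/18. A minimal sketch (the function names are illustrative, not from any particular system):

```python
def target_cycle_time(initial_days: float, months_elapsed: float) -> float:
    """Target cycle time under a goal of halving every 18 months."""
    return initial_days * 0.5 ** (months_elapsed / 18)

def on_track(measured_days: float, initial_days: float, months_elapsed: float) -> bool:
    """True if the measured cycle time meets or beats the halving target."""
    return measured_days <= target_cycle_time(initial_days, months_elapsed)
```

For example, a process that starts at a ten-day cycle time should be at five days after eighteen months and at two and a half days after three years; without captured process data, there is no way to know whether you are on that curve.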
However, in nearly all companies there remain dozens if not hundreds of additional business processes that are undefined and unmonitored from a data perspective. As long as these processes are not quantified, they will likely form bottlenecks in efforts to accelerate business. Quantifying these additional processes will facilitate business acceleration and will allow for fact-based decision making, which furthers a company’s ability to respond to the ever-changing business climate.
Excerpted from Data Crush: How the Information Tidal Wave Is Driving New Business Opportunities
© 2014 Christopher Surdak
All rights reserved.
Published by AMACOM Books
Division of American Management Association
1601 Broadway, New York, NY 10019
Simplicity 2.0 is where we examine the intricate and transitory world of technology—through a Laserfiche lens. By keeping an eye on larger trends, we aim to make software that’s relevant to modern day workers, rather than build technology for technology’s sake.