
How to turn too much data into just enough information

Dec. 13, 2013 - 03:45AM
By Jeffrey Sorenson
Jeff Sorenson is president and partner, A.T. Kearney Public Sector & Defense Services, and former Army CIO/G6.

The last post discussed best business practices that leading commercial companies implement to design and build a smart data haystack. Only when this haystack is in place is an organization ready to mine its data into actionable, predictive insights, which is the focus of this blog.

The commercial sector leads the way in mining big data. While not all commercial companies understand how best to mine their data, many are learning that better analytics can improve the bottom line through better sales and/or reduced logistics costs. Companies with world-class analytic capabilities are, for example, extracting insights about consumers on a near real-time basis, streamlining their logistics to more efficiently serve their customers, and optimizing their whole value chain to work more effectively with their channel partners.

Best-practice data analytics examples from the corporate sector offer parallels the DoD can examine to improve its own intelligence data processing and to deploy its constrained intelligence resources more efficiently and effectively. The DoD can also benefit from investigating private-sector logistics improvements to drive down the cost of its supply chain operations.

The examples below demonstrate how the private sector improves its analytics by extracting hidden information deeply buried within its data. Like oil companies that use new fracking techniques to extract oil from old wells, companies are improving their analytic methodologies to extract better insights from their data haystack.

Mining Unstructured Data

A global communications company noticed that its customers were posting vast amounts of data about their experiences with its products and services on Twitter. Drawing valuable insights from that data required new data science techniques such as natural language processing, cognitive science and linguistics. With a new social media analytics framework and toolkit, the company gathered more than 60 million tweets and structured them into a more usable format that identified key features and patterns. It then analyzed these patterns to quantify consumer sentiment. Insights from this analysis enabled the company to identify and prioritize several product improvement opportunities and to develop an implementation roadmap of solutions.
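To make the approach concrete, here is a minimal sketch of lexicon-based sentiment scoring over raw tweet text. The word lists and sample tweets are invented for illustration; a production system would rely on far richer natural language processing than this.

```python
# Illustrative sketch: score tweets as positive, negative or neutral
# using a simple word lexicon, then tally overall sentiment.
from collections import Counter

POSITIVE = {"love", "great", "fast", "reliable"}
NEGATIVE = {"slow", "broken", "dropped", "terrible"}

def score_tweet(text: str) -> int:
    """Return +1, -1 or 0 based on counts of lexicon words."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    return (pos > neg) - (neg > pos)

tweets = [
    "Love the new handset, great battery life!",
    "Calls keep getting dropped, terrible coverage.",
    "Upgraded my plan today.",
]

# Tally the structured results to quantify overall sentiment.
sentiment = Counter(score_tweet(t) for t in tweets)
```

The real value comes from running this kind of structuring step over tens of millions of tweets, then mining the resulting patterns rather than the raw text.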


This example offers two takeaways. First, the DoD and most other organizations already gather and store publicly generated unstructured data, so this information is already available; it just needs to be mined and analyzed. Second, doing so requires new types of data techniques to provide the context and insights needed to mine that unstructured data.

Developing Micro-Segments

A financial services organization also relied on new, smarter analytical techniques to get more information from available data. In this case, the firm used advanced predictive analytics to develop micro-segments of its client population so it could better identify distinct needs of specific customer groups and then more directly market to these needs.

What’s important here is that the organization combined several types of information, including demographic, transactional and psychographic data, and then layered them with information about customer needs and product services. By combining these different data sets, the company was able to understand the context behind the purchase of certain financial investment products, in addition to typical generic facts about customer gender and age groups.

As a result, the company developed a more meaningful and actionable profile of its customers based on these micro-segments to improve product offerings and targeted marketing campaigns. The intelligence community would benefit from similar micro-segmentation techniques to sharpen its understanding of the adversary and to develop better predictive analytics for the future.
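As an illustration of the layering idea, the sketch below combines hypothetical demographic and transactional records into one feature vector per customer, then forms micro-segments with a tiny two-cluster k-means routine. All customer IDs, ages and dollar figures are invented, and a real micro-segmentation effort would use many more dimensions and a proper clustering library.

```python
# Illustrative sketch: layer demographic and transactional data sets,
# then cluster customers into micro-segments.

demographics = {101: {"age": 28}, 102: {"age": 61},
                103: {"age": 33}, 104: {"age": 58}}
transactions = {101: {"monthly_invest": 250}, 102: {"monthly_invest": 2200},
                103: {"monthly_invest": 300}, 104: {"monthly_invest": 1900}}

# Combine the data sets into one feature vector per customer.
features = {cid: (demographics[cid]["age"], transactions[cid]["monthly_invest"])
            for cid in demographics}

def kmeans2(points, iters=10):
    """Two-cluster k-means with simple deterministic initialization."""
    centers = [points[0], points[-1]]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return groups

segments = kmeans2(list(features.values()))
```

Even this toy version separates low-investment younger customers from high-investment older ones, which is the kind of distinct-needs grouping the firm then marketed against.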

Managing Inventory

When a large pharmacy retail chain embarked on an effort to improve its inventory management processes, it leveraged massively parallel processing techniques to develop a data model that rapidly identified overstocked inventory in some stores that was needed at other locations. This initiative, which was not constrained by the size of either the data or the product segment, enabled the company to efficiently model thousands of SKUs across 6,000 stores—more than 5TB of inventory data and millions of records each week—and to match demand anywhere within its network with inventory from another store. As a result, it reduced overstocked inventory by several hundred million dollars without negatively impacting sales or service levels.
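The matching logic can be pictured with a small sketch that computes each store's surplus or shortfall per SKU and proposes transfers from overstocked to understocked locations. Store names, SKUs and quantities are invented; a real system would also weigh transport cost, lead time and service-level targets.

```python
# Illustrative sketch: flag surplus and shortfall per SKU across stores
# and greedily propose transfers between them.

inventory = {("store_a", "sku1"): 120, ("store_b", "sku1"): 10,
             ("store_a", "sku2"): 5,   ("store_b", "sku2"): 90}
demand = {("store_a", "sku1"): 40, ("store_b", "sku1"): 60,
          ("store_a", "sku2"): 50, ("store_b", "sku2"): 30}

def rebalance(inventory, demand):
    """Move units from overstocked stores to understocked ones, per SKU."""
    transfers = []
    for sku in sorted({s for _, s in inventory}):
        surplus = {store: inventory[(store, sku)] - demand[(store, sku)]
                   for (store, s) in inventory if s == sku}
        donors = [[store, qty] for store, qty in sorted(surplus.items()) if qty > 0]
        for dst, short in sorted(surplus.items()):
            need = -short  # positive when the store is understocked
            for donor in donors:
                if need <= 0:
                    break
                moved = min(need, donor[1])
                if moved > 0:
                    transfers.append((sku, donor[0], dst, moved))
                    donor[1] -= moved
                    need -= moved
    return transfers

moves = rebalance(inventory, demand)
```

The pharmacy chain's achievement was running essentially this kind of matching at the scale of thousands of SKUs and 6,000 stores every week, which is where the parallel processing came in.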


Such inventory management has a direct application to the DoD. Parts, major components, end products, and other inventory items procured by each respective service are sometimes located where they are not needed or are needed where they are not located. Like the pharmacy retailer, the DoD could significantly decrease the investment and size of its inventory with better data analytics.

Integrating the Value Chain

Challenged by the complexity of its supply chain, frequently changing internal and external parameters, and a lack of visibility into key choke points along the chain, a large consumer goods company used a customized dynamic resource allocation model to integrate the entire value chain, from supplier to customer, in order to realize planned procurement benefits and improve its ROI.

The analytic model gave the company a holistic view of its value chain costs, enabling it to optimize the allocation of resources (logistics and transportation), qualify different suppliers against the value chain, and build in redundancy where necessary. The tool also helped enable a more fact-based decision making culture through ongoing, cross-functional scenario planning that ensured distribution of this key product even during times of supply shortage, distribution route closings, currency fluctuations, and other disruptions.

The company identified tens of millions of dollars of incremental savings potential, which vastly exceeded expectations. Moreover, it is now among industry leaders in moving toward a value chain view of decision making, a view based on the ability to integrate data and then mine it smartly.
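One simple way to picture such an allocation model is a greedy assignment of plant supply to regional demand, cheapest lane first. The plants, regions and per-unit costs below are invented, and a real value-chain model would use proper optimization and scenario inputs rather than a single greedy pass.

```python
# Illustrative sketch: allocate plant supply to regional demand by
# cheapest transportation lane first, tracking total cost.

supply = {"plant_1": 100, "plant_2": 80}
demand = {"region_a": 70, "region_b": 60, "region_c": 40}
cost = {  # cost per unit shipped on each lane
    ("plant_1", "region_a"): 2, ("plant_1", "region_b"): 4, ("plant_1", "region_c"): 5,
    ("plant_2", "region_a"): 3, ("plant_2", "region_b"): 1, ("plant_2", "region_c"): 2,
}

def allocate(supply, demand, cost):
    """Greedy allocation: fill the cheapest lanes first."""
    supply, demand = dict(supply), dict(demand)
    plan, total = {}, 0
    for (plant, region), c in sorted(cost.items(), key=lambda kv: kv[1]):
        qty = min(supply[plant], demand[region])
        if qty:
            plan[(plant, region)] = qty
            supply[plant] -= qty
            demand[region] -= qty
            total += qty * c
    return plan, total

plan, total_cost = allocate(supply, demand, cost)
```

Rerunning such a model under different scenarios, such as a closed route or a supplier shortage, is what supports the kind of fact-based, cross-functional planning described above.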

Companies have to find the right balance between capturing more data and extracting more insights from existing data. Key to achieving this balance is establishing a linkage between the business drivers behind organizational decisions and the supporting analytics and, in turn, the data required for that analysis. Improving the ability to mine data into actionable, predictive insights requires different algorithms, more sophisticated techniques and other advanced tools to gain more intelligence from the existing data stream.

The next post will cover best-practice data governance, which, along with a well-developed data haystack and excellent mining techniques, is necessary to leverage big data efficiently and effectively.

