After his immensely popular three-part Internet of Things article, Steve Brady offers his latest thoughts on Big Data and the importance of understanding its role in operations today.
“Big Data” can mean different things to different people. In fact, what constitutes “big data” is often simply a reflection of what you have historically been able to analyze. What is constant across all definitions is the idea that data is critical to understanding--understanding your operations, understanding your customers, and understanding your environment. Indeed, Microsoft’s new CEO Satya Nadella tweeted, “Data Is Our Most Precious Natural Resource. Our Task Is To Refine Data Into The Fuel For More Personal Experiences.”
As I presented at the eft 3PL Summit in June, data can be used to describe the world around us, but the ability to transform that data into information allows us to actually manage that world. And of course, by understanding the information and seeking the causal relationships between actions we can truly hope to change the world.
There have always been two limitations on any analysis of data--the amount you can store, and the tools to analyze it. (Of course, there is also the pesky problem of the skill sets of the analysts, but let’s set that aside for the moment.)
We have come a long way in terms of storage. What was considered a lot of data storage just 10 years ago seems trivial in comparison. Take hard drives, for instance. At the start of 2004 you could buy a 120 GB drive for $111. As of April of this year you could purchase a 3 TB drive for about $110. (Source: http://www.jcmit.com/diskprice.htm) In just 10 years, the same money buys 25 times the storage. And those are nominal prices--adjusted for inflation, the newer drive is even cheaper in real buying power.
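The arithmetic behind that comparison is simple; here is a quick sketch using the prices cited above (and treating the 3 TB drive as 3,000 GB for round numbers):

```python
# Price-per-gigabyte comparison for the two drives cited above
drive_2004 = {"capacity_gb": 120, "price_usd": 111}
drive_2014 = {"capacity_gb": 3000, "price_usd": 110}

cost_2004 = drive_2004["price_usd"] / drive_2004["capacity_gb"]
cost_2014 = drive_2014["price_usd"] / drive_2014["capacity_gb"]

print(f"2004: ${cost_2004:.3f}/GB")  # 2004: $0.925/GB
print(f"2014: ${cost_2014:.3f}/GB")  # 2014: $0.037/GB
print(f"Capacity multiple: {drive_2014['capacity_gb'] / drive_2004['capacity_gb']:.0f}x")  # 25x
```

In other words, the cost per gigabyte fell by a factor of roughly 25 in a decade, before even accounting for inflation.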
Of course, with the increasing storage space we have created more and more data to fill that space. Whether we are talking about our own digital photography and music (higher resolution and higher fidelity files take up more space) or our businesses collecting more data for processes, we find ourselves swimming in bits and bytes.
Operations today rely on many sources of data. Some of that data comes from internal sources such as production line sensors, RFID readers tracking items through a facility, point of sale (POS) devices, and data entered manually by employees. Other data critical to our success comes from outside sources, including real-time weather and traffic updates, delivery tracking data, and economic data. With the growth of the internet, and the explosion in storage capacity, nearly any business can now access vast amounts of data and store it locally (or--in the cloud!)
How can we possibly wrap our heads around all this data?
If you pay attention to the hype surrounding big data, and the sales teams pushing their solutions, the answer seems clear--you must spend large sums of money on solutions designed to manipulate extremely large datasets and tease out meanings from the data.
But what are you to do if you don’t have hundreds of thousands of dollars lying around? Is this another “big dogs only” game?
Thankfully the answer once again is “No!”
Microsoft has provided some very powerful tools even in Excel that can let you manipulate large amounts of data and derive knowledge from that data. If you have Microsoft Office 2013 Professional Plus or Office 365 ProPlus you can enable “PowerPivot.” (If you have the earlier 2010 version of the Professional products you can download and install it.)
PowerPivot extends the capabilities we have enjoyed in Excel’s PivotTables for years by providing a few critical features.
First, you can manipulate datasets far larger than a standard Excel worksheet can handle. Excel tops out at 1,048,576 rows by 16,384 columns (over 17 BILLION cells of data!). I am sure most of us never thought we would approach the limits of Excel in our lifetimes, and yet here we are. We may not have 17 billion cells of data, but with many of our automated data collection systems we can easily cross the 1 million row (record) threshold. PowerPivot stores data in its own compressed, in-memory engine rather than on the worksheet, supporting workbook files up to 2 GB and up to 4 GB of data in memory. Now THAT should last a lifetime (wait… I remember this feeling….)
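Those worksheet limits multiply out exactly as stated--the row and column caps are both powers of two:

```python
rows, cols = 1_048_576, 16_384  # Excel's worksheet limits: 2**20 rows, 2**14 columns
cells = rows * cols             # 2**34
print(f"{cells:,} cells")       # 17,179,869,184 cells
```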
Another powerful feature is the ability to import data from a wide range of sources. Not only can you use data from Excel, you can pull data from “different sources, such as large corporate databases, public data feeds, spreadsheets, and text files on your computer.” (Source: Microsoft) Using this feature you can begin to connect disparate datasets, bringing together and correlating data from multiple sources--the first steps of the transformation from data to information!
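PowerPivot handles this through its import wizard and data model, but the underlying idea--joining records from different feeds on a shared key--is easy to sketch in code. Here is a minimal, hypothetical illustration in Python (the field names and figures are invented for the example):

```python
# Hypothetical internal source: point-of-sale records
pos_sales = [
    {"date": "2014-06-01", "store": "A", "units_sold": 120},
    {"date": "2014-06-02", "store": "A", "units_sold": 95},
]

# Hypothetical external source: a weather data feed
weather = [
    {"date": "2014-06-01", "rainfall_mm": 0.0},
    {"date": "2014-06-02", "rainfall_mm": 12.5},
]

# Index the weather feed by date, then join it onto each sales record
weather_by_date = {row["date"]: row for row in weather}
combined = [
    {**sale, "rainfall_mm": weather_by_date[sale["date"]]["rainfall_mm"]}
    for sale in pos_sales
]

for row in combined:
    print(row)
```

Once sales and weather live in the same table, questions like “does rainfall depress foot traffic?” become answerable--which is exactly the data-to-information step described above.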
One final feature I wanted to highlight is the ability to build complex calculations from the data. By manipulating the different data sets you can create new columns that become useful as the underpinnings of “Key Performance Indicators.”
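In PowerPivot these calculations are written in its own formula language, DAX; purely as an illustration of the concept, here is the same idea sketched in Python (the carrier names and numbers are invented): a KPI column is simply raw columns combined by a formula.

```python
# Hypothetical raw data: total shipments and on-time counts per carrier
deliveries = [
    {"carrier": "Carrier A", "shipments": 200, "on_time": 188},
    {"carrier": "Carrier B", "shipments": 150, "on_time": 129},
]

# Derived column: the KPI is computed from the raw columns, not stored in the source
for row in deliveries:
    row["on_time_pct"] = round(100 * row["on_time"] / row["shipments"], 1)

for row in deliveries:
    print(row["carrier"], row["on_time_pct"])  # prints 94.0 and 86.0
```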
This one tool, included in Microsoft’s current professional offerings of MS Office, provides powerful data manipulation and presentation capabilities at no additional cost. A business that wants to see what big data can offer no longer needs to fear the cost.
Even the small dogs can play!
Next article: How can we clean up the data?