
November 21, 2019 | 4 Mins Read

Sorting Through The Digital Noise


By Greg Lush

As I was entering the workplace in 1981, computers were not so popular for a guy in the commercial/industrial service business. Memory, whether RAM or storage, was nothing like it is today. Remember that unconfirmed quote from Bill Gates in 1981, when the IBM personal computer was introduced? Mr. Gates supposedly said that 640 KB of memory "ought to be enough for anybody." Heck, just a few years later (1984) Apple released a cute little all-in-one computer named the Macintosh 128K, the name signifying its 128 KB of RAM. Now it is not uncommon to see a handheld phone with 64 GB of storage, a jump from 131,072 bytes to 68,719,476,736 bytes. Interestingly, 64 GB of solid-state storage today costs a fraction of what an early-1980s spinning hard drive with a tiny capacity did. The point is, storage is CHEAP; forget your past practices and assumptions about what to save and what to get rid of, and save it all! Your biggest challenge will be sorting through the noise, a digital exhaust of sorts.

Start by establishing what you will do with the data you gather. As with other business decisions, you need to align your input, the collection of data, with your desired output, commonly visibility and/or action. Do not be surprised if, after this exercise, you discover that over 90% of your information does not translate to an action, or any value, today. To illustrate, let's take an air conditioning (AC) unit, just like the one at your home or office. Not too long ago I was in a meeting with an AC unit manufacturer bragging about all of the data it could obtain, over 120 different points of information. At first pass you may think, wow, that is impressive. Yet as you peel back the business scenarios for keeping the unit running trouble-free for as long as possible, a service person would tell you that only a handful of data points are required. Where does that leave us? Acquiring that information has a cost, regardless of how commoditized storage has become: we need to ask for, transfer, process, store, and use the data in a meaningful and relevant manner. So, now you're confused: gather the data or don't gather the data?
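To make this concrete, here is a minimal sketch in Python of teasing out the actionable handful from a 120-point payload. The point names are hypothetical illustrations, not the manufacturer's actual tags; a real list would come from your service team.

```python
# Hypothetical names for the handful of points a service person actually
# needs; the real list would come from the people who keep units running.
ACTIONABLE_POINTS = {
    "supply_air_temp",       # is the unit delivering comfort?
    "compressor_amps",       # early sign of mechanical wear
    "filter_pressure_drop",  # maintenance trigger
    "refrigerant_pressure",  # possible leak
}

def split_payload(payload: dict) -> tuple[dict, dict]:
    """Separate the actionable handful from the other 100+ points."""
    actionable = {k: v for k, v in payload.items() if k in ACTIONABLE_POINTS}
    the_rest = {k: v for k, v in payload.items() if k not in ACTIONABLE_POINTS}
    return actionable, the_rest
```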

Yes is the correct response to both questions. A flood of information will hit your environment, and I encourage you to save every last byte. However, during the ingestion process, tease out the handful of relevant data points. Allow the remaining data to travel directly to a data lake or some other inexpensive mass-storage tool. One of the many features of a data lake is that we can store a TON of data for hardly any cost (in organizing, tagging, and storage space). To perform this "in flight" routing you will need a system sophisticated enough to analyze the data while it is in motion. The top-tier software companies all offer some degree of functionality in this arena; Microsoft refers to this process as Stream Analytics. It is hard to overstate the significance of this step. Bear in mind that all your AI algorithms feeding machine learning and other tools will use the data set that you deliver. Sure, many AI tools can take the whole lot, yet delivering the most efficient package creates tremendous processing and financial benefits. Cloud platforms charge based on several variables, including but not limited to processor horsepower, storage, and memory. Design for the objective and the outcome. When you follow this path, you also put the business need ahead of the technology requirement, which leads to greater adoption and overall user satisfaction.
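As a simplified sketch of that "in flight" routing, assuming a generic message stream (a production system would lean on a managed service such as Stream Analytics rather than hand-rolled code like this), the relevant points feed the hot path while every byte still lands in the lake:

```python
import json

class HotPath:
    """Hypothetical sink for actionable points (dashboards, alerts, work orders)."""
    def send(self, points: dict) -> None:
        print("ACT ON:", points)

class DataLake:
    """Hypothetical cheap bulk store; in practice, blob or object storage."""
    def __init__(self) -> None:
        self.records: list[dict] = []
    def append(self, record: dict) -> None:
        self.records.append(record)

def route_message(raw: bytes, hot: HotPath, lake: DataLake) -> None:
    payload = json.loads(raw)
    actionable, _ = split_payload(payload)  # from the earlier sketch
    if actionable:
        hot.send(actionable)  # the efficient package your AI and alerts consume
    lake.append(payload)      # save every last byte, cheaply, for later use
```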

Now, for one of the most misunderstood topics: predictive analytics and tools. It all starts with listening to and understanding your client's business. Once you understand the customer's objective(s), combine that with your own business objectives. Is the reason you are using predictive modeling to gain an edge in the market? If so, the models you choose can be single models, possibly looking at only one element, and you can claim, rightly so, that you are predicting an outcome. However, if you are trying to turn around or mitigate risk within your organization, you may need to look at multiple pieces of equipment and their predictive models to see how, when combined with one another, they create different perspectives that may alter your course of action. These are very different approaches; both need predictive models, data, and certainly investment, but their levels of sophistication are vastly different. Both are valuable; the bottom line will be how well aligned the model is to your business conditions and environment.
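As a hedged illustration of the combined approach (the numbers and the independence assumption are mine, purely for the arithmetic): units that each look acceptable in isolation can add up to a site-level risk that changes your course of action.

```python
def site_risk(unit_failure_probs: list[float]) -> float:
    """Probability that at least one unit fails, assuming independent units."""
    p_all_survive = 1.0
    for p in unit_failure_probs:
        p_all_survive *= 1.0 - p
    return 1.0 - p_all_survive

# Three units whose individual models each report a modest failure risk...
units = [0.10, 0.12, 0.15]
# ...combine into roughly a one-in-three chance that something fails on site.
print(f"site risk: {site_risk(units):.0%}")  # -> site risk: 33%
```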

Learning models, and the algorithms contained within them, can drive incredible value for your business. It is key that you understand how the outputs of these data-science-based objects influence action within your current operating environments. Keep in mind that IIoT, data sciences, and even workforce sciences are as much about the cultural and market-based changes required to truly evolve your operation as they are about the tools.