In my opening remarks, I recounted what I believe to be seven basic principles that underpin the IoT. I have derived these conclusions and observations over the last four years while working with end users and customers deploying a variety of Industrial IoT solutions. I’ll detail these seven principles in subsequent blogs, and here is a summary:
1. Big Analog Data
Analog data represents the natural and physical world; it is everywhere, part of everything: light, sound, temperature, voltage, radio signals, moisture, vibration, velocity, wind, motion, video, acceleration, particulates, magnetism, current, pressure, time, and location. It’s the oldest, fastest, and biggest of all big data, but it presents an IT challenge: unlike digital data, which is ultimately built from two discrete values, an analog signal can take on a continuum of values.
Simply put, in many ways analog data needs to be treated differently than digital data. The question is, and will continue to be, how can we efficiently unlock the business value of Big Analog Data?
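As a minimal illustration of that difference, here is a sketch of how a continuous analog signal becomes discrete digital codes. The 12-bit resolution, voltage range, and signal are all illustrative, not tied to any particular ADC:

```python
import math

def digitize(signal, bits=12, v_min=-1.0, v_max=1.0):
    """Quantize continuous samples to an n-bit ADC's discrete levels."""
    levels = 2 ** bits
    step = (v_max - v_min) / (levels - 1)
    codes = []
    for v in signal:
        v = min(max(v, v_min), v_max)            # clip to ADC input range
        codes.append(round((v - v_min) / step))  # nearest discrete code
    return codes

# A 50 Hz sine sampled at 1 kHz: continuous values become one of 4096 codes.
samples = [math.sin(2 * math.pi * 50 * n / 1000) for n in range(20)]
codes = digitize(samples)
```

Every digital system that touches analog data makes a choice like this somewhere, and the sample rate and bit depth chosen bound how much of the original signal survives.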
2. Perpetual Connectivity
The IoT is always connected, always on, and that “Perpetual Connectivity” to products and users affords three key benefits:
- Monitor: Continuous monitoring provides ongoing, real-time knowledge of the condition and usage of a product or user in a market or industrial setting.
- Maintain: Because of that continual monitoring, one can push upgrades, fixes, patches, and management updates as needed.
- Motivate: A constant connection to consumers or workers gives organizations a way to motivate them to take some action, such as purchasing a product.
I refer to these as the Three M’s, and the notion that an organization can be perpetually connected to consumers and products is quite profound, with far-reaching implications and opportunities. For example, if your washing machine were connected to the IoT, predictive analytics could sense when the machine was likely to fail and schedule a repair, say, ten days before that unfortunate event occurred. That way you’re not standing in front of a defunct washer holding a basket of dirty laundry.
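A washing-machine scenario like that could be sketched as follows; the vibration readings, failure threshold, and linear-wear assumption are all hypothetical, purely for illustration:

```python
def days_until_failure(vibration_history, failure_threshold):
    """Naively extrapolate a linear wear trend to estimate days until
    the vibration level crosses the failure threshold.

    vibration_history: one reading per day, oldest first.
    Returns None if the trend is flat or improving.
    """
    n = len(vibration_history)
    if n < 2:
        return None
    # Least-squares slope of reading vs. day index.
    mean_x = (n - 1) / 2
    mean_y = sum(vibration_history) / n
    num = sum((i - mean_x) * (y - mean_y)
              for i, y in enumerate(vibration_history))
    den = sum((i - mean_x) ** 2 for i in range(n))
    slope = num / den
    if slope <= 0:
        return None
    latest = vibration_history[-1]
    return max(0.0, (failure_threshold - latest) / slope)

# Rising vibration: schedule a repair before the projected failure date.
history = [1.0, 1.1, 1.2, 1.3, 1.4]  # hypothetical daily readings
eta = days_until_failure(history, failure_threshold=2.4)
```

Real predictive-maintenance models are far richer than a straight line, but the perpetual connection is what makes even this trivial version possible: without a constant stream of readings, there is no trend to extrapolate.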
3. Really Real Time
People who understand the IoT define real time differently from those who don’t. Real time actually begins back at the sensor, the moment the data is acquired. Real time for the IoT does not begin when the data hits a network switch or computer system, because by then it’s too old. If you want to know whether your house is going to catch fire, how soon would you like to know? Or if and when a crime may occur, mere seconds are crucial. Hence an alarm must go off in very real time, before the data even reaches the cloud or data center, or it doesn’t help.
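The idea can be sketched as an edge-side handler, assuming a simple temperature threshold and hypothetical `trigger_alarm` and `send_to_cloud` callbacks: the alarm fires locally, in real time, before the data ever leaves for the cloud:

```python
ALARM_THRESHOLD_C = 70.0  # illustrative heat limit for a fire alarm

def handle_reading(temp_c, trigger_alarm, send_to_cloud):
    """Act on a sensor reading at the edge: sound the alarm immediately
    if the threshold is crossed, then forward the data for later analysis."""
    alarmed = False
    if temp_c >= ALARM_THRESHOLD_C:
        trigger_alarm(temp_c)   # local and immediate: no network round trip
        alarmed = True
    send_to_cloud(temp_c)       # archival and deep analytics can wait
    return alarmed
```

The ordering is the whole point: the decision that cannot wait is made where the data is born, and the cloud sees the reading afterward.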
The point is, we’re seeking to blend the world of operational technology (OT), sensors, and data measurement with the world of IT. The IoT blends these two worlds for the first time in a major way, and the results will be profound.
4. The Spectrum of Insight
The “Spectrum of Insight” derived from IoT data relates to its place in a five-phase data flow: real time, in motion, early life, at rest, and archive. Recall that real time for the IoT begins at the sensor or point of acquisition, where analytics are needed to determine the immediate response of a control system and adjust accordingly, as in military applications or precision robotics. At the other end of the spectrum, archived data in the data center or cloud can be retrieved for comparative analysis against newer, in-motion data, to gain insight into, say, the seasonal behavior of an electrical power-generating turbine. Hence insight from the big data in the IoT can be extracted across a spectrum of time and location.
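One way to picture the five phases is as a mapping from a datum’s age to its place in the flow. The boundary values below are invented for illustration and will vary widely by application:

```python
# (phase name, upper age bound in seconds) -- boundaries are illustrative
PHASES = [
    ("real time", 0.001),        # at the sensor, milliseconds
    ("in motion", 60.0),         # crossing gateways and networks
    ("early life", 86400.0),     # recently landed, in hot storage
    ("at rest", 86400.0 * 90),   # queryable warm storage
]

def insight_phase(age_seconds):
    """Map a datum's age to its phase in the five-phase data flow."""
    for name, limit in PHASES:
        if age_seconds < limit:
            return name
    return "archive"
```

Each phase invites a different kind of analytic, from sub-millisecond control decisions at one end to seasonal comparisons against years-old archives at the other.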
5. Immediacy Versus Depth
With today’s traditional computer and IoT solutions, there’s a trade-off between speed and depth. That is, one can get immediate “Time-to-Insight” from a rudimentary analytic, such as a temperature comparison or a fast Fourier transform, to determine whether the rotating wheels on a tram will cause a life-threatening accident. Immediate Time-to-Insight is crucial here.
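The “immediate” end of that trade-off can be sketched as a Fourier analysis that picks out the dominant vibration frequency of a wheel. The sample rate and the 120 Hz signature below are illustrative, and in practice one would use an optimized FFT such as numpy.fft rather than this dependency-free direct transform:

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform (O(n^2), fine for a sketch)."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
            for k in range(n)]

def dominant_frequency(samples, sample_rate_hz):
    """Return the strongest frequency component below Nyquist."""
    spectrum = dft(samples)
    half = len(samples) // 2
    k = max(range(1, half), key=lambda i: abs(spectrum[i]))
    return k * sample_rate_hz / len(samples)

# A wheel vibrating at 120 Hz, sampled at 1 kHz for half a second.
fs = 1000
samples = [math.sin(2 * math.pi * 120 * i / fs) for i in range(500)]
freq = dominant_frequency(samples, fs)
```

Comparing `freq` against a known resonance band is a cheap, fast computation that can run right next to the sensor, which is exactly why this kind of insight is available immediately.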
On the other end of the spectrum is the time required to gain deep insight. The example here is from one of my former customers, the Large Hadron Collider at CERN in Europe, where they smash subatomic particles together to seek insight into the make-up of such particles. The data collected there takes a long time to analyze, using large, back-end computer farms. Such depth of insight has resulted in the recent discovery of a new subatomic particle called the Higgs boson.
6. Shift Left
Consider the mutually exclusive objectives of deriving both immediate and deep insight, as discussed in #5 above. It’s really hard to get both today. However, engineers are good at resolving conflicting objectives and getting BOTH. James Collins has referred to this phenomenon as “the genius of the AND”.
The drive to get both immediate and deep insight from data will cause the sophisticated, high-end compute and data analytics normally reserved for the cloud or data center (what I call Tier 4 of the IoT solution) to migrate toward the left of the end-to-end IoT solution infrastructure. That is, deep compute will be positioned closer to the source of the data: at the point of data acquisition and accumulation in sensors (what I call Tier 1) and in network gateways (Tier 2).
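Shift left can be sketched as a toy placement rule: run each analytic at the most capable tier whose round-trip latency still fits the analytic’s budget, so that tight budgets force compute left toward the sensor. The tier numbering follows the Tier 1–4 terminology above, but the latency and capacity figures are invented for illustration:

```python
def place_analytic(latency_budget_ms, data_volume_mb):
    """Toy 'shift left' placement: prefer the most capable tier (cloud)
    but fall back leftward until latency and storage both fit."""
    tiers = [  # (tier, name, typical round-trip ms, storage capacity MB)
        (4, "cloud/data center", 200.0, float("inf")),
        (3, "edge server",       20.0,  10_000),
        (2, "gateway",           2.0,   100),
        (1, "sensor",            0.2,   1),
    ]
    for tier, name, rtt_ms, cap_mb in tiers:
        if rtt_ms <= latency_budget_ms and data_volume_mb <= cap_mb:
            return tier, name
    return 1, "sensor"  # the tightest budgets force compute all the way left
```

The interesting consequence is the one the principle predicts: as latency budgets shrink, deep analytics that once lived only at Tier 4 must become cheap enough to run at Tiers 1 and 2.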
7. The Next ‘V’
Big data is commonly characterized by the well-known “V’s”: Volume, Velocity, Variety, and Value. I propose a fifth “V”: Visibility. Once the data is collected, data scientists around the world should be able to see and work with it as needed. Visibility refers to the benefit of not having to transfer large amounts of data to remote people or locations. I love this idea of access to data and apps “independent of time and place”. Mark Templeton, CEO of Citrix, adds a third independence: “independence of device”. My team and I are working closely with our partner Citrix as we deploy time-, place-, and device-independent “Visibility” solutions.