There’s not much good in collecting data if you can’t do anything with it. We hear about countless examples of sensors gathering data and yet nothing productive happening as a result. One expert we spoke with recently said that data is useless unless it is connected. So the question becomes: what is the best way to connect the data and, in turn, interpret it in real time so you can make a decision?
The article below takes an in-depth look at the four Internet of Things computing options in use today. The author details not only how each type functions, but also shares his point of view on the good, the bad, and the ugly. The piece also describes which compute type is best suited to which applications. Where I agree with the author is that better mist computing models will emerge in short order and will solve many of the issues that edge computing and fog computing cannot.
From a practitioner’s point of view, I routinely see the need for computing to be more available and distributed. When I started integrating IoT with OT and IT systems, the first problem I faced was the sheer volume of data that devices sent to our servers. I was working in a plant automation scenario with 400 integrated sensors, each sending three data points every second.
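To make the scale concrete, here is a back-of-the-envelope calculation for that scenario. The sensor count and rate come from the text above; the payload size per data point is an assumption for illustration only.

```python
# Throughput estimate for the plant scenario described above:
# 400 sensors, each emitting 3 data points per second.
SENSORS = 400
POINTS_PER_SENSOR_PER_SEC = 3
BYTES_PER_POINT = 100  # ASSUMED payload size (timestamp + sensor id + value + overhead)

points_per_sec = SENSORS * POINTS_PER_SENSOR_PER_SEC   # data points arriving each second
points_per_day = points_per_sec * 86_400               # data points per day
bytes_per_day = points_per_day * BYTES_PER_POINT       # raw bytes per day at the assumed size

print(f"{points_per_sec} points/s")
print(f"{points_per_day:,} points/day")
print(f"~{bytes_per_day / 1e9:.1f} GB/day at {BYTES_PER_POINT} B/point")
```

At 1,200 points per second, over 100 million data points arrive every day, which is why shipping everything to a central server strains both the network and the backend.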
You’ve probably heard this before, but most sensory data becomes useless within about five seconds of being generated. Now you see my point?
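One practical consequence of that five-second shelf life is that a processing node can simply drop stale readings instead of forwarding them. The sketch below is a minimal, hypothetical filter (the function name, data shape, and window are my own illustration, not from the article):

```python
import time

FRESHNESS_WINDOW_S = 5.0  # readings older than this are treated as stale


def fresh_readings(readings, now=None):
    """Keep only readings generated within the freshness window.

    `readings` is a list of (timestamp_seconds, value) tuples.
    `now` defaults to the current time; it is a parameter so the
    filter is easy to test deterministically.
    """
    now = time.time() if now is None else now
    return [(ts, v) for ts, v in readings if now - ts <= FRESHNESS_WINDOW_S]


# Example: at t=100, only readings from t>=95 survive the filter.
batch = [(93.0, 20.1), (96.5, 20.3), (99.9, 20.4)]
print(fresh_readings(batch, now=100.0))  # [(96.5, 20.3), (99.9, 20.4)]
```

Filtering like this close to the source is one of the motivations behind the edge, fog, and mist models discussed later.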
We had 400 sensors, multiple gateways, multiple processes, and multiple systems that needed to process this data almost instantaneously.
At that time, most proponents of data processing advocated the cloud model: always send everything to the cloud. That cloud model is also the first of the four IoT computing types.