
Streaming analytics - key to leveraging M2M, IOT data

By Gary Allemann for ITWeb
05 Aug 2019

The Internet of Things (IOT) and machine-to-machine (M2M) communications are by no means new concepts; both have existed for many years. What is new and revolutionary is what we are now able to do with the data these technologies generate.

From predictive maintenance to consumer behaviour analytics, facial recognition to fraud detection and prevention, there is a wealth of value to be gained. However, these applications require a shift in mindset, from storing all data for retrospective analytics to analysing data on the fly and then dumping it.

This new paradigm of streaming analytics is key to leveraging value from M2M and IOT data.

The genuinely revolutionary aspect of IOT and M2M is their ability to generate always-on data. Human-generated data is, and always has been, finite. While the volume has increased over the years, it is still subject to limits that sensors and machines simply do not have: these devices can generate thousands of data points every minute, and they are inexhaustible sources.

While this means there are now limitless sources of data for analysis, it also means that, unless something changes, we have to find a way of storing infinite data volumes. This is simply not technically possible, even if it were financially viable.

The constant and relentless nature of 'always-on' data generation also makes real-time analytics even more important. Much of this data is only relevant now, in the moment; once it becomes historical, it no longer has worth. It must be analysed immediately and then deleted, otherwise no value can be gained. For example, a machine sending a signal every 30 seconds to communicate that its status is good is important information.

However, it does not need to be stored, because it is worthless once the signal has been sent. It also only requires action if something changes, which can only be ascertained if the data is being analysed in real time.
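
To make the pattern concrete, the sketch below shows this analyse-and-discard logic in plain Python. The machine names, status values and feed are hypothetical illustrations, not drawn from any particular product; the only state kept is the last known status per machine, so storage never grows with the volume of heartbeats.

```python
# Minimal sketch: analyse a heartbeat stream on the fly, store nothing,
# and act only when a machine's reported status changes.

def detect_status_changes(readings):
    """Yield an alert whenever a machine's status changes.

    `readings` is any iterable of (machine_id, status) tuples, such as a
    live feed of 30-second heartbeats. Individual readings are never
    stored; only the last known status per machine is retained.
    """
    last_status = {}  # machine_id -> most recent status (small, bounded state)
    for machine_id, status in readings:
        previous = last_status.get(machine_id)
        if previous is not None and status != previous:
            yield (machine_id, previous, status)  # the only event worth acting on
        last_status[machine_id] = status          # the raw reading is then discarded


# Thousands of "OK" heartbeats produce no output; a single change does.
feed = [("pump-7", "OK")] * 5 + [("pump-7", "FAULT")] * 3
for alert in detect_status_changes(feed):
    print("status change:", alert)  # ('pump-7', 'OK', 'FAULT')
```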

Without this instantaneous and continuous analysis, critical information might be missed. This is driving the emergence of the streaming analytics stack, which enables organisations to analyse data on the fly without storing it. However, this technology is still in its infancy, and while a number of open source technologies exist, such as Apache Kafka, they are by no means enterprise-ready on their own: they do not provide the audit trails, governance or disaster and error recovery protocols an enterprise requires.
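
As an illustration of what such a stack looks like in practice, the sketch below consumes heartbeat messages from Kafka and applies the same analyse-and-discard logic as above. It assumes the open source kafka-python client, a local broker and a hypothetical 'machine-status' topic carrying small JSON messages; none of these details come from the article itself.

```python
# Hedged sketch: read heartbeats from an assumed Kafka topic and analyse
# them on the fly. Only the latest status per machine is kept in memory;
# raw messages are never written to storage.
import json
from kafka import KafkaConsumer  # open source kafka-python client

consumer = KafkaConsumer(
    "machine-status",                    # hypothetical topic of 30-second heartbeats
    bootstrap_servers="localhost:9092",  # assumed local broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

last_status = {}
for record in consumer:
    event = record.value                 # e.g. {"machine": "pump-7", "status": "OK"}
    machine, status = event["machine"], event["status"]
    previous = last_status.get(machine)
    if previous is not None and status != previous:
        print(f"ALERT: {machine} went from {previous} to {status}")  # the actionable moment
    last_status[machine] = status        # the message itself is now discarded
```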

To be of use in an enterprise setting, streaming analytics needs some form of failover, so that data remains available for analysis in the event of, for example, a network outage. Innovative commercial solutions are essential when it comes to solving problems that are only just beginning to emerge.

While the need for such solutions has emerged from the growth of IOT and M2M data, the same real-time analytics platforms have far broader business application. They can be leveraged to great value in any analytics scenario for enhanced processing speed, flexibility, agility and more.

Real-time decision-making ability is the future of analytics and we need to gear up for this or risk being left behind.

By Gary Allemann, Managing Director of Master Data Management.
