From Asynchronous to Real-Time Communication: How Is the Process Industry Preparing for the Next Level of IoT?

Vivart Kapoor
September 16, 2019

 

Automation engineering, especially in the process industry, has always been a challenging area. The rapidly changing market environment demands higher productivity in factories at lower (or ideally no) risk. The introduction of the industrial Internet of Things demonstrated how one can achieve higher productivity in a factory environment by collecting sensor data and analyzing it in the cloud. This, in turn, gave birth to other productivity enhancement tools such as predictive maintenance, asset monitoring and tracking, digital twins, and so on.

Even though the first results of implemented factory IoT are quite promising, they are still only the tip of the iceberg. A McKinsey article on digital manufacturing [1] shows that the manufacturing industry produces more data than any other sector, most of which unfortunately goes unused. This is due not only to a lack of awareness but also to the limitations of the current Operational Technology (OT) and Information Technology (IT) infrastructure. Two major issues the process industry is facing at the OT and IT levels are:

  • Slow data transmission at the sensor level: The commonly used field communication protocols at the sensor level offer data rates (HART: 1.2 kbit/s, Profibus PA: 31.25 kbit/s, FF: 1 Mbit/s) that are not sufficient for high-end IoT applications. Other, non-bus-powered Ethernet-based protocols are not an option since they are not suited for Ex-zone (hazardous area) applications.
  • High latency and unmanaged data traffic: High-quality, high-bitrate data transmission requires high bandwidth, low latency, and quality of service. Moreover, in a complex system with several layers of critical, semi-critical, and non-critical applications, there is a need for efficient data traffic handling. The commonly used industrial Ethernet-based protocols (EtherNet/IP, Profinet, Modbus TCP) cannot handle this kind of complexity; they do not offer priority scheduling of time-critical data.

Is There a Solution?

“Necessity is the mother of invention.” If data is the key to good decision-making, then it must be abundant and needs to move quickly from the factory floor up to the cloud. Fast and reliable data transmission is the prerequisite for machine learning and Artificial Intelligence (AI) implementations. Several big players, societies, and organizations have come together to work on new standards that can make the OT, OT-IT, and IT levels fit for this task. These are briefly reviewed in the following.

OT Level: Advanced Physical Layer (APL)

A joint initiative of various fieldbus organizations and the IEEE, APL is a protocol-neutral, two-wire, loop-powered physical layer for industrial Ethernet that offers a data transmission rate of 10 Mbit/s full duplex. It supports a maximum trunk length of 1000 m, thus eliminating the need for repeaters in large plants. The main feature of APL that makes it well suited for the process industry is that it allows powering the sensor over the same two-wire line (via APL field switches), which is specifically designed for installation and operation in hazardous areas.

APL will overcome the shortcomings of existing sensor-level fieldbus protocols in terms of speed and compatibility. Its introduction will facilitate quick and efficient data transfer between the sensor and the PLC layer, with minimal hardware modification and retrofit effort.
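To put the speed difference in perspective, here is a rough, purely illustrative back-of-the-envelope comparison of transfer times for a single 100 kB diagnostics payload, assuming the nominal data rates quoted above and ignoring protocol overhead and framing:

```python
# Rough transfer-time comparison for a 100 kB diagnostics payload,
# ignoring protocol overhead and framing (illustrative only).
PAYLOAD_BITS = 100 * 1024 * 8  # 100 kB expressed in bits

data_rates_bps = {
    "HART (1.2 kbit/s)": 1_200,
    "Profibus PA (31.25 kbit/s)": 31_250,
    "APL / 10BASE-T1L (10 Mbit/s)": 10_000_000,
}

for name, rate in data_rates_bps.items():
    seconds = PAYLOAD_BITS / rate
    print(f"{name:30s} -> {seconds:10.2f} s")
```

Even in this simplified view, what takes more than ten minutes over HART becomes a fraction of a second over APL.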

OT to IT Level: Time-Sensitive Networking (TSN)

The need for low-latency, high-bandwidth, and highly deterministic communication within complex systems gave birth to TSN. TSN is a set of standards (Figure 1) for enhancing current industrial Ethernet in terms of speed (1 Gbit/s), latency (a few microseconds), and reliability.

Figure 1: TSN sub-standards (source: itwissen.info).

 

TSN achieves these features by providing time synchronization to all devices and network switches that need to communicate in real time. Moreover, the concept of the time-aware shaper (TAS) allows high-priority (or time-critical) messages to be sent first. The redundancy management feature of TSN allows data packets to be replicated across the network, so that if a packet is lost along one path, the redundant path still ensures its delivery to the destination.
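The priority idea behind the time-aware shaper can be pictured with a minimal, conceptual sketch: frames wait in a priority queue, and at each transmission opportunity the most time-critical frame is sent first. This is not the actual IEEE 802.1Qbv gate mechanism, just an illustration of priority scheduling; the frame contents and priority values are invented.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Frame:
    priority: int                      # lower value = more time-critical
    payload: str = field(compare=False)

# Egress queue of a (hypothetical) TSN-capable switch port.
egress = []
heapq.heappush(egress, Frame(7, "bulk firmware chunk"))
heapq.heappush(egress, Frame(0, "motion control command"))   # time-critical
heapq.heappush(egress, Frame(3, "diagnostics update"))

# At each transmission opportunity, the most time-critical frame goes first.
while egress:
    frame = heapq.heappop(egress)
    print(f"transmit (priority {frame.priority}): {frame.payload}")
```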

Since TSN is a base-level technology (and not a standalone protocol) that, just like Ethernet, resides at the data link layer of the ISO/OSI model, it is fully compatible with other field-level communication protocols such as Profinet, EtherNet/IP, and OPC UA.

Figure 2: APL, TSN and OPC UA in the OSI model (Source: sps-magazine).

 

IT Level

Among the ongoing discussions of TSN compatibility with other high-level protocols, the combination of OPC UA over TSN is gaining the most popularity. The two complement each other well: TSN provides fast and reliable communication at the field level, while OPC UA (with its pub/sub extension) is the most widely adopted, secure, and reliable protocol for field-to-cloud communication.
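The publish/subscribe pattern behind the OPC UA pub/sub extension decouples field-level publishers from cloud-level subscribers. The following is a generic, broker-style sketch of that pattern in plain Python; it does not use an actual OPC UA stack, and the topic name and payload are invented for illustration.

```python
from collections import defaultdict

class Broker:
    """Minimal publish/subscribe broker (illustrative, not OPC UA PubSub)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()

# A cloud analytics service and a local alarm panel both subscribe
# to the same (hypothetical) temperature topic.
broker.subscribe("plant/reactor1/temperature",
                 lambda m: print("cloud analytics received:", m))
broker.subscribe("plant/reactor1/temperature",
                 lambda m: print("local alarm panel received:", m))

# A field sensor publishes without knowing who consumes the data.
broker.publish("plant/reactor1/temperature", {"value": 78.4, "unit": "degC"})
```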

Thanks to the idea of IIoT, the industry will soon witness revolutionary developments in factory automation that will turn the idea of real-time data transmission into reality. This, however, is just the beginning, as there is more to come. Hopefully, the introduction of these new standards will bring the big (rival) organizations currently pushing their respective proprietary solutions together to work on a common communication standard.

 


 

Vivart Kapoor is the Director of Industrial Internet of Things Solutions at Endress+Hauser in Germany. He is involved in the implementation of various IIoT projects along with business development activities. As a hobby blogger, he regularly contributes his expertise and knowledge in the field of the Internet of Things to several IoT expert panels and blogging sites. Vivart obtained his Bachelor's in Biomedical Engineering from Manipal Institute of Technology in India and his Master's in Technical Management from the University of Applied Sciences in Emden, Germany.

 

 

Making IoT Applications Enterprise-Friendly

Jan Höller and Senthamiz Selvi A
September 16, 2019

 

IoT applications reach into nearly all sectors of the global economy. The total transformative economic potential of IoT has been valued at up to USD 11 trillion per year by 2025 [1]. However, this value is spread across a spectrum of applications from different sectors.

As can be understood, this diversity creates challenges in how to implement and deploy the range of IoT applications, both in terms of cost and time to market, but above all in how to effectively integrate them into the business environments of any and all enterprises. The current predominant practice in building IoT applications is to start with a single problem related to the "thing" or a machine and build a bespoke solution for it. However, this traditional bottom-up Operational Technology (OT) approach to an IoT solution benefits from being complemented with a top-down enterprise business process approach.

Maturing Towards the Top of the IoT Technology Stack

Turning to the IoT technology stack, one can in a sense say that the focus of technology evolution, via research and innovation, has over the years gradually been sliding higher and higher up the stack: from the use of IP, networking, and connectivity, to the cloud, further to the edge, and in recent years to AI. This progression is natural because, firstly, the raison d’être of IoT is about insights and automation in changing the process and operational models of enterprises, as well as about new business models. Secondly, and simply, the technology's usefulness is gradually maturing higher and higher up the stack.

The IoT operational and delivery model is via IoT application enabling platforms that provide horizontal tools for building applications. To date, those platforms have primarily been about infrastructure services, like connectivity, protocol adaptation, device management, and secure device onboarding. The more data-centric aspects of these platforms are to a large extent still rather simple and point-application focused, like visualization of insights, simpler automation tasks based on rules, or applying a specific AI-based task like predictive analytics to understand when and how a machine might fail. What remains is how such point applications can be effectively integrated into, and enriched by, the more complex overall enterprise business processes.

As an example, the aforementioned machine failure prediction is just one event that is part of predictive maintenance; a machine cutting tool wearing out might actually be detected post-production as a different event. In the bigger picture, servicing a production machine has an impact on production plans, including customer risks that need to be folded into the equation, as does the availability of spare parts and service engineers. The production line operation might also need to be adjusted to shift the machine service to a more optimal time, all of which requires proper planning and adaptive processes, see Figure 1. The machine-centric bottom-up approach to the IoT application clearly needs to be complemented with an enterprise-centric top-down approach, and the solution will also rely on a larger set of diverse AI tools [2], all of which need to interact with enterprise systems like Enterprise Resource Planning (ERP) or Customer Relationship Management (CRM).

Figure 1: A predictive maintenance scenario.

 

A Complementing Enterprise-Centric Process Approach

Our approach to adding an enterprise-centric perspective to IoT applications can be summarized by the following main points.

  • Identifying high-level enterprise requirements, and modeling a top-level business process flow in an end-to-end enterprise-centric application context.
  • Further detailing of sub-processes to the level of reaching individual IoT resources and tasks that can be mapped to microservices, thus being application-independent and ideally reusable, [3]. This is a recursive activity at different interconnected levels of abstraction thus bridging the design time and run-time phases.
  • Mapping of the most granular sub-processes to functional components that can be represented by underlying IoT resources or microservices.
  • Orchestration and execution of both enterprise process relevant events and IoT event and data streams in a consistent framework of reusable components.

Our work is exploring the use of Business Process Model and Notation (BPMN) [4] for the enterprise process modeling part. BPMN was not originally developed to include IoT applications that involve industrial processes or OT equipment, but the exploration of applying BPMN to IoT is not new. BPMN is a promising baseline that can provide a missing link in making IoT applications enterprise-friendly, but it would require further development. We provide a summary of related work and point to new areas where we believe exploration and further work are needed, with the target primarily being the evolution of BPMN itself.

To date, the following challenges have been dealt with that either specifically address IoT, or are general and applicable to IoT.

  • Proper exposure of IoT resources as process resources in standard BPMN models, see e.g. [5] and [6] for a discussion. The introduction of IoT resources that capture real-world events and actions introduces a degree of unreliability that must be properly taken care of when modeling processes that involve IoT devices. It is important that the process design is done with clear objectives and boundaries to avoid making the process too complex and cumbersome. Extensions to BPMN to include IoT resources as process resources have been proposed [8].
  • Stream processing for BPMN: IoT provides enormous amounts of real-time data in the form of streams of events. Integrating these real-world data into existing business processes is valuable for making real-time decisions and optimizations, but BPMN in its current specifications lacks abstraction mechanisms to encapsulate event streaming. Event Stream Processing Units (SPUs) have been proposed as a concept for integrating event stream processing with business processes [6].
  • BPMN execution and microservices orchestration: Achieving scalability becomes possible when the right level of abstraction is coupled with dynamic orchestration in process modeling. When combined with the right engine, BPMN makes it easy to connect tasks in a workflow to microservices, and to do so in a way that does not violate the principles of loose coupling and service independence [7] (see the sketch after this list).
  • Extension of the BPMN Lane to include IoT resources [8]: This work introduces a new Sensing Task with native software components referencing the IoT Domain Model of the IoT Architecture (IoT-A), combining a swim-lane and a process-activity-centric resource model. It also presents the integration of the semantic model as parameters of the new task, along with the extension and practical testing of the graphical model in a business process modeling tool.
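As a minimal illustration of the loose coupling mentioned in the orchestration bullet above, the sketch below maps BPMN-style task identifiers to independently deployable microservices and runs a simple sequential workflow. The task names, the registry, and the toy engine are invented for illustration and do not correspond to any particular BPMN engine or to our implementation.

```python
# Hypothetical task implementations; in a real deployment each entry in the
# registry would point to a REST/gRPC endpoint rather than a local function.
def predict_failure(ctx):
    ctx["failure_risk"] = 0.87
    return ctx

def check_spare_parts(ctx):
    ctx["parts_available"] = True
    return ctx

def schedule_service(ctx):
    ctx["service_slot"] = "2019-10-02T06:00"
    return ctx

SERVICE_REGISTRY = {
    "PredictFailure": predict_failure,
    "CheckSpareParts": check_spare_parts,
    "ScheduleService": schedule_service,
}

# A simplified "process definition": an ordered list of BPMN task ids.
PROCESS = ["PredictFailure", "CheckSpareParts", "ScheduleService"]

def run_process(definition, context):
    """Toy engine: resolve each task id to a microservice and invoke it."""
    for task_id in definition:
        context = SERVICE_REGISTRY[task_id](context)
    return context

print(run_process(PROCESS, {"machine_id": "press-17"}))
```

The process definition knows only task identifiers, and each microservice knows nothing about the workflow it participates in, which is the loose coupling the bullet refers to.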

Given the above progress and considering further anticipated needs for applying BPMN for IoT, we also propose the following new areas to explore.

  • Separation of business process events and logic from IoT data streams. This implies an association with, but separation of, enterprise events from the underlying microservices that produce and consume IoT event streams, e.g. model training on IoT data, inferencing, etc., which become continuous background activities separated from the enterprise process events. A BPMN engine can then be used both for the orchestration of microservices and for the execution of the enterprise events, separate from the underlying IoT event streams (a sketch of this separation follows this list).
  • As IoT processing is generally distributed in physical space, enterprise sub-processes will benefit from being distributed too, for various reasons. This consideration needs to be taken into account throughout the modeling process, from the top level of abstraction down to the distribution of microservices across, e.g., a cloud and a factory shop floor. It will require extensions in modeling to capture requirements and diverse constraints, both from an OT perspective and from a distributed cloud and edge computing perspective.
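The separation proposed in the first bullet can be pictured as follows: a background worker consumes the raw IoT event stream continuously and only hands discrete, business-level events to the process engine. The readings, threshold, event name, and queue mechanics below are invented for illustration; the point is that raw stream data never crosses into the enterprise process layer.

```python
from queue import Queue

business_events = Queue()   # consumed by the process/BPMN engine

def iot_stream():
    """Stand-in for a continuous stream of vibration readings."""
    for value in [0.2, 0.3, 0.25, 0.9, 0.95, 0.3]:
        yield {"sensor": "vib-04", "value": value}

def stream_worker(threshold=0.8):
    """Background activity: inference on raw data, separate from the process."""
    for reading in iot_stream():
        if reading["value"] > threshold:
            # Only the enterprise-relevant event crosses the boundary.
            business_events.put({"event": "MaintenanceRequired",
                                 "machine": reading["sensor"]})

stream_worker()
while not business_events.empty():
    print("process engine receives:", business_events.get())
```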

BPMN also has the capability to model the interaction between a number of different participants and hence lends itself well to exploring IoT-centric processes that span value networks of collaborating actors. The separation of business events from IoT event streams would also allow interconnection across actors at those different levels as needed.

Figure 2: A conceptual framework to bridge enterprise processes with IoT.

 

Our approach to harnessing the power of IoT-enabled business process modeling is conceptually summarized in Figure 2. Taking the predictive maintenance use case and considering a top-level business process (Level 0) as in Figure 1, the process can be further detailed at subsequent levels according to the decomposition of the sub-processes involved in the higher-level flows, eventually orchestrating well-defined microservices in the underlying IT infrastructure layer. Level 0 of the business process depicts the flow of events at a 30,000 ft. view of the system integration, involving both the IT and OT functions of the manufacturing organization. Level 1 further decomposes the process into sub-processes for specific aspects of the organizational Lanes, and so on. As one drills down to the atomic activities (say, Level 5) that have the implementation service hooks to the underlying IoT resources and microservices, we observe the full potential of seamlessly bridging the enterprise perspective with the IoT perspective, covering the design, orchestration, and execution phases.
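The recursive decomposition described above can be represented as a nested process model in which only the leaf (atomic) activities carry an implementation hook to an IoT resource or microservice. The structure, activity names, and service URIs below are a hypothetical illustration of that idea, not part of any standard or of our actual model.

```python
# Hypothetical multi-level process model: inner dicts are sub-processes,
# strings at the leaves are hooks to microservices / IoT resources.
process_model = {
    "PredictiveMaintenance": {                       # Level 0
        "DetectDegradation": {                       # Level 1
            "CollectSensorData": "svc://edge/collector",
            "RunInference": "svc://cloud/failure-model",
        },
        "PlanService": {
            "CheckSpareParts": "svc://erp/inventory",
            "AdjustProductionPlan": "svc://mes/scheduler",
        },
    }
}

def leaf_hooks(node, path=()):
    """Walk the model down to atomic activities and yield their hooks."""
    for name, child in node.items():
        if isinstance(child, dict):
            yield from leaf_hooks(child, path + (name,))
        else:
            yield " / ".join(path + (name,)), child

for activity, hook in leaf_hooks(process_model):
    print(f"{activity}  ->  {hook}")
```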

Conclusions

In order to capture the full potential of IoT across the different sectors in which it is applied, means for effectively developing and integrating IoT applications that consider the richer and wider enterprise needs are still missing, but baseline work exists that can be further exploited. Our approach builds on previous work, and we propose further work that can help close the gap between the enterprise-process-centric and IoT-centric perspectives, which we believe will make IoT applications more enterprise-friendly.

References

  1. “The Internet of Things: Mapping the value beyond the hype”, McKinsey Global Institute, 2015, available from https://goo.gl/V2fnJm
  2. J. Höller, V. Tsiatsis, and C. Mulligan, “Toward a Machine Intelligence Layer for Diverse Industrial IoT Use Cases,” IEEE Intelligent Systems, vol. 32, no. 4, pp. 64–71, 2017.
  3. S. S. Arumugam, R. Badrinath, A. Hernandez Herranz, J. Höller, C. R. B. Azevedo, B. Xiao, and V. Tudor, “Accelerating Industrial IoT Application Deployment through Reusable AI Components”, in Proceedings of IEEE GIoTS 2019.
  4. Business Process Model and Notation, available from https://www.bpmn.org
  5. Haller, Stephan and Carsten Magerkurth. “The Real-time Enterprise: IoT-enabled Business Processes.” (2011).
  6. Appel, S., Kleber, P., Frischbier, S., Freudenreich, T. & Buchmann, A. (2014). Modeling and execution of event stream processing in business processes. Information Systems, 46, 140–156.
  7. “BPMN and Microservice Orchestration” https://zeebe.io/blog/2018/08/bpmn-for-microservices-orchestration-a-primer-part-1/
  8. Meyer, S., Ruppen, A. & Magerkurth, C. (2013). Internet of things-aware process modeling: integrating iot devices as business process resources. In International conference on advanced information systems engineering (pp. 84–98).

 

Jan Höller is a Research Fellow at Ericsson Research, where he is responsible for defining and driving the IoT technology and research strategies and for contributing to the corresponding corporate strategies. He established Ericsson's research activities on IoT over a decade ago. Jan is a co-author of the book "Internet of Things: Technologies and Applications for a New Age of Intelligence", recently released in its 2nd edition. He has held various positions in Strategic Product Management and Technology Management and has, since he joined Ericsson Research in 1999, led different research activities and research groups. He served for a number of years on the Board of Directors of the IPSO Alliance, the first IoT alliance, formed back in 2008. Jan currently serves on the Board of Directors of the Open Mobile Alliance and chairs the Networking Task Group in the Industrial Internet Consortium. He is a frequent speaker at industrial and academic conferences and events.

 

Senthamiz Selvi A is an Experienced Researcher at Ericsson Research, India, currently focusing on the areas of the Internet of Things and Artificial Intelligence. She is an HFI-Certified Usability Analyst. Selvi joined Ericsson in 2009, working on the Business Support System portfolio products as a Software Architect before moving to the corporate research unit. She received her bachelor's degree in Computer Science and Technology from Usha Mittal Institute of Technology, Mumbai. Her current research interests include IoT, Machine Learning, Distributed Ledger Technologies, and Business Process Modelling.

 

 

Why Are Trusted Execution Environments and Security by Separation Important on IoT Edge Devices?

Carlos Moratelli, Ramão Tiago Tiburski, Sergio Johann, Everton de Matos, and Fabiano Hessel
September 16, 2019

 

 

The Internet is quickly changing into a model where billions of everyday objects will be interconnected, which we call the Internet of Things (IoT). Traditionally, IoT devices communicate directly with the Cloud, but this is changing towards a layered architecture, as the direct IoT-Cloud model works poorly for a fair share of applications.

For example, the amount of data generated by sensors will be prohibitive in some cases, as seen in connected cars, which can create tens of megabytes of data each second [1]. Some long-range, low-bandwidth radio technologies, like those provided by Sigfox or NB-IoT, are charged by the amount of communication, making it desirable to minimize data exchange. Some applications, such as voice recognition, require fast responses. In this context, Fog and Edge layers were added to the IoT architecture [1] as an alternative to reduce cloud communication and enable faster responses. Sensors will therefore communicate with nearby devices (edge) interconnected by medium-range networks (fog). Data processing and local decisions will be performed at the Fog/Edge layers, avoiding additional communication with the Cloud [2].

Behind the well-known benefits of IoT hides an obscure threat: digital security risks. Security has emerged as one of the most critical concerns for broad IoT adoption. Imagine the Internet flooded by millions of potentially vulnerable edge devices with significant processing power. Such devices will bring their vulnerabilities into private networks, turning the Internet into a fertile environment for hackers willing to steal sensitive information or to perform denial-of-service and denial-of-sleep attacks. Improving the security of all devices connected to the Internet is a vital concern for the future of the IoT [3].

In the race for safer devices, a range of technologies can be applied. In this article, we discuss how two fundamental security trends can be put together to build the foundation of IoT edge security. First, the use of a Trusted Execution Environment (TEE) is essential to guarantee software and data integrity. A TEE requires separation to allow the concurrent execution of multiple isolated flows, so security by separation is also addressed.

Figure 1: Establishing a CoT from hardware to higher software layers.

 

Trusted Execution Environment (TEE)

As many edge devices are placed in public environments with easy access by non-authorized personnel, it is necessary to guarantee that the running software has not been maliciously modified. Even devices without a physical interface may be attacked by having their code or data changed remotely. A TEE, consisting of a protected area of the machine's memory, allows for the detection of unintended software substitution. In this environment, application code and data are verified for confidentiality and integrity using cryptography before execution [4]. Two elements are needed to build a TEE: the Root of Trust (RoT) and the secure boot process. Together, they result in what is called the Chain of Trust (CoT).

The RoT is a trusted element that cannot be changed and constitutes the foundation of the device's software integrity [4]. A typical implementation approach consists of hardware capable of performing software verification based on a cryptographic key stored in write-once memory. The chip manufacturer is responsible for providing support for verification and for the storage memory. In this scheme, developers are responsible for the software stack, so updates are possible even on devices already in the field.

The RoT enables the secure boot process, in which only verified software can be executed at the device's startup. This mechanism involves a set of verifications at all layers of the system's software, up to the application level, implementing end-to-end security and defining a Chain of Trust (CoT). Figure 1 describes this scheme in a multi-layered environment that includes an embedded virtualization layer. First, the hardware authenticates the bootloader. If successful, the bootloader is considered a trusted element and is allowed to verify the next boot stage, in this case the hypervisor. Once verified, the hypervisor boots up and checks its domains before they boot. Note that non-trusted domains can coexist with trusted domains on the same device, which is addressed further in the following section.
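The chain-of-trust idea can be sketched as a sequence of integrity checks, where each stage is verified before control is handed over to it. The sketch below uses plain SHA-256 digests as a stand-in for the signature verification a real RoT would perform in hardware, and all image contents and expected values are invented for illustration.

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Boot stages as (name, image). On a real device these would be flash regions.
bootloader = b"bootloader image v1.2"
hypervisor = b"hypervisor image v3.0"
trusted_domain = b"trusted domain image v0.9"

# Expected digests, conceptually anchored in the RoT (write-once memory).
expected = {
    "bootloader": digest(bootloader),
    "hypervisor": digest(hypervisor),
    "trusted_domain": digest(trusted_domain),
}

def secure_boot(stages):
    """Verify each stage before 'executing' it; halt on the first mismatch."""
    for name, image in stages:
        if digest(image) != expected[name]:
            raise RuntimeError(f"secure boot halted: {name} failed verification")
        print(f"{name}: verified, handing over control")

secure_boot([("bootloader", bootloader),
             ("hypervisor", hypervisor),
             ("trusted_domain", trusted_domain)])
```

If any image were tampered with, its digest would no longer match the anchored value and the boot sequence would stop at that link of the chain.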

Security by Separation

Current edge devices require significant processing power in order to handle sensor data, make decisions, and communicate over the Fog layer. As a consequence, software complexity increases, and multiple separated execution flows are required. Enforcing isolation between the flows in a lightweight way, while still maintaining the TEE, can be challenging. In a compromised system, an attacker may try to spread the attack to other subsystems, taking control of all possible functionalities; this is called lateral movement and is a widely used tactic. Separation can be used to avoid lateral movement, thus helping to keep the TEE's integrity. One way to achieve separation is virtualization, which creates logical isolation and allows multiple applications to share the underlying hardware, each unaware of the other instances.

Although virtualization is a well-established technology in the Cloud, IoT virtualization is still in development. The requirements for embedded systems virtualization differ from those of enterprise systems, as restrictions on response time, processing power, memory size, and battery life are the primary concerns. The natural starting point for embedded virtualization was to adapt hypervisors widely used in server virtualization to embedded systems. However, their size and complexity proved unacceptable for small embedded devices, which motivated the appearance of hypervisors specially designed for embedded virtualization, as seen in Tiburski et al. [5]. Among the goals for the development of embedded hypervisors, two are frequently addressed: keeping memory requirements low and providing some level of support for real-time applications.

A strategy to make hypervisors lightweight is to simplify, or even remove, subsystems that are not necessary for embedded systems. Although memory management is essential for virtualization, since it provides the basis for separation, it must be adapted for IoT. For example, the swapping subsystem is unnecessary, and the paging implementation can be radically reduced while strong separation between domains is still enforced. Memory management can be even simpler if the processor implements hardware support for virtualization. If carefully designed, virtualization can provide security by separation on devices even smaller than those reached by containerization. Although containerization is known as lightweight virtualization, it still requires an underlying operating system (OS), like Linux. Hypervisors for IoT are implemented as bare-metal (also known as type-1) hypervisors, controlling the hardware directly and dispensing with an underlying OS.

Conclusion

The combination of TEE and virtualization can be used to provide integrity checks over multiple domains. The hypervisor guarantees that, once a domain is compromised, the attack will not spread to other domains, hence allowing the coexistence of trusted and non-trusted environments. Additionally, different vendors can deliver their own domains with custom application-defined signatures, making it possible to verify an application for non-repudiation and preventing a vendor from denying its responsibility or role. Finally, virtualization can go even deeper than containerization on embedded systems, allowing cheaper devices to be used at the edge.

References

  1. M. Chiang and T. Zhang. Fog and IoT: An Overview of Research Opportunities. IEEE Internet of Things Journal, 3(6):854–864, Dec 2016
  2. OpenFog Consortium. OpenFog Reference Architecture for Fog Computing. Technical report, 02 2017.
  3. PeiYun Zhang, Mengchu Zhou, and Giancarlo Fortino. Security and trust issues in Fog computing: A survey. Future Generation Computer Systems, 88, 05 2018.
  4. M. Sabt, M. Achemlal, and A. Bouabdallah. Trusted Execution Environment: What It is, and What It is Not. In IEEE Trustcom, volume 1, pages 57–64, Aug 2015.
  5. R. T. Tiburski, C. R. Moratelli, S. F. Johann, M. V. Neves, E. d. Matos, L. A. Amaral, and F. Hessel. Lightweight security architecture based on embedded virtualization and trust mechanisms for IoT edge devices. IEEE Communications Magazine, 57(2):67–73, February 2019.

 


 

Carlos Roberto Moratelli received his Ph.D. in computer science from PUCRS. He is an adjunct professor at UFSC. He worked for ten years in the telecommunications industry on software engineering related to embedded systems. His research interests are embedded real-time systems, embedded Linux, and virtualization for embedded systems.

 

Ramão Tiago Tiburski received his M.S. degree in computer science from PUCRS. He is a Ph.D. student in computer science at PUCRS and a professor at the Federal Institute of Santa Catarina (IFSC). His research interests are IoT, fog and edge computing, and security for resource-constrained IoT devices.

 

Sérgio F. Johann (sergio.filho@pucrs.br) received his Ph.D. degree in computer science from PUCRS. He is an adjunct professor at PUCRS, Brazil. He has experience in computer architecture design and organization, operating systems, embedded systems (design and integration), embedded software support, real-time systems, and control systems.

 

Everton de Matos received his M.S. degree in computer science from PUCRS. He is an adjunct professor at Meridional Faculty (IMED) and a Ph.D. student in computer science at PUCRS. His research interests are IoT, middleware, fog and edge computing, context-awareness, and context sharing.

 

Fabiano Hessel (IEEE Member) is a Full Professor of Computer Science at PUCRS. He received his Ph.D. in computer science from UJF, France (2000). He has served as General and Program Chair on several committees of prestigious conferences and journals. His research interests are embedded real-time systems, RTOS, and MPSoC systems applied to IoT and smart cities.

 

 

Breaching Boundaries and Building Frameworks - Tapping into Future IoT Industry Growth with oneM2M

Chonggang Wang
September 16, 2019

 

In the earliest days of the Internet of Things (IoT), when the market first began to take shape, solution providers centered their focus on connectivity. This – the intricacies involved in connecting billions of devices and assets to the Internet – was just the beginning.

In 2015, a macro-assessment of IoT carried out by the worldwide management consultancy McKinsey quantified the market opportunity in nine categories, ranging from home automation and factories to personal monitoring (health and fitness), smart city applications, and more. Within these verticals and domains there can be hundreds, or thousands, of individual applications, so there is vast potential for technology providers and connectivity service businesses to target possible new connected devices.

Fast-forward to today, however, and the impetus is shifting. Increasingly, the primary objective for the industry is now data: specifically, how to understand, interpret, and handle the data that comes from connected assets. Operators, developers, and service providers are aiming to find ways to connect, manage, analyze and, eventually, share data among different organizations, which will in turn break down boundaries and maximize the industry's potential growth.

The Value of Data Models

The heterogeneous nature of IoT introduces another layer of complexity, in addition to the issue of large-scale and distributed data management. Making data interoperable between partners in a supply chain, across application silos, or between vendors of interchangeable devices and sensors is a challenge, given the different protocols, architectures, and standards used across this 'system of systems'.

A solid framework built on a good data model can solve these issues. In other words, IoT data modeling offers an approach that can more efficiently describe, interpret, analyze, and share data among heterogeneous IoT applications and devices.

Fortunately, several data models already exist. Many of them have been developed by different Standards Development Organizations, and some target specific IoT vertical applications or domains. For example, the Smart Appliances REFerence (SAREF) ontology provides a shared model for home appliances. Data models from the Open Geospatial Consortium (OGC) are aimed more at the geosciences and environmental domain. The Open Connectivity Foundation (OCF) specifies data models based on vertical industries such as automotive, healthcare, industrial, and the smart home. The World Wide Web Consortium (W3C) Thing Description (TD) provides vocabularies to describe physical things but does not have much focus on data.

In contrast, Schema.org operates as a collaborative community and aims to provide more general and broad vocabularies and schemas for structured data on the Internet. A collaborative approach to integrating and unifying the various data models is critical and necessary to work across IoT application and organizational boundaries, and it requires cooperative efforts among different industry bodies.

oneM2M’s Role in Data Model Standardization

oneM2M, the global standardization body, was established to create a body of maximally reusable standards to enable IoT applications across different verticals. oneM2M focuses on common services, such as semantics, device management, and security, that are needed in all IoT applications. The oneM2M standard takes the role of a horizontal IoT service layer between applications and the underlying connectivity networks. In doing so, it masks technology complexity for application developers and hardware providers.

In other words, oneM2M supports the capability to interwork with various local/proximal networks, each of which tends to use its own data model. As part of the oneM2M interworking framework, it provides a layer of abstraction by mapping these data models onto one another via the oneM2M resource and data model. As shown in Figure 1, applications can thus communicate using one data model (the oneM2M resource and data model), and oneM2M handles the translations to the various proximal network data models, simplifying applications.
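A minimal sketch of the kind of translation Figure 1 depicts: readings expressed in two hypothetical, technology-specific formats are mapped onto one common representation, so that applications are written once against the common model. The field names and the "common" schema below are invented for illustration and are not the actual oneM2M resource definitions.

```python
# Two hypothetical proximal-network payloads for the same kind of reading.
zigbee_style = {"ep": 3, "attr": "temp", "val": 21.7}
ocf_style = {"rt": "oic.r.temperature", "temperature": 21.7, "units": "C"}

def to_common(payload):
    """Map a technology-specific payload onto one common shape."""
    if payload.get("attr") == "temp":
        return {"type": "temperature", "value": payload["val"], "unit": "C"}
    if payload.get("rt") == "oic.r.temperature":
        return {"type": "temperature",
                "value": payload["temperature"],
                "unit": payload.get("units", "C")}
    raise ValueError("unknown payload format")

# The application only ever sees the common representation.
for reading in (zigbee_style, ocf_style):
    print(to_common(reading))
```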

Figure 1: oneM2M provides interworking and mapping between oneM2M and various vertical data models.

 

Amongst its various standardization efforts, oneM2M takes a collaborative approach in developing its data model-related specifications. For example, one of its technical specifications, TS-0005 for management data model, is the result of a collaboration between oneM2M and the Open Mobile Alliance (OMA).

Another specification, TS-0023, laid the groundwork for a Smart Devices Template (SDT) that was first applied to create a Home Appliances Information Model and Mapping. In the next release of oneM2M, Release 4, the underlying data model principles will be extended to support information modeling and mapping for vertical industries, such as smart cities.

Tapping into the Potential for IoT Data Models

The impact of IoT is already far-reaching, but the full promise of it is yet to be realized. The renewed focus on frameworks to manage data models across application verticals and domains sets the stage for unlocking the next phase of the IoT industry’s growth.

Harnessing the array of IoT technologies and the data they generate, recognizing the true value of data-driven analytics, continuing to make gains with standardization, and improving cross-company collaboration means we can create more space for increased innovation and maximal efficiency, and realize the full game-changing potential of the IoT, which will undoubtedly help shape our future.


 

Chonggang Wang (Chonggang.Wang@InterDigital.com) received his Ph.D. degree from Beijing University of Posts and Telecommunications in 2002. He is a principal engineer at InterDigital, Inc., where his research interests span the quantum internet, IoT architecture and protocols, semantic computing and services for IoT, intelligent and autonomous IoT systems, and edge computing. He is a Fellow of the IEEE for his contributions to IoT enabling technologies. He is also a member of Convida Wireless, a joint venture partnership between InterDigital and Sony focused on IoT research and development.