Hands-On Industrial Internet of Things

By: Giacomo Veneri, Antonio Capasso

Overview of this book

We live in an era where advanced automation is used to achieve accurate results. To set up an automation environment, you need to first configure a network that can be accessed anywhere and by any device. This book is a practical guide that helps you discover the technologies and use cases of the Industrial Internet of Things (IIoT). Hands-On Industrial Internet of Things takes you through the implementation of industrial processes and specialized control devices and protocols. You'll study the process of identifying and connecting to different industrial data sources gathered from different sensors. Furthermore, you'll be able to connect these sensors to cloud platforms, such as AWS IoT, Azure IoT, Google IoT, and OEM IoT platforms, and extract data from the cloud to your devices. As you progress through the chapters, you'll gain hands-on experience in using open source tools such as Node-RED, Kafka, Cassandra, and Python. You will also learn how to develop streaming and batch-based machine learning algorithms. By the end of this book, you will have mastered the features of Industry 4.0 and be able to build stronger, faster, and more reliable IoT infrastructure in your industry.

IoT background

Over the last few years, the IoT has become a viral topic in the digital world, one that is discussed, debated, and analyzed in many channels, forums, and contexts. This is common for all new or emerging software technologies. Developers, architects, and salespeople discuss the capabilities, impacts, and market penetration of the new technology in forums, blogs, and specialized social media. Think, for example, of Docker or Kubernetes, which are changing the ways in which software applications are designed, implemented, and deployed. They are having a tremendous impact on the digital world in terms of time to market, software life cycle, capabilities, and cost. Despite this, they remain primarily confined to their specialized fields. This is not the case for the IoT.

Over the last 10 years, the IoT has become a familiar topic in the mass media. This is because the IoT is more than just a new technology that impacts a restricted range of people or a specific market. It can be better understood as a set of technologies that impacts us all, and will change markets, even creating new ones. The IoT is changing our lives, feelings, and perceptions of the physical world daily, by modifying how we interact with it. The development of the IoT is a crucial moment in the history of humanity because it is changing our mindset, culture, and the way we live. Just like the internet age, we will have a pre-IoT phase and a post-IoT phase. The IoT era will not be an instantaneous transition, but a gradual and continuous shift during which the evolution never stops.

Currently, we are just at the beginning of this journey. Like the arrival of e-commerce or mobile applications, there is a certain time lag between when you hear about an upcoming technology and when it actually exists in the real world. But the change has started. We are moving toward a world in which we interact increasingly not with physical objects, but with their digital images that live in the cloud and communicate with other digital images. These images are integrated through a powerful injection of digital intelligence, which makes them capable of suggesting actions, making decisions autonomously, or providing new and innovative services. You might currently be able to regulate your heating system remotely, but if it lived in the cloud and received information from your car, your calendar, your geolocation, and the weather, then your heating system would be able to regulate itself. When an object lives in the cloud and interacts with other digital images in a web of artificial intelligence, that object becomes a smart object.

These developments might seem to be paving the way for a new and perfect world, but there is a dark side to the IoT as well. A lot of personal data and information is stored in the cloud so that artificial intelligence can extract information about us and profile our behaviors and preferences. From a different perspective, therefore, the cloud could also be seen as a sort of Big Brother, as in George Orwell's novel 1984. There is the possibility that our data and profiles could be used not just to enhance our lifestyles, but also for more malicious purposes, such as political or economic influence on a large scale.

An example of this was the Cambridge Analytica scandal that occurred in March 2018. It was widely reported at the time that this company had acquired and used the personal data of Facebook users from an external researcher who had told Facebook he was collecting it for academic purposes. This researcher was the founder of the third-party app thisisyourdigitallife, which was given permission by its 270,000 users to access their data back in 2015. By providing this permission, however, these users also unknowingly gave the app permission to access the information of their friends, which resulted in the data of about 87 million users being collected, the majority of whom had not explicitly given Cambridge Analytica permission to access their data. Shortly afterward, the media aired undercover investigative videos revealing that Cambridge Analytica was involved in Donald Trump's digital presidential campaign.

This book will not go into detail about the social, legal, political, or economic impacts of the IoT, but we wanted to highlight that it does have a dark side. More often than not, human history has demonstrated that a technology is not good or bad in itself, but instead becomes good or bad depending on how it is used by humans. This is true for the IoT. Its power is tremendous, and it is only just starting to be understood. Nobody yet knows how the IoT will develop from now on, but we are all responsible for trying to control its path.

History and definition

The IoT as a concept wasn't officially named until 1999. One of the first examples of the IoT was a Coca-Cola machine located at Carnegie Mellon University (CMU) in the early 1980s. Local programmers would connect through the internet to the refrigerated appliance, checking to see whether a drink was available and whether it was cold before making a trip to it.

Kevin Ashton, the Executive Director of Auto-ID Labs at MIT, was the first to describe the IoT in a presentation for Procter and Gamble. During his 1999 speech, Mr. Ashton stated as follows:

“Today, computers, and therefore the Internet, are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet was first captured and created by human beings by typing, pressing a record button, taking a digital picture, or scanning a bar code. The problem is, people have limited time, attention, and accuracy, all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.”

Kevin Ashton believed that radio-frequency identification (RFID) was a prerequisite for the IoT, and that if all devices were tagged, computers could manage, track, and inventory them.

In the first decade of the 21st century, several projects were developed to try to translate the IoT philosophy and Ashton's innovative approach into the real world. These first attempts, however, were not very successful. One of the most famous and emblematic cases was the Walmart mandate (2003). By placing RFID tags with embedded circuits and radio antennas on pallets, cases, and even individual packages, Walmart was supposed to be able to reduce inefficiencies in its massive logistics operations and slash out-of-stock incidents, thus boosting same-store sales.

In 2003, Walmart started a pilot project requiring all of its suppliers to put RFID tags, carrying electronic product codes, on all pallets and cases. In 2009, Procter and Gamble, one of the main suppliers involved in the project, announced that it would exit the pilot project after evaluating the benefits of RFID in merchandising and promotional displays.

The failure of the Walmart RFID project was caused by various factors:

  • Most of the technologies used were in their initial stages of development, and their performance was poor. Sensors provided little information, and Wi-Fi or LAN connectivity consumed a lot of power and bandwidth.
  • The sensors and connectivity devices were expensive due to the small market size.
  • There were no common standards for the emerging technologies, and there was a lack of interoperability between legacy systems.
  • Business cases were not very accurate.
  • The technology infrastructure and architecture were organized in vertical silos, with legacy hardware and middleware and little interaction between silos.
  • The technology infrastructure and software architecture were based on a client-server model that still belonged to the so-called second digital platform.

From 2008 onward, several changes, driven mainly by the mobile market, were introduced to deal with the preceding issues. These included the following:

  • New higher-performing processors were produced on a large scale at lower cost. These processors supported commercial and/or open operating systems.
  • New sensors that were much more mature, with embedded computation capabilities and high performance at a low cost.
  • New network and wireless connectivity, which allowed the user to interconnect the devices with each other and to the internet by optimizing bandwidth, power consumption, latency, and range.
  • Sensors and devices using commercial off-the-shelf (COTS) components.
  • The third, cloud-based, digital platform.

Due to these changes, the IoT evolved into a system that used multiple technologies. These included the internet, wireless communication, micro-electromechanical systems (MEMS), and embedded systems, such as those used to automate public buildings, homes, and factories, as well as wireless sensor networks, GPS, control systems, and so on.

The IoT consists of any device with an on/off switch that is connected to the internet. If it has an on/off switch, then it can, theoretically, be part of a system. This includes almost anything you can think of, from cell phones, to building maintenance systems, to the jet engine of an airplane. Medical devices, such as a heart monitor implant or a bio-chip transponder in a farm animal, are also part of the IoT because they can transfer data over a network. The IoT is a large digital network of things and devices connected through the internet. It can also be thought of as a horizontal technology stack linking the physical world to the digital world. By creating a digital twin of the physical object in the cloud, the IoT makes the object more intelligent, thanks to the interaction of the digital twin with the other digital images living in the cloud.

IoT enabling factors

What has changed from the Walmart scenario to make companies support the advent of the IoT? There is no one specific new technology or invention, but rather a set of already existing technologies that have matured. These have created an ecosystem and a technology environment that make the connection of different things possible, efficient, and easy from a technical perspective, profitable from a market perspective, and attractive from a production-cost perspective.

The technologies that have led to the evolution of IoT ecosystems are as follows:

  • New sensors that are much more mature, have more capabilities, and offer high performance at a lower cost. These smart sensors are natively designed to hide the complexity of the signal processing, and they interact easily through a digital interface. The smart sensor is a system in itself, with a dedicated chip for signal processing. The hardware for signal processing is embedded in each sensor and miniaturized to the point that it is part of the sensor package. Smart sensors are defined by the IEEE 1451 standard as sensors with a small memory and standardized physical connections to enable communication with the processor and the data network. In other words, a smart sensor is the combination of a normal sensor with signal conditioning, embedded algorithms, and a digital interface (a minimal code sketch of this pipeline follows this list). The principal catalyst for the growth of smart-sensing technology has been the development of microelectronics at reduced cost. Many silicon manufacturing techniques are now being used to make not only sensor elements, but also multilayered sensors and sensor arrays that are able to provide internal compensation and increased reliability. The global smart sensor market was valued at between $22 billion and $25.96 billion in 2017, and it is forecast to reach between $60 billion and $80 billion by the end of 2022.
  • New networks and wireless connectivity, such as personal area networks (PANs) or low-power networks (LPNs), interconnect sensors and devices in order to optimize their bandwidth, power consumption, latency, and range. In PANs, a number of small devices connect, directly or through a main device, to a LAN, which has access to the internet. Low-power wide-area networks (LPWANs) are wireless networks designed to allow long-range communication at a low bit rate among battery-operated devices. Their low power, low bit rate, and intended use distinguish these types of networks from the already existing wireless WANs, which are designed to connect users and businesses and carry more data, using more power. (More information on WANs can be found at https://en.wikipedia.org/wiki/Wireless_WAN.)
  • New processors and microprocessors coming from the world of mobile devices. These are very powerful and very cheap, and they have produced a new generation of sensors and devices based on standardized, inexpensive hardware driven by open and generic operating systems. These use common software frameworks as an interface, allowing a transition from legacy solutions, with tightly coupled hardware and software, to platforms built on COTS components and open software frameworks.
  • The battle between the big market players over the real-time operating system (RTOS), fought to gain a larger slice of new markets. This puts more sophisticated and powerful integrated development platforms at the maker's disposal.
  • Virtualization technology, which spans the data center, big data, and the cloud. This leads to the following features:
    • CPUs, memory, storage, infrastructures, platforms, and software frameworks available as services on demand, with flexible and tailored sizing. These are cheap and available without capital expenditure (CAPEX) investment.
    • Elastic repositories for storing and analyzing the onslaught of data.
    • A profitable and flexible operational expenditure (OPEX) model for CPU, memory, storage, and IT maintenance services. This creates a business case for migrating legacy data, infrastructure, and applications to the cloud, and makes the collection of big data and subsequent analytics possible.
  • The convergence of IT and operational technology (OT). This has led to the increasing adoption of COTS components in sectors in which hardware was traditionally developed with specific requirements, as is the case in industrial plants.
  • The diffusion of mobile devices and social networks has created a culture and mindset in which consumers expect to interact with the world through an app that shares related information.
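
To make the smart-sensor idea mentioned above more concrete, here is a minimal Python sketch, not taken from the book, that simulates an IEEE 1451-style pipeline: a noisy raw reading is conditioned on board, passed through a simple embedded threshold algorithm, and exposed through a JSON digital interface. The SmartTemperatureSensor class, its threshold, and its field names are invented purely for illustration.

```python
import json
import random
import statistics
import time


class SmartTemperatureSensor:
    """Toy model of an IEEE 1451-style smart sensor: a raw transducer plus
    on-board signal conditioning, an embedded algorithm, and a digital interface."""

    def __init__(self, sensor_id: str, window: int = 5):
        self.sensor_id = sensor_id
        self.window = window          # number of samples used for smoothing
        self.samples = []

    def _read_transducer(self) -> float:
        # Simulated raw analog reading (degrees Celsius, with noise).
        return 25.0 + random.uniform(-0.8, 0.8)

    def _condition_signal(self, raw: float) -> float:
        # Signal conditioning: keep a short window of samples and smooth the noise.
        self.samples.append(raw)
        self.samples = self.samples[-self.window:]
        return statistics.mean(self.samples)

    def read(self) -> str:
        # Digital interface: return the conditioned value plus a simple embedded
        # diagnostic (a threshold alarm) as a JSON document.
        value = self._condition_signal(self._read_transducer())
        return json.dumps({
            "sensor_id": self.sensor_id,
            "timestamp": time.time(),
            "temperature_c": round(value, 2),
            "alarm": value > 30.0,    # embedded algorithm: over-temperature check
        })


if __name__ == "__main__":
    sensor = SmartTemperatureSensor("line-1-temp")
    for _ in range(3):
        print(sensor.read())
        time.sleep(0.1)
```

A real smart sensor would run this logic on a dedicated signal-processing chip and expose the digital interface over I2C, SPI, or a fieldbus rather than as a Python method, but the division of labor is the same.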

The preceding factors are making it possible to transition from a vertical, legacy platform, with applications organized hierarchically and data confined in silos, to a horizontal, modular, cloud-based platform. This new platform uses a standardized API layer that provides high interoperability and the ability to share data and information between applications.

Let's consider what might happen if the Walmart project was carried out now. In 2003, the only RFID technology that existed was the active RFID system. Active RFID systems use battery-powered RFID tags that continuously broadcast their own signals. They provide a long read range, but they are also expensive and consume a lot of power. Passive RFID systems, on the other hand, use tags with no internal power source, which are instead powered by the electromagnetic energy transmitted from an RFID reader. They have a shorter read range, but they are embeddable, printable, and much cheaper, which makes them a better choice for many industries. Also, at the time of the Walmart project, there were no PANs or LPNs to capture and transmit the label data, meaning the developers had to adopt an expensive, wired connection to transfer the information. The data was then stored in a legacy database and processed by a custom application. If the Walmart project were to be carried out now, instead of in 2003, the tracking could be carried out by passive RFID tags. The data could be captured by a PAN and transmitted to the cloud to be processed by an application built on top of a common API and framework. This means that all data and information could easily be shared between the project partners. According to Forbes and Gartner, the IoT market and the number of connected devices are expected to grow strongly over the coming years.
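If this modernized pipeline were sketched in code, it might look like the following minimal Python example, which is not from the book: a stand-in for a passive RFID reader yields Electronic Product Code (EPC) strings, and each read is published as a JSON event to a cloud MQTT broker using the paho-mqtt client. The broker hostname, topic, reader ID, and EPC values are placeholders, and paho-mqtt is just one common choice of client, not necessarily what a production deployment would use.

```python
import json
import time

# Assumption: paho-mqtt is installed (pip install paho-mqtt). The broker
# hostname and topic below are placeholders, not real endpoints.
from paho.mqtt import publish

BROKER_HOST = "broker.example.com"   # hypothetical cloud MQTT endpoint
TOPIC = "warehouse/dock-07/rfid"     # hypothetical topic naming scheme


def read_epc_codes():
    """Stand-in for a passive RFID reader driver: yields Electronic Product
    Codes (EPCs) as tags pass through the reader's antenna field."""
    yield from [
        "urn:epc:id:sgtin:0614141.107346.2017",
        "urn:epc:id:sgtin:0614141.107346.2018",
    ]


for epc in read_epc_codes():
    event = {
        "epc": epc,
        "reader_id": "dock-07",
        "timestamp": time.time(),
    }
    # Each tag read becomes a small JSON event published to the cloud, where any
    # application built on the shared API layer can consume it.
    publish.single(TOPIC, json.dumps(event), hostname=BROKER_HOST, port=1883)
```

From the broker onward, the same event could be consumed by any partner application built on the shared API layer, which is exactly the kind of cross-partner data sharing the 2003 project lacked.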

IoT use cases

As previously discussed, the IoT is not just a specific technological innovation, but a radical change that will impact the whole of human society. This means that the IoT will affect nearly every aspect of our personal and professional lives and any sector of the economy, including the following:

  • Industrial and manufacturing
  • Supply chain
  • Retail
  • Financial and marketing
  • Healthcare
  • Transportation and logistics
  • Agricultural and environmental
  • Energy
  • Smart cities
  • Smart homes and buildings
  • Government and military
  • Security forces
  • Education
  • Sports and fitness

All of these are already involved in the digital transformation that has been caused by the IoT, and are likely to play a greater role in this in the future.

Across all uses of the IoT, the common feature is the smart object. From a qualitative perspective, a smart object is a multidisciplinary object which includes the following elements:

  • The physical product.
  • Sensors, microprocessors, data storage, controls, and software, managed by an embedded operating system.
  • Wired or wireless connectivity, including interfaces and protocols. This is used to connect the product to its user, all instances of the product to its vendor, or the product to other types of products and external data sources.

In another definition, the article How Smart, Connected Products Are Transforming Competition, written by Michael E. Porter and James E. Heppelmann, details four increasing levels that classify the smartness of an object or product:

  • Monitoring: Monitoring of product conditions, external operation, and usage. This enables alerts and notifications of changes.
  • Control: Software embedded in the product or in the cloud enables control of product functions, and/or personalization of the user experience.
  • Optimization: The previous capabilities are used to create algorithms to optimize the product's operation and use. They enhance product performance, and/or allow features such as predictive diagnostics, service, repair, and so on.
  • Autonomy: The combination of the previous capabilities produces a product with autonomous control for self-coordination with other systems and/or self-diagnosis or services.
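
As a rough illustration of how these four levels might stack up in a single object, the following Python sketch, which is not taken from Porter and Heppelmann's article, models a hypothetical SmartHeater whose methods correspond to monitoring, control, optimization, and autonomy. The class, its setpoints, and the away_from_home signal are invented for illustration only.

```python
import statistics


class SmartHeater:
    """Toy illustration of the four capability levels applied to one object."""

    def __init__(self):
        self.setpoint_c = 20.0
        self.history = []

    # Level 1 - Monitoring: expose current conditions and usage.
    def report(self, current_temp_c: float) -> dict:
        self.history.append(current_temp_c)
        return {"temperature_c": current_temp_c, "setpoint_c": self.setpoint_c}

    # Level 2 - Control: product functions can be driven remotely (e.g. from the cloud).
    def set_setpoint(self, setpoint_c: float) -> None:
        self.setpoint_c = setpoint_c

    # Level 3 - Optimization: use the monitored data to tune behaviour.
    def optimize(self) -> None:
        if len(self.history) >= 3:
            # Nudge the setpoint toward the recently observed average temperature.
            self.setpoint_c = round(statistics.mean(self.history[-3:]), 1)

    # Level 4 - Autonomy: combine the layers and act without being asked.
    def autonomous_step(self, current_temp_c: float, away_from_home: bool) -> str:
        self.report(current_temp_c)
        if away_from_home:
            self.set_setpoint(16.0)   # self-coordination with an external signal
        else:
            self.optimize()
        return "heating" if current_temp_c < self.setpoint_c else "idle"


if __name__ == "__main__":
    heater = SmartHeater()
    print(heater.autonomous_step(current_temp_c=18.5, away_from_home=False))
```

Each level builds on the one below it: without monitoring there is no data to optimize against, and without remote control there is nothing for the autonomous layer to act through.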