Hands-On Artificial Intelligence for IoT - Second Edition

By: Amita Kapoor

Overview of this book

There are many applications that use data science and analytics to gain insights from terabytes of data. These apps, however, do not address the challenge of continually discovering patterns for IoT data. In Hands-On Artificial Intelligence for IoT, we cover various aspects of artificial intelligence (AI) and its implementation to make your IoT solutions smarter. This book starts by covering the process of gathering and preprocessing IoT data gathered from distributed sources. You will learn different AI techniques such as machine learning, deep learning, reinforcement learning, and natural language processing to build smart IoT systems. You will also leverage the power of AI to handle real-time data coming from wearable devices. As you progress through the book, techniques for building models that work with different kinds of data generated and consumed by IoT devices such as time series, images, and audio will be covered. Useful case studies on four major application areas of IoT solutions are a key focal point of this book. In the concluding chapters, you will leverage the power of widely used Python libraries, TensorFlow and Keras, to build different kinds of smart AI models. By the end of this book, you will be able to build smart AI-powered IoT apps with confidence.

Big data and IoT


IoT has connected things never previously connected to the internet, such as car engines, resulting in the generation of large, continuous data streams. The following figure shows IHS projections for the number of connected devices, in billions, over the coming years. Their estimate shows that the number of IoT devices will reach 75.44 billion by 2025:

Prediction about the growth of IoT devices by 2025

Note

The full whitepaper, IoT platforms: enabling the Internet of Things, by IHS is available as a PDF at: https://cdn.ihs.com/www/pdf/enabling-IOT.pdf.

The reduction in sensor cost, efficient power consumption techniques, a wide range of connectivity options (infrared, NFC, Bluetooth, Wi-Fi, and so on), and the availability of cloud platforms that support IoT development and deployment are the major reasons for the pervasiveness of IoT in our homes, personal lives, and industry. This has also motivated companies to think about providing new services and developing new business models. Some examples include the following:

  • Airbnb: It connects people so that they can rent out spare rooms and cottages to one another, and it earns a commission.
  • Uber: It connects cab drivers with travelers. The location of the traveler is used to assign them to the nearest driver. 

The amount of data generated in the process is both voluminous and complex, necessitating a big data approach. Big data and IoT are almost made for each other; the two work in conjunction.

Things are continuously generating enormous data streams that report their status, such as temperature, pollution level, geolocation, and proximity. The data generated is in time-series format and is autocorrelated. The task becomes challenging because the data is dynamic in nature. Also, the data generated can be analyzed at the edge (sensor or gateway) or in the cloud. Before sending the data to the cloud, some form of IoT data transformation is performed. This may involve the following (a short preprocessing sketch follows the list):

  • Temporal or spatial analysis
  • Summarizing the data at the edge
  • Aggregation of data
  • Correlating data in multiple IoT streams
  • Cleaning data
  • Filling in the missing values
  • Normalizing the data
  • Transforming it into different formats acceptable to the cloud
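As a concrete illustration of several of these steps, the following is a minimal sketch, assuming pandas is used at the edge: it fills missing values, aggregates readings into five-minute windows, normalizes them, and serializes the result. The sensor name, sampling times, and window length are illustrative assumptions, not the book's implementation.

# A minimal sketch (assumptions noted above) of edge-side preprocessing
# of an IoT time-series stream using pandas.
import numpy as np
import pandas as pd

# Simulated raw readings from a temperature sensor, sampled irregularly
raw = pd.DataFrame(
    {"temperature": [21.3, np.nan, 22.1, 22.4, np.nan, 23.0]},
    index=pd.to_datetime(
        ["2024-01-01 00:00", "2024-01-01 00:01", "2024-01-01 00:03",
         "2024-01-01 00:04", "2024-01-01 00:06", "2024-01-01 00:07"]
    ),
)

# Fill in missing values by forward-filling the last known reading
clean = raw.ffill()

# Summarize/aggregate at the edge: one mean value per 5-minute window
summary = clean.resample("5min").mean()

# Normalize to zero mean and unit variance before sending to the cloud
normalized = (summary - summary.mean()) / summary.std()

# Transform into a format acceptable to the cloud, for example JSON
payload = normalized.reset_index().to_json(orient="records", date_format="iso")
print(payload)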

At the edge, complex event processing (CEP) is used to combine data from multiple sources and infer events or patterns. 
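CEP rules are usually written for a dedicated engine, but the idea can be illustrated in a few lines of plain Python. The following sketch assumes two hypothetical sensor streams (temperature and smoke) and an illustrative threshold; it is an assumption-laden example, not the book's implementation.

# A minimal CEP-style sketch: readings from two hypothetical sensors are
# combined to infer a single higher-level event.
def infer_events(temperature_stream, smoke_stream, temp_threshold=60.0):
    """Yield a 'fire_risk' event whenever both streams exceed their limits
    in the same time step; sensor names and threshold are illustrative."""
    for (t, temp), (_, smoke_detected) in zip(temperature_stream, smoke_stream):
        if temp > temp_threshold and smoke_detected:
            yield {"event": "fire_risk", "time": t, "temperature": temp}

temperature = [(0, 25.0), (1, 64.5), (2, 66.0)]
smoke = [(0, False), (1, True), (2, True)]

for event in infer_events(temperature, smoke):
    print(event)  # two correlated readings produce a single inferred event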

The data is analyzed using stream analytics; that is, analytical tools are applied to the stream of data, while the insights and rules they use are developed externally in an offline mode. The model is built offline and then applied to the stream of data generated. The data may be handled in different manners (a sketch of this offline/online split follows the list):

  • Atomic: A single data point is processed at a time
  • Micro batching: A group of data points is processed per batch
  • Windowing: Data within a time frame is processed per batch
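As a rough illustration of the offline/online split, the following sketch fits a model on historical data offline and then applies it to a simulated stream in micro-batches. The data, the choice of a scikit-learn linear model, and the batch size are assumptions made for the example.

# A minimal sketch of applying a model built offline to a live stream
# using micro-batching; data and window size are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

# Offline phase: fit a model on historical readings (here, a toy trend)
history_x = np.arange(100).reshape(-1, 1)
history_y = 0.5 * history_x.ravel() + np.random.normal(0, 0.1, 100)
model = LinearRegression().fit(history_x, history_y)

# Online phase: score the incoming stream one micro-batch at a time
def micro_batches(stream, batch_size=10):
    for start in range(0, len(stream), batch_size):
        yield stream[start:start + batch_size]

incoming = np.arange(100, 150).reshape(-1, 1)  # simulated new sensor readings
for batch in micro_batches(incoming):
    predictions = model.predict(batch)  # model trained offline, applied online
    print(predictions[:3])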

Stream analytics can be combined with CEP to combine events over a time frame and correlate patterns in order to detect special patterns (for example, an anomaly or failure).
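One simple way to approximate this combination is to keep a rolling window over the stream and flag readings that deviate strongly from the window's statistics. The sketch below assumes a z-score rule, a window of 20 readings, and synthetic data; these are illustrative choices, not the book's method.

# A minimal sketch of correlating readings over a time window to flag
# anomalies; the z-score rule and window length are illustrative.
from collections import deque
import statistics

def detect_anomalies(stream, window_size=20, z_threshold=3.0):
    """Flag readings that deviate strongly from the recent window's statistics."""
    window = deque(maxlen=window_size)
    for value in stream:
        if len(window) == window.maxlen:
            mean = statistics.mean(window)
            stdev = statistics.pstdev(window) or 1e-9
            if abs(value - mean) / stdev > z_threshold:
                yield value  # anomalous reading detected within this time frame
        window.append(value)

readings = [20.0 + 0.1 * i for i in range(50)] + [95.0]  # sudden spike at the end
print(list(detect_anomalies(readings)))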