Artificial Intelligence for IoT Cookbook

By: Michael Roshak

Overview of this book

Artificial intelligence (AI) is rapidly finding practical applications across a wide variety of industry verticals, and the Internet of Things (IoT) is one of them. Developers are looking for ways to make IoT devices smarter and to make users’ lives easier. With this AI cookbook, you’ll be able to implement smart analytics using IoT data to gain insights, predict outcomes, and make informed decisions, and you’ll cover advanced AI techniques that facilitate analytics and learning in a range of IoT applications. Using a recipe-based approach, the book takes you through essential processes such as data collection, data analysis, modeling, statistics and monitoring, and deployment. You’ll use real-life datasets from smart homes, industrial IoT, and smart devices to train and evaluate models, from simple to complex, and to make predictions with the trained models. Later chapters cover the key challenges faced while implementing machine learning, deep learning, and other AI techniques, such as natural language processing (NLP), computer vision, and embedded machine learning, for building smart IoT systems. In addition, you’ll learn how to deploy models and improve their performance with ease. By the end of this book, you’ll be able to package and deploy end-to-end AI apps and apply best-practice solutions to common IoT problems.
Table of Contents (11 chapters)

Enriching data using Kafka's KStreams and KTables

Often in IoT there are external data sources that must be included, such as weather data that affects the performance of a device, or data from other nearby devices. An easy way to bring such data in is to use Kafka's KSQL Server. As in the previous recipe, we are going to use Confluent Cloud's KSQL Server, which is available with a Confluent Cloud subscription.

In this recipe, we are going to take data from a weather service topic and put it into a KTable. A KTable is similar to a database table. All data coming into Kafka arrives as key-value pairs. With a KTable, when a record arrives with a new key, we insert it into the KTable; when a record arrives with a key that already exists in the KTable, we update the existing entry. We are also going to convert our topic into a KStream. This lets us run standard SQL-like queries over our table and stream, so that we can, for example, query the current weather and join it to the engine data...
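The upsert semantics of a KTable versus the append-only semantics of a KStream can be sketched in plain Python. This is a minimal illustration of the behavior described above, not Kafka code: the record keys (`station-1`, `station-2`) and the `temp_c` field are illustrative, and the final `enrich` step mimics, in miniature, joining a stream record against the table's current value for its key.

```python
# A KTable keeps only the latest value per key (insert on a new key,
# update on an existing key); a KStream keeps every record, in order.
# These dicts/lists stand in for the Kafka structures for illustration.

def apply_to_ktable(table, key, value):
    """Upsert: new keys are inserted, existing keys are overwritten."""
    table[key] = value

def append_to_kstream(stream, key, value):
    """Append-only: every incoming record is retained."""
    stream.append((key, value))

weather_table, weather_stream = {}, []
incoming = [
    ("station-1", {"temp_c": 20.0}),
    ("station-2", {"temp_c": 17.5}),
    ("station-1", {"temp_c": 21.3}),  # same key: updates the table entry
]
for key, value in incoming:
    apply_to_ktable(weather_table, key, value)
    append_to_kstream(weather_stream, key, value)

print(len(weather_table))   # 2 entries: one current value per station
print(len(weather_stream))  # 3 records: the full history

def enrich(record, table):
    """Sketch of a stream-table join: attach the table's current
    value for the record's key to the record itself."""
    key, value = record
    return {**value, **table.get(key, {})}

# An engine-data record keyed by the same station picks up the
# *latest* weather reading, not the first one.
engine_record = ("station-1", {"rpm": 3200})
print(enrich(engine_record, weather_table))
```

The key point the sketch makes is that a join against a KTable always sees the current value per key (21.3 for `station-1`), while the KStream preserves every reading for query or replay.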