Artificial Intelligence By Example - Second Edition

By: Denis Rothman
Overview of this book

AI has the potential to replicate humans in every field. Artificial Intelligence By Example, Second Edition serves as a starting point for you to understand how AI is built, with the help of intriguing and exciting examples. This book will make you an adaptive thinker and help you apply concepts to real-world scenarios. Using some of the most interesting AI examples, right from computer programs such as a simple chess engine to cognitive chatbots, you will learn how to tackle the machine you are competing with. You will study some of the most advanced machine learning models, understand how to apply AI to blockchain and the Internet of Things (IoT), and develop emotional quotient in chatbots using neural networks such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs). This edition also adds new examples of hybrid neural networks that combine reinforcement learning (RL) and deep learning (DL), chained algorithms that combine unsupervised learning with decision trees and random forests, combinations of DL and genetic algorithms, conversational user interfaces (CUIs) for chatbots, neuromorphic computing, and quantum computing. By the end of this book, you will understand the fundamentals of AI and have worked through a number of examples that will help you develop your AI solutions.

Summary

Although it may seem paradoxical, before jumping into a project that involves millions to billions of records of data (in SQL, Oracle, or big data repositories), try to avoid AI and start with simpler classical solutions such as standard big data methods. If the AI project does go ahead, the law of large numbers (LLN) will allow random sampling over the datasets, thanks to the central limit theorem (CLT).
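The following minimal sketch illustrates the point with NumPy rather than a real database: a hypothetical array stands in for the full dataset, the mean of a random sample approaches the population mean as the sample grows (LLN), and the means of many random batches concentrate around the population mean (CLT), which is what makes working on random batches legitimate.

```python
import numpy as np

# Hypothetical stand-in for a dataset with millions of records.
rng = np.random.default_rng(42)
population = rng.exponential(scale=3.0, size=1_000_000)

# LLN: the mean of a random sample approaches the population mean
# as the sample size grows.
for n in (100, 10_000, 100_000):
    sample = rng.choice(population, size=n, replace=False)
    print(f"n={n:>7}  sample mean={sample.mean():.4f}  "
          f"population mean={population.mean():.4f}")

# CLT: the means of many random batches are approximately normally
# distributed around the population mean, which is why a random batch
# can stand in for the whole dataset.
batch_means = [rng.choice(population, size=1_000, replace=False).mean()
               for _ in range(500)]
print(f"std of batch means: {np.std(batch_means):.4f}")
```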

A pipeline of classical and ML processes will solve both the volume problem and the human analytic limit problem. The random sampling function does not need to be the mini-batch function built into the KMC program; the batches can be generated in a preprocessing phase by classical programs. These programs will produce random batches of equal size, transposing the KMC NP-hard problem into an NP problem.
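Here is a minimal sketch of that preprocessing phase, assuming the data is already available as a NumPy array (the function name make_random_batches is hypothetical, not one of the book's programs): it shuffles the records once and cuts them into random batches of equal size for the downstream KMC step.

```python
import numpy as np

def make_random_batches(data: np.ndarray, batch_size: int, seed: int = 0):
    """Classical preprocessing: shuffle the records once and cut them
    into random batches of equal size for the downstream KMC step."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(len(data))
    usable = (len(data) // batch_size) * batch_size  # drop the remainder
    return np.split(data[indices[:usable]], usable // batch_size)

# Hypothetical dataset: 1,000,000 records with 2 features each.
data = np.random.default_rng(1).random((1_000_000, 2))
batches = make_random_batches(data, batch_size=10_000)
print(len(batches), batches[0].shape)  # 100 batches of shape (10000, 2)
```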

KMC, an unsupervised learning algorithm, will transform the unlabeled data into labeled output, with each record receiving its cluster number as a label.
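A minimal sketch of that labeling step using scikit-learn's KMeans on a hypothetical random batch; the feature matrix and the number of clusters are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical unlabeled feature matrix (one random batch, for instance).
X = np.random.default_rng(2).random((10_000, 2))

# KMC assigns each record to one of k clusters; the cluster number
# becomes that record's label.
kmc = KMeans(n_clusters=6, n_init=10, random_state=0)
cluster_labels = kmc.fit_predict(X)

# The formerly unlabeled data is now labeled output: features + cluster number.
labeled_output = np.column_stack((X, cluster_labels))
print(labeled_output[:3])
```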

In turn, a decision tree, chained to the KMC program, will train its model using the output of...
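A minimal self-contained sketch of such a chain, under the same assumptions as above (hypothetical data, arbitrary number of clusters): KMC produces the cluster labels, and the chained decision tree is then trained on them as supervised targets.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical unlabeled batch of records.
X = np.random.default_rng(3).random((10_000, 2))

# Step 1: KMC labels each record with its cluster number.
cluster_labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

# Step 2: the chained decision tree learns to reproduce those labels,
# turning the unsupervised clusters into a supervised, explainable model.
X_train, X_test, y_train, y_test = train_test_split(
    X, cluster_labels, test_size=0.2, random_state=0)
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("accuracy on held-out records:", tree.score(X_test, y_test))
```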