
Machine Learning for Developers

By : Rodolfo Bonnin, Md Mahmudul Hasan
Overview of this book

Most of us have heard about the term Machine Learning, but surprisingly the question frequently asked by developers across the globe is, “How do I get started in Machine Learning?”. One reason could be attributed to the vastness of the subject area because people often get overwhelmed by the abstractness of ML and terms such as regression, supervised learning, probability density function, and so on. This book is a systematic guide teaching you how to implement various Machine Learning techniques and their day-to-day application and development. You will start with the very basics of data and mathematical models in easy-to-follow language that you are familiar with; you will feel at home while implementing the examples. The book will introduce you to various libraries and frameworks used in the world of Machine Learning, and then, without wasting any time, you will get to the point and implement Regression, Clustering, classification, Neural networks, and more with fun examples. As you get to grips with the techniques, you’ll learn to implement those concepts to solve real-world scenarios for ML applications such as image analysis, Natural Language processing, and anomaly detections of time series data. By the end of the book, you will have learned various ML techniques to develop more efficient and intelligent applications.

Origin of convolutional neural networks

Convolutional neural networks (CNNs) have distant origins. They were developed while the multi-layer perceptron was being refined, and the first concrete example is the neocognitron.

The neocognitron, introduced by Prof. Kunihiko Fukushima in a 1980 paper, is a hierarchical, multilayered Artificial Neural Network (ANN) with the following principal features:

  • Self-organizing
  • Tolerant to shifts and deformation in the input
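The second property comes from pairing local feature detectors (convolutions) with a pooling step that absorbs small displacements. The following toy NumPy sketch (a hypothetical illustration, not code from the neocognitron paper) shows that the same filter followed by max pooling produces an identical peak response when the input pattern is shifted by one pixel:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation (the 'convolution' used in CNNs)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A 3x3 "corner" pattern placed at two slightly different positions
pattern = np.array([[1., 0., 0.],
                    [1., 0., 0.],
                    [1., 1., 1.]])

img_a = np.zeros((6, 6)); img_a[0:3, 0:3] = pattern
img_b = np.zeros((6, 6)); img_b[1:4, 1:4] = pattern  # shifted by one pixel

# Use the pattern itself as the detecting filter; pooling absorbs the shift
resp_a = max_pool(conv2d(img_a, pattern))
resp_b = max_pool(conv2d(img_b, pattern))
print(resp_a.max(), resp_b.max())  # peak responses are identical
```

The filter's response peaks wherever the pattern aligns with it; because max pooling keeps only the strongest response in each region, the one-pixel shift leaves the pooled output unchanged, which is the essence of the tolerance the neocognitron aimed for.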

This original idea appeared again in 1986 in the book version of the original backpropagation paper, and was also employed in 1988 for temporal signals in speech recognition.

The design was improved in 1998 with a paper by Yann LeCun, Gradient-Based Learning Applied to Document Recognition, which presented the LeNet-5 network, an architecture used to classify handwritten digits. The model showed increased performance compared to other...