
Mastering Scala Machine Learning

By: Kozlov

Overview of this book

Since the advent of object-oriented programming, new technologies related to Big Data have constantly been appearing on the market. One such technology is Scala, which many consider a successor to Java in the area of Big Data, much as Java was to C/C++ in the area of distributed programming. This book aims to take your knowledge to the next level and help you apply it to build advanced applications such as social media mining, intelligent news portals, and more. After a quick refresher on functional programming concepts using the REPL, you will see practical examples of setting up the development environment and tinkering with data. We will then explore working with Spark and MLlib using k-means and decision trees. Most of the data that we produce today is unstructured and raw, and you will learn to tackle this type of data with advanced topics such as regression, classification, integration, and working with graph algorithms. Finally, you will discover how to use Scala to perform complex concept analysis, monitor model performance, and build a model repository. By the end of this book, you will have gained expertise in Scala machine learning and will be able to build complex machine learning projects using Scala.

Regularization


Regularization was originally developed to cope with ill-posed problems, where the problem was underconstrained (it allowed multiple solutions given the data) or where the data or the solution contained too much noise (A.N. Tikhonov, A.S. Leonov, A.G. Yagola, Nonlinear Ill-Posed Problems, Chapman and Hall, London). Adding a penalty function that skews the solution away from undesired properties, such as a lack of smoothness in curve fitting or spectral analysis, usually solves the problem.
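For a linear problem, the classical Tikhonov form of this penalized objective can be written as follows (here A is the design matrix, b the observations, and Γ the Tikhonov matrix; taking Γ as a multiple of the identity recovers ridge regression):

```latex
\hat{x} = \arg\min_{x} \; \|Ax - b\|_2^2 + \lambda \, \|\Gamma x\|_2^2
```

The penalty term λ‖Γx‖² trades fidelity to the data against the desired property of the solution, with λ controlling the strength of the trade-off.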

The choice of the penalty function is somewhat arbitrary, but it should reflect the desired skew in the solution. If the penalty function is differentiable, it can be incorporated into the gradient descent process; ridge regression is an example, where the penalty is the L2 norm of the weights, that is, the sum of squares of the coefficients.
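As a minimal sketch of how a differentiable penalty folds into gradient descent, the following self-contained Scala snippet fits a single ridge-regression coefficient on a tiny one-dimensional dataset; the object and method names, learning rate, and penalty values are illustrative, not from MLlib:

```scala
// Ridge regression via gradient descent on a 1-D toy dataset.
// Loss: (1/n) * sum((w*x - y)^2) + lambda * w^2  (L2 penalty on the weight)
object RidgeSketch {
  def fit(xs: Array[Double], ys: Array[Double],
          lambda: Double, lr: Double = 0.1, steps: Int = 500): Double = {
    var w = 0.0
    val n = xs.length
    for (_ <- 1 to steps) {
      // Gradient of the data term: (2/n) * sum((w*x - y) * x)
      val gradLoss = 2.0 / n * xs.zip(ys).map { case (x, y) => (w * x - y) * x }.sum
      // Gradient of the differentiable L2 penalty: 2 * lambda * w
      val gradPenalty = 2.0 * lambda * w
      w -= lr * (gradLoss + gradPenalty)
    }
    w
  }

  def main(args: Array[String]): Unit = {
    val xs = Array(1.0, 2.0, 3.0, 4.0)
    val ys = xs.map(_ * 2.0)                    // true slope is 2
    val unregularized = fit(xs, ys, lambda = 0.0)
    val regularized   = fit(xs, ys, lambda = 1.0)
    // The penalty shrinks the fitted coefficient towards zero
    println(f"lambda=0: $unregularized%.3f, lambda=1: $regularized%.3f")
  }
}
```

With lambda = 0 the fit recovers the true slope of 2; with lambda = 1 the coefficient is shrunk below 2, which is exactly the skew the penalty is designed to introduce.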

MLlib currently implements L1 and L2 regularization, and a mixture thereof called Elastic Net, as was shown in Chapter 3, Working with Spark and MLlib. The regularization...
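The Elastic Net mixture can be sketched in plain Scala as a convex combination of the L1 and L2 penalties; the alpha/lambda parameterization below mirrors the common convention (alpha = 1 gives pure L1/lasso, alpha = 0 gives pure L2/ridge) but the function and names here are an illustrative assumption, not MLlib's API:

```scala
// Elastic Net penalty: lambda * (alpha * ||w||_1 + (1 - alpha)/2 * ||w||_2^2)
object ElasticNetPenalty {
  def penalty(weights: Array[Double], lambda: Double, alpha: Double): Double = {
    val l1 = weights.map(math.abs).sum          // sum of absolute values
    val l2 = weights.map(w => w * w).sum        // sum of squares
    lambda * (alpha * l1 + (1.0 - alpha) * 0.5 * l2)
  }

  def main(args: Array[String]): Unit = {
    val w = Array(1.0, -2.0, 0.5)
    println(penalty(w, lambda = 0.1, alpha = 1.0)) // pure L1 on |1| + |-2| + |0.5| = 3.5
    println(penalty(w, lambda = 0.1, alpha = 0.0)) // pure L2 on 1 + 4 + 0.25 = 5.25
  }
}
```

The L1 term drives small coefficients exactly to zero (sparsity), while the L2 term shrinks all coefficients smoothly; mixing them combines variable selection with the stability of ridge.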