Advanced modeling with ensembles

In the previous section, we implemented an orientation baseline; now, let's focus on heavy machinery. We will follow the approach taken by the KDD Cup 2009 winning solution, developed by the IBM research team (Niculescu-Mizil and others).

To address this challenge, they used the ensemble selection algorithm (Caruana and Niculescu-Mizil, 2004). This is an ensemble method, which means it constructs a series of models and combines their outputs in a specific way to provide the final classification. It has several desirable properties that make it a good fit for this challenge, as follows (a minimal sketch of the selection loop follows the list):

  • It was proven to be robust, yielding excellent performance.
  • It can be optimized for a specific performance metric, such as AUC.
  • It allows for different classifiers to be added to the library.
  • It is an anytime method, meaning that, if we run out of time, we still have a usable ensemble available.
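
At its core, ensemble selection is a simple greedy loop: train a large library of diverse classifiers, then, starting from an empty ensemble, repeatedly add (with replacement) the model whose inclusion most improves the target metric on a held-out hill-climbing set. The following is a minimal sketch of that loop in plain Java over pre-computed library predictions, using AUC as the metric. The class name, the toy data, and the omission of refinements such as bagged selection and tie handling in the AUC calculation are simplifications for illustration, not the winning team's actual implementation:

```java
import java.util.Arrays;

/**
 * A minimal sketch of greedy ensemble selection (Caruana and Niculescu-Mizil, 2004):
 * starting from an empty ensemble, repeatedly add the library model (with replacement)
 * whose inclusion most improves the chosen metric on a held-out hill-climbing set.
 */
public class EnsembleSelectionSketch {

    /** AUC via the rank-sum (Mann-Whitney) formulation; ties are ignored for brevity. */
    static double auc(double[] scores, int[] labels) {
        Integer[] idx = new Integer[scores.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        Arrays.sort(idx, (a, b) -> Double.compare(scores[a], scores[b]));
        double rankSumPos = 0;
        long nPos = 0, nNeg = 0;
        for (int r = 0; r < idx.length; r++) {
            if (labels[idx[r]] == 1) { rankSumPos += r + 1; nPos++; } else { nNeg++; }
        }
        if (nPos == 0 || nNeg == 0) return 0.5;
        return (rankSumPos - nPos * (nPos + 1) / 2.0) / (nPos * (double) nNeg);
    }

    /**
     * @param libraryScores libraryScores[m][i] = model m's positive-class probability
     *                      for hill-climbing instance i
     * @param labels        true labels (0/1) of the hill-climbing instances
     * @param iterations    number of greedy additions (models may be picked repeatedly)
     * @return how many times each library model was selected (its weight in the ensemble)
     */
    static int[] selectEnsemble(double[][] libraryScores, int[] labels, int iterations) {
        int nModels = libraryScores.length, nInstances = labels.length;
        int[] counts = new int[nModels];
        double[] sum = new double[nInstances]; // running sum of the selected models' scores
        int selected = 0;
        for (int it = 0; it < iterations; it++) {
            int best = -1;
            double bestAuc = -1;
            for (int m = 0; m < nModels; m++) {
                // Ensemble prediction if model m were added: average of all selected scores.
                double[] candidate = new double[nInstances];
                for (int i = 0; i < nInstances; i++) {
                    candidate[i] = (sum[i] + libraryScores[m][i]) / (selected + 1);
                }
                double a = auc(candidate, labels);
                if (a > bestAuc) { bestAuc = a; best = m; }
            }
            for (int i = 0; i < nInstances; i++) sum[i] += libraryScores[best][i];
            counts[best]++;
            selected++;
        }
        return counts;
    }

    public static void main(String[] args) {
        // Toy library: three models' positive-class probabilities on five instances.
        double[][] scores = {
            {0.9, 0.8, 0.3, 0.2, 0.6},
            {0.6, 0.7, 0.4, 0.5, 0.5},
            {0.2, 0.9, 0.1, 0.3, 0.8}
        };
        int[] labels = {1, 1, 0, 0, 1};
        System.out.println(Arrays.toString(selectEnsemble(scores, labels, 10)));
    }
}
```

Because models are added with replacement, a strong model can be picked several times, which effectively weights it more heavily in the final averaged prediction; stopping the loop early still leaves a valid ensemble, which is what makes the method anytime.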