Mastering Java Machine Learning

By: Uday Kamath, Krishna Choppella

Overview of this book

Java is one of the main languages used by practicing data scientists: much of the Hadoop ecosystem is Java-based, and it is the language in which most production data science systems are written. If you know Java, Mastering Java Machine Learning is your next step on the path to becoming an advanced practitioner in data science. This book introduces you to an array of advanced techniques in machine learning, including classification, clustering, anomaly detection, stream learning, active learning, semi-supervised learning, probabilistic graph modeling, text mining, deep learning, and batch and stream machine learning for big data. Each chapter is accompanied by illustrative examples and real-world case studies that show how to apply the newly learned techniques using sound methodologies and the best Java-based tools available today. On completing this book, you will understand the tools and techniques for building powerful machine learning models to solve data science problems in just about any domain.
Table of Contents (20 chapters)
Mastering Java Machine Learning
Credits
Foreword
About the Authors
About the Reviewers
www.PacktPub.com
Customer Feedback
Preface
Linear Algebra
Index
