#### Overview of this book

Java is one of the main languages used by practicing data scientists; much of the Hadoop ecosystem is Java-based, and it is the language in which most production data science systems are written. If you know Java, Mastering Java Machine Learning is your next step on the path to becoming an advanced practitioner in data science. This book aims to introduce you to an array of advanced techniques in machine learning, including classification, clustering, anomaly detection, stream learning, active learning, semi-supervised learning, probabilistic graph modeling, text mining, deep learning, and big data batch and stream machine learning. Accompanying each chapter are illustrative examples and real-world case studies that show how to apply the newly learned techniques using sound methodologies and the best Java-based tools available today. On completing this book, you will have an understanding of the tools and techniques for building powerful machine learning models to solve data science problems in just about any domain.
**Mastering Java Machine Learning**

Contents:

- Credits
- Foreword
- Preface
- Machine Learning Review
- Practical Approach to Real-World Supervised Learning
- Unsupervised Machine Learning Techniques
- Semi-Supervised and Active Learning
- Real-Time Stream Machine Learning
- Probabilistic Graph Modeling
- Deep Learning
- Text Mining and Natural Language Processing
- Big Data Machine Learning – The Final Frontier
- Linear Algebra
- Probability
- Index

## Probability revisited

Many basic concepts of probability are detailed in Appendix B, Probability. Some of the key ideas in probability theory form the building blocks of probabilistic graph models. A good grasp of the relevant theory can help a great deal in understanding PGMs and how they are used to make inferences from data.

### Concepts in probability

In this section, we discuss important concepts in probability theory that will be used later in this chapter.

#### Conditional probability

The essence of conditional probability, given two related events α and β, is to capture how we assign a probability to one event when the other is known to have occurred. The conditional probability, or conditional distribution, is written P(α | β), that is, the probability of event α occurring given that event β has occurred (equivalently, given that β is true), and is formally defined as:

P(α | β) = P(α ∩ β) / P(β), where P(β) > 0

Here, P(α ∩ β) is the probability of the events where both α and β occur.
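As a concrete illustration, the definition above can be applied directly to frequency counts: estimate P(α ∩ β) and P(β) from how often the events occur in a set of trials, and take their ratio. The following sketch uses hypothetical counts chosen purely for illustration; the class and method names are not from any library discussed in this book.

```java
// Sketch: estimating a conditional probability P(alpha | beta)
// from frequency counts. The event counts below are hypothetical.
public class ConditionalProbability {

    // P(alpha | beta) = P(alpha AND beta) / P(beta), defined only when P(beta) > 0.
    // With counts over the same set of trials, the trial total cancels,
    // so the ratio of counts suffices.
    static double conditional(long jointCount, long betaCount) {
        if (betaCount == 0) {
            throw new IllegalArgumentException("P(beta) must be positive");
        }
        return (double) jointCount / betaCount;
    }

    public static void main(String[] args) {
        // Suppose that in 1000 trials, beta occurred 400 times,
        // and alpha and beta occurred together 100 times.
        long jointCount = 100;
        long betaCount = 400;
        System.out.println(conditional(jointCount, betaCount)); // 0.25
    }
}
```

Note that when α and β always co-occur, the joint count equals the count of β and the conditional probability is 1, as expected.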