Python Machine Learning By Example

By: Yuxi (Hayden) Liu

Overview of this book

Data science and machine learning are some of the top buzzwords in the technical world today. A resurging interest in machine learning is due to the same factors that have made data mining and Bayesian analysis more popular than ever. This book is your entry point to machine learning. It starts with an introduction to machine learning and the Python language and shows you how to complete the setup. Moving ahead, you will learn all the important concepts, such as exploratory data analysis, data preprocessing, feature extraction, data visualization and clustering, classification, regression, and model performance evaluation. With the help of the various projects included, you will acquire the mechanics of several important machine learning algorithms, which will no longer seem obscure. You will also be guided step by step to build your own models from scratch. Toward the end, you will gather a broad picture of the machine learning ecosystem and the best practices for applying machine learning techniques. Through this book, you will learn to tackle data-driven problems and implement your solutions with the powerful yet simple language Python. Interesting and easy-to-follow examples, such as news topic classification, spam email detection, online ad click-through prediction, and stock price forecasting, will keep you glued until you reach your goal.

The mechanics of naive Bayes

We start with understanding the magic behind the algorithm: how naive Bayes works. Given a data sample x with n features x1, x2, ..., xn (x represents a feature vector and x = (x1, x2, ..., xn)), the goal of naive Bayes is to determine the probabilities that this sample belongs to each of K possible classes y1, y2, ..., yK, that is P(yk | x) or P(yk | x1, x2, ..., xn), where k = 1, 2, ..., K. It looks no different from what we have just dealt with: x, or (x1, x2, ..., xn), is a joint event in which the sample has features with values x1, x2, ..., xn respectively, and yk is the event that the sample belongs to class k. We can apply Bayes' theorem right away:

P(yk | x) = P(x | yk) P(yk) / P(x)
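
To make this concrete, here is a minimal sketch with entirely made-up numbers for a hypothetical two-class (spam versus ham) problem; it computes the posterior P(yk | x) from an assumed prior P(yk) and likelihood P(x | yk), obtaining the evidence P(x) by summing over both classes:

    # Bayes' theorem with hypothetical numbers for two classes
    priors = {'spam': 0.5, 'ham': 0.5}          # P(y_k): assumed uniform prior
    likelihoods = {'spam': 0.03, 'ham': 0.001}  # P(x | y_k): made-up values

    # Evidence: P(x) = sum over k of P(x | y_k) * P(y_k)
    evidence = sum(likelihoods[k] * priors[k] for k in priors)

    # Posterior: P(y_k | x) = P(x | y_k) * P(y_k) / P(x)
    posteriors = {k: likelihoods[k] * priors[k] / evidence for k in priors}

    print(posteriors)  # {'spam': 0.967..., 'ham': 0.032...}

Note that the posteriors sum to 1 by construction, since the evidence P(x) normalizes the numerators.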

P(yk) portrays how classes are distributed, provided no further knowledge of the observation's features. Thus, it is also called the prior in Bayesian probability terminology. The prior can be either predetermined (usually in a uniform manner where each class has an equal chance of occurrence...
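
As a rough sketch of both options, a uniform prior simply assigns 1/K to each class, while a prior estimated from data uses the observed class frequencies; the training labels below are hypothetical:

    from collections import Counter

    # Hypothetical training labels for a two-class problem
    labels = ['spam', 'ham', 'ham', 'spam', 'ham', 'ham']
    classes = set(labels)

    # Predetermined uniform prior: P(y_k) = 1 / K
    uniform_prior = {k: 1 / len(classes) for k in classes}

    # Prior estimated from training data: P(y_k) = count(y_k) / N
    counts = Counter(labels)
    empirical_prior = {k: counts[k] / len(labels) for k in counts}

    print(uniform_prior)    # {'spam': 0.5, 'ham': 0.5} (key order may vary)
    print(empirical_prior)  # {'spam': 0.333..., 'ham': 0.666...}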