Data Science Algorithms in a Week

By: Dávid Natingga

Overview of this book

Machine learning applications are highly automated and self-modifying, and they continue to improve over time with minimal human intervention as they learn from more data. To address the complex nature of various real-world data problems, specialized machine learning algorithms have been developed to solve these problems. Data science helps you gain new knowledge from existing data through algorithmic and statistical analysis.

This book addresses the problems of accurate and efficient data classification and prediction. Over the course of seven days, you will be introduced to seven algorithms, along with exercises that will help you learn different aspects of machine learning. You will see how to pre-cluster your data to optimize and classify it for large datasets. You will then find out how to predict data based on existing trends in your datasets.

This book covers algorithms such as k-Nearest Neighbors, Naive Bayes, Decision Trees, Random Forests, k-Means, Regression, and Time Series. On completing the book, you will understand which machine learning algorithm to pick for clustering, classification, or regression, and which is best suited for your problem.

Gradient descent algorithm and its implementation

To better understand how we can predict a value using linear regression from first principles, we study the gradient descent algorithm and then implement it in Python.

Gradient descent algorithm

The gradient descent algorithm is an iterative algorithm that updates the variables of a model so that the model fits the data with the least possible error. More generally, it finds the minimum of a function.
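
To make the idea concrete, the following minimal sketch (not taken from the book) applies gradient descent to the one-variable function f(x) = (x - 3)^2; the starting point, learning rate, and number of steps are arbitrary illustrative choices.

# Gradient descent on f(x) = (x - 3)**2, whose minimum is at x = 3.
# The derivative is f'(x) = 2 * (x - 3); each step moves x against the gradient.
def gradient_descent_1d(start, learning_rate=0.1, num_steps=100):
    x = start
    for _ in range(num_steps):
        gradient = 2 * (x - 3)            # derivative of (x - 3)**2 at the current x
        x = x - learning_rate * gradient  # step in the direction that decreases f
    return x

print(gradient_descent_1d(start=0.0))  # prints a value very close to 3.0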

We would like to express the weight in terms of the height using a linear formula:

weight(height, p) = p1 * height + p0

We estimate the parameters p = (p0, p1) from n data samples (height_i, weight_i) so as to minimize the following square error:

E(p) = Σ (p1 * height_i + p0 - weight_i)^2, where the sum runs over i = 1, ..., n

The gradient descent algorithm does this by updating the parameters p_i in...
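
The book's full implementation continues beyond this excerpt; the sketch below is only an illustrative version of the idea, assuming a hypothetical learning rate, iteration count, and (height in metres, weight in kilograms) samples chosen purely for demonstration. In each iteration, the parameters are moved against the partial derivatives of the square error E(p) with respect to p0 and p1.

# Illustrative gradient descent for the linear model
# weight(height, p) = p1 * height + p0,
# minimizing E(p) = sum over i of (p1 * height_i + p0 - weight_i)**2.
def fit_line(heights, weights, learning_rate=0.01, num_steps=100000):
    p0, p1 = 0.0, 0.0  # initial guess for the parameters
    for _ in range(num_steps):
        # Partial derivatives of the square error E with respect to p0 and p1.
        grad_p0 = sum(2 * (p1 * h + p0 - w) for h, w in zip(heights, weights))
        grad_p1 = sum(2 * (p1 * h + p0 - w) * h for h, w in zip(heights, weights))
        # Move each parameter against its gradient to decrease the error.
        p0 -= learning_rate * grad_p0
        p1 -= learning_rate * grad_p1
    return p0, p1

# Hypothetical samples (height in metres, weight in kilograms), for illustration only.
heights = [1.61, 1.70, 1.75, 1.83, 1.91]
weights = [55, 63, 70, 78, 84]
p0, p1 = fit_line(heights, weights)
print('weight =', p1, '* height +', p0)

Note that the learning rate has to be small enough for the updates to converge; if the step is too large, each update overshoots the minimum and the error grows instead of shrinking.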