Case study 3 – using TensorFlow


I would like to finish off our time together by looking at TensorFlow, a somewhat more modern module that was only recently released by Google's machine learning division.

TensorFlow is an open-source machine learning module used primarily for its simplified deep learning and neural network abilities. I would like to take some time to introduce the module and solve a few quick problems using TensorFlow. The syntax for TensorFlow (like that of PyBrain in Chapter 12, Beyond the Essentials) is a bit different from our normal scikit-learn syntax, so I will be going over it step by step. Let's start with some imports:

from sklearn import datasets, metrics
import tensorflow as tf
import numpy as np
from sklearn.model_selection import train_test_split  # cross_validation in older scikit-learn versions
%matplotlib inline

Our imports from sklearn include train_test_split, datasets, and metrics. We will use our train-test split to reduce overfitting, and we will use datasets in order to import our iris classification...
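
As a quick sketch of that setup (the test_size and random_state values here are illustrative assumptions, not choices from the text), loading and splitting the iris data looks like this:

# Sketch: load the iris dataset and hold out a test set
# (test_size and random_state are illustrative assumptions)
iris = datasets.load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=42)
print(X_train.shape, X_test.shape)  # (105, 4) (45, 4)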
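
Before we build anything on top of that data, here is a minimal sketch of the syntax difference mentioned above, assuming the graph-and-session style of the TensorFlow 1.x era API: operations first build a computation graph, and nothing is evaluated until the graph is run inside a session.

# Minimal sketch of TensorFlow's deferred execution (TensorFlow 1.x style)
a = tf.constant(3.0)        # a node in the computation graph, not a Python float
b = tf.constant(4.0)
total = a + b               # builds another graph node; nothing is computed yet
with tf.Session() as sess:  # the session actually executes the graph
    print(sess.run(total))  # 7.0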