Cracking the Data Science Interview

By : Leondra R. Gonzalez, Aaren Stubberfield

Overview of this book

The data science job market is saturated with professionals of all backgrounds, including academics, researchers, bootcampers, and Massive Open Online Course (MOOC) graduates. This poses a challenge for companies seeking the best person to fill their roles. At the heart of this selection process is the data science interview, a crucial juncture that determines the best fit for both the candidate and the company.

Cracking the Data Science Interview provides expert guidance on approaching the interview process with full preparation and confidence. Starting with an introduction to the modern data science landscape, you’ll find tips on job hunting, resume writing, and creating a top-notch portfolio. You’ll then advance to topics such as Python, SQL databases, Git, and productivity with shell scripting and Bash. Building on this foundation, you'll delve into the fundamentals of statistics, laying the groundwork for pre-modeling concepts, machine learning, deep learning, and generative AI. The book concludes by offering insights into how best to prepare for the intensive data science interview.

By the end of this interview guide, you’ll have gained the confidence, business acumen, and technical skills required to distinguish yourself within this competitive landscape and land your next data science job.
Table of Contents (21 chapters)

Part 1: Breaking into the Data Science Field (Chapters 1–3)
Part 2: Manipulating and Managing Data (Chapters 4–9)
Part 3: Exploring Artificial Intelligence (Chapters 10–15)
Part 4: Getting the Job (Chapters 16–21)

Unraveling backpropagation

At this point, you may be wondering why weights, biases, and activation functions are so special. After all, they probably seem little different from the parameters and hyperparameters of traditional ML models. However, understanding backpropagation will solidify your appreciation of how weights and biases work. This journey begins with a brief discussion of gradient descent.

Gradient descent

In short, gradient descent is a powerful optimization algorithm that’s widely used in ML and DL to minimize a cost or loss function. Training proceeds by first making a prediction with the model, measuring how good that prediction is with the loss function, and then adjusting the model's weights slightly in the direction that reduces the loss, so that it performs better next time. Repeated over many iterations of training, this process allows the model to gradually make better predictions. Gradient descent is used to train not only NNs but also other ML models, such as...
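The predict-measure-adjust loop described above can be sketched in a few lines of NumPy for the simplest possible model, a linear fit with one weight and one bias. The toy data, learning rate, and iteration count here are illustrative choices, not values from the book:

```python
import numpy as np

# Toy data: y is roughly 3x + 2 plus a little noise (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 3 * X + 2 + rng.normal(scale=0.1, size=100)

w, b = 0.0, 0.0  # weight and bias, initialized to zero
lr = 0.1         # learning rate (step size), an illustrative value

for _ in range(500):
    y_pred = w * X + b                   # 1. make a prediction
    error = y_pred - y
    loss = np.mean(error ** 2)           # 2. measure it: mean squared error
    grad_w = 2 * np.mean(error * X)      # 3. gradient of the loss w.r.t. w
    grad_b = 2 * np.mean(error)          #    ...and w.r.t. b
    w -= lr * grad_w                     # 4. adjust weights slightly,
    b -= lr * grad_b                     #    opposite the gradient direction

print(w, b)  # both should end up close to the true values 3 and 2
```

Each pass computes the gradient of the loss with respect to each parameter and takes a small step against it; over many iterations, w and b drift toward the values that minimize the loss. In a neural network the loop is the same, and backpropagation is simply the procedure that computes those gradients for every weight and bias in the network.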