Deep Learning with TensorFlow

By: Giancarlo Zaccone, Md. Rezaul Karim, Ahmed Menshawy

Overview of this book

Deep learning builds on machine learning with deeper, more advanced model architectures. Machine learning is no longer confined to academia; it has become mainstream practice through wide adoption, and deep learning has taken the front seat. If you are a data scientist who wants to explore layered data abstractions, this book will be your guide. It shows how deep learning can be applied to complex, real-world raw data using TensorFlow 1.x. Throughout the book, you'll learn how to implement deep learning algorithms for machine learning systems and integrate them into your product offerings, including search, image recognition, and language processing. You'll also learn how to analyze and improve the performance of deep learning models by comparing algorithms against benchmarks and using machine intelligence to learn from the data and determine ideal behaviors within a specific context. After finishing the book, you will be familiar with machine learning techniques, in particular the use of TensorFlow for deep learning, and will be ready to apply your knowledge to research or commercial projects.

GPU programming model

At this point, it is necessary to introduce some basic concepts in order to understand the CUDA programming model. The first distinction is between the host and the device.

The code executed on the host side is the part of the code that runs on the CPU, which also has access to the system RAM and the hard disk.

The code executed on the device side, by contrast, is loaded onto the graphics card and runs there. Another important concept is the kernel: a function that is executed on the device and launched from the host.
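To make this concrete, here is a minimal sketch of a kernel (the function name, its parameters, and the launch configuration are illustrative assumptions, not code from the book). The __global__ qualifier marks a function that runs on the device but is called from the host, using the <<<blocks, threads>>> launch syntax:

// Illustrative kernel: adds 1.0 to each element of an array on the GPU.
__global__ void add_one(float *data, int n)
{
    // Each thread computes its own global index and processes one element.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        data[i] += 1.0f;
    }
}

// Launched from host code, for example:
// add_one<<<num_blocks, threads_per_block>>>(d_data, n);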

The code defined in a kernel is executed in parallel by an array of threads. The following points summarize how the GPU programming model works:

  • The running program contains source code to run on the CPU and code to run on the GPU
  • The CPU and the GPU have separate memories
  • Input data is transferred from CPU memory to GPU memory to be computed
  • The output of the GPU computation is copied back to CPU memory, as illustrated in the sketch after this list
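A rough end-to-end sketch of this workflow in CUDA C is shown below (the array size, variable names, and the kernel itself are illustrative assumptions): allocate separate buffers on the host and the device, copy the input to the device, launch the kernel, and copy the result back.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Illustrative kernel: doubles each element of the array.
__global__ void scale(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        data[i] *= 2.0f;
    }
}

int main(void)
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);

    // Host (CPU) memory.
    float *h_data = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) {
        h_data[i] = (float)i;
    }

    // Device (GPU) memory: the CPU and the GPU have separate memories.
    float *d_data = NULL;
    cudaMalloc((void **)&d_data, bytes);

    // Transfer the input data from CPU memory to GPU memory.
    cudaMemcpy(d_data, h_data, bytes, cudaMemcpyHostToDevice);

    // Launch the kernel: an array of threads executes it in parallel on the device.
    int threads_per_block = 256;
    int blocks = (n + threads_per_block - 1) / threads_per_block;
    scale<<<blocks, threads_per_block>>>(d_data, n);

    // Copy the output of the GPU computation back to CPU memory.
    cudaMemcpy(h_data, d_data, bytes, cudaMemcpyDeviceToHost);

    printf("h_data[10] = %f\n", h_data[10]);   // expected: 20.000000

    cudaFree(d_data);
    free(h_data);
    return 0;
}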
...