
Tuning and optimizing CNN hyperparameters


The following hyperparameters are critical and must be tuned to achieve optimal results; a configuration sketch illustrating several of them follows the list.

  • Dropout: Randomly omits feature detectors during training to prevent overfitting
  • Sparsity: Forces activations for sparse/rare inputs
  • AdaGrad: Adapts the learning rate per feature
  • Regularization: L1 and L2 regularization
  • Weight transforms: Useful for deep autoencoders
  • Probability distribution manipulation: Controls initial weight generation
  • Gradient normalization and clipping: Keeps gradients within a stable range during backpropagation
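
As a minimal sketch of how several of these knobs appear together in a DeepLearning4j configuration (the builder API shown assumes a recent DL4j release; all values, such as the 0.5 dropout, the 1e-4 L2 coefficient, the clipping threshold of 1.0, and the layer shapes, are illustrative rather than tuned):

import org.deeplearning4j.nn.conf.{GradientNormalization, NeuralNetConfiguration}
import org.deeplearning4j.nn.conf.inputs.InputType
import org.deeplearning4j.nn.conf.layers.{ConvolutionLayer, OutputLayer, SubsamplingLayer}
import org.deeplearning4j.nn.weights.WeightInit
import org.nd4j.linalg.activations.Activation
import org.nd4j.linalg.learning.config.AdaGrad
import org.nd4j.linalg.lossfunctions.LossFunctions

// Illustrative values only; tune each of these per dataset
val conf = new NeuralNetConfiguration.Builder()
  .seed(12345)
  .weightInit(WeightInit.XAVIER)        // initial weights drawn from a Xavier distribution
  .updater(new AdaGrad(0.01))           // feature-specific (per-parameter) learning rates
  .l2(1e-4)                             // L2 regularization on the weights
  .dropOut(0.5)                         // randomly omit feature detectors during training
  .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
  .gradientNormalizationThreshold(1.0)  // clip each gradient element to [-1.0, 1.0]
  .list()
  .layer(0, new ConvolutionLayer.Builder(5, 5)
    .nIn(1).nOut(20).stride(1, 1)
    .activation(Activation.RELU).build())
  .layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
    .kernelSize(2, 2).stride(2, 2).build())
  .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
    .nOut(10).activation(Activation.SOFTMAX).build())
  .setInputType(InputType.convolutionalFlat(28, 28, 1))
  .build()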

Another important question is: when should you add a max pooling layer rather than a convolutional layer with the same stride? A max pooling layer has no parameters at all, whereas a convolutional layer has quite a few. Sometimes, adding a local response normalization layer helps: it makes the most strongly activated neurons inhibit neurons at the same location in neighboring feature maps, which encourages different feature maps to specialize, pushing them apart and forcing...
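
To make that trade-off concrete, here is a hedged sketch of the three layer types in DL4j (the layer sizes are made up for illustration, and the LRN constants are typical AlexNet-style defaults, not values from this book's projects):

import org.deeplearning4j.nn.conf.layers.{ConvolutionLayer, LocalResponseNormalization, SubsamplingLayer}
import org.nd4j.linalg.activations.Activation

// Parameter-free downsampling: a 2x2 max pooling layer (nothing to learn)
val maxPool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
  .kernelSize(2, 2).stride(2, 2)
  .build()

// Downsampling with the same stride via convolution:
// learns 3*3*nIn*64 weights plus 64 biases
val stridedConv = new ConvolutionLayer.Builder(3, 3)
  .stride(2, 2).nOut(64)
  .activation(Activation.RELU)
  .build()

// Local response normalization: strongly activated neurons inhibit the same
// position across the n neighboring feature maps
val lrn = new LocalResponseNormalization.Builder()
  .k(2).n(5).alpha(1e-4).beta(0.75)
  .build()

The max pooling layer adds no parameters, so it is cheaper to train and less prone to overfitting; the strided convolution costs parameters but can learn how to downsample rather than always taking the maximum.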