Machine Learning with Swift

By: Jojo Moolayil, Alexander Sosnovshchenko, Oleksandr Baiev

Overview of this book

Machine learning promises to bring increased intelligence to software by helping us learn from and analyse information efficiently and discover patterns that humans cannot. This book will be your guide as you embark on an exciting journey in machine learning using the popular Swift language. The first part of the book covers machine learning basics to develop a lasting intuition about fundamental concepts. The second part explores various supervised and unsupervised statistical learning techniques and how to implement them in Swift, while the third part walks you through deep learning techniques with the help of typical real-world cases. The last part dives into hardcore topics such as model compression and GPU acceleration, and offers recommendations for avoiding common mistakes during machine learning application development. By the end of the book, you'll be able to develop intelligent applications written in Swift that can learn for themselves.

Word2Vec friends and relatives

GloVe, LexVec, fastText.

One popular alternative to word2vec is GloVe (Global Vectors).
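GloVe learns word vectors by fitting the dot product of two word vectors (plus bias terms) to the logarithm of their co-occurrence count, with a weighting function that caps the influence of very frequent pairs. As a sketch of one term of this standard weighted least-squares objective (the values below are illustrative, not code from this book):

```swift
import Foundation

// One term of GloVe's weighted least-squares objective:
//   f(Xij) * (wi · wj + bi + bj - log Xij)^2
// where Xij is the co-occurrence count of words i and j,
// and f down-weights rare pairs while capping frequent ones.
func gloveWeight(_ count: Double, xMax: Double = 100.0, alpha: Double = 0.75) -> Double {
    return count < xMax ? pow(count / xMax, alpha) : 1.0
}

func gloveLossTerm(wi: [Double], wj: [Double],
                   bi: Double, bj: Double, count: Double) -> Double {
    let score = zip(wi, wj).map(*).reduce(0, +) + bi + bj
    let diff = score - log(count)          // fit the log co-occurrence count
    return gloveWeight(count) * diff * diff
}

// The loss term is zero when the model's score exactly matches log(count).
let term = gloveLossTerm(wi: [0.5, 0.5], wj: [1.0, 1.0],
                         bi: 0.0, bj: 0.0, count: exp(1.0))
```

Summing such terms over all non-zero cells of the co-occurrence matrix and minimizing with gradient descent yields the GloVe embeddings.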


Doc2VecC (Efficient Vector Representation for Documents Through Corruption) extends these ideas to whole documents.

Both models learn geometrical encodings (vectors) of words from their co-occurrence information (how frequently they appear together in large text corpora). They differ in that word2vec is a "predictive" model, whereas GloVe is a "count-based" model. The distinction between the two approaches is as follows:

Predictive models learn their vectors in order to minimize Loss(target word | context words; Vectors), that is, the loss of predicting the target words from the context words given the vector representations. In word2vec, this is cast as a feed-forward neural network and optimized with stochastic gradient descent (SGD).
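To make the predictive objective concrete, here is a minimal CBOW-style sketch with a full softmax (real word2vec uses faster approximations such as negative sampling or hierarchical softmax); the vectors and vocabulary are illustrative, not from the book:

```swift
import Foundation

func dot(_ a: [Double], _ b: [Double]) -> Double {
    return zip(a, b).map(*).reduce(0, +)
}

// Loss(target word | context words; Vectors):
// average the context vectors, score every vocabulary word against that
// average, softmax the scores, and take -log(probability of the target).
func predictiveLoss(contextVectors: [[Double]],
                    outputVectors: [[Double]],
                    targetIndex: Int) -> Double {
    let dim = contextVectors[0].count
    var hidden = [Double](repeating: 0.0, count: dim)
    for v in contextVectors {
        for d in 0..<dim { hidden[d] += v[d] / Double(contextVectors.count) }
    }
    let scores = outputVectors.map { dot($0, hidden) }
    let maxScore = scores.max()!                 // subtract max for stability
    let exps = scores.map { exp($0 - maxScore) }
    let total = exps.reduce(0, +)
    return -log(exps[targetIndex] / total)
}

// With identical output vectors the prediction is uniform over the
// vocabulary, so the loss equals log(vocabulary size).
let loss = predictiveLoss(contextVectors: [[1.0, 0.0], [0.0, 1.0]],
                          outputVectors: [[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]],
                          targetIndex: 0)
```

SGD then nudges both the context and output vectors to reduce this loss, one (context, target) pair at a time.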

Count-based models learn their vectors by performing dimensionality reduction on a word-word co-occurrence count matrix: a large matrix of co-occurrence statistics is first collected from the corpus, and is then factorized to yield lower-dimensional word vectors.
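The first step of any count-based model is collecting those co-occurrence statistics. A toy sketch in Swift (the corpus and window size here are my own illustration, not from the book):

```swift
import Foundation

// Count how often each pair of words appears within a symmetric window —
// the raw statistic that count-based models factorize into vectors.
func cooccurrenceCounts(tokens: [String], windowSize: Int) -> [String: [String: Int]] {
    var counts: [String: [String: Int]] = [:]
    for (i, word) in tokens.enumerated() {
        let lo = max(0, i - windowSize)
        let hi = min(tokens.count - 1, i + windowSize)
        for j in lo...hi where j != i {
            counts[word, default: [:]][tokens[j], default: 0] += 1
        }
    }
    return counts
}

let tokens = "the cat sat on the mat".split(separator: " ").map(String.init)
let counts = cooccurrenceCounts(tokens: tokens, windowSize: 1)
// counts["the"] now records "cat", "on", and "mat" as neighbours.
```

On a real corpus this matrix is huge and sparse, which is why GloVe and its relatives work with the non-zero counts only.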