The Bag-of-Words strategy

In NLP, a very common pipeline can be subdivided into the following steps:

  1. Collecting documents into a corpus
  2. Tokenizing, removing stopwords (articles, prepositions, and so on), and stemming (reducing each word to its root form)
  3. Building a common vocabulary
  4. Vectorizing the documents
  5. Classifying or clustering the documents
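
As a quick illustration, here is a minimal sketch of these steps using NLTK and scikit-learn. The toy corpus, the PorterStemmer choice, and the preprocess() helper are assumptions made for this example, not code taken from the chapter:

```python
# A minimal sketch of the Bag-of-Words pipeline with NLTK and scikit-learn.
# The corpus and the preprocess() helper are illustrative assumptions.
# (Requires nltk.download('punkt') and nltk.download('stopwords').)
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer
from nltk.tokenize import word_tokenize
from sklearn.feature_extraction.text import CountVectorizer

# Step 1: collect documents into a corpus (a toy example here)
corpus = [
    'The cat sat on the mat.',
    'A dog chased the cat across the garden.',
]

stemmer = PorterStemmer()
stop_words = set(stopwords.words('english'))

def preprocess(document):
    # Step 2: tokenize, remove stopwords and punctuation, and stem
    tokens = word_tokenize(document.lower())
    return ' '.join(stemmer.stem(t) for t in tokens
                    if t.isalpha() and t not in stop_words)

processed = [preprocess(doc) for doc in corpus]

# Steps 3 and 4: build a common vocabulary and vectorize the documents
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(processed)

print(vectorizer.vocabulary_)  # the shared vocabulary (term -> column index)
print(X.toarray())             # one frequency vector per document
```

The resulting matrix X can then be fed to any scikit-learn classifier or clustering algorithm, which corresponds to step 5.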

This pipeline is called Bag-of-Words and will be discussed in this chapter. A fundamental assumption is that the order of the words in a sentence is not important. In fact, when defining a feature vector, as we're going to see, the measures taken into account are always based on frequencies, and they are therefore insensitive to the local position of the elements. From some viewpoints this is a limitation, because in a natural language the internal order of a sentence is necessary to preserve the meaning; however, there are many...
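
To make this order-insensitivity concrete, the following sketch (the two sentences are an assumed example, not from the chapter) vectorizes two sentences that use the same words in a different order; the resulting frequency vectors are identical even though the meanings differ:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Two sentences with opposite meanings but the same word frequencies
docs = ['the dog bit the man', 'the man bit the dog']

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs).toarray()

print(vectorizer.get_feature_names_out())  # ['bit' 'dog' 'man' 'the']
print(X[0], X[1])                          # [1 1 1 2] [1 1 1 2]
print((X[0] == X[1]).all())                # True: identical feature vectors
```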