Machine learning models
In this chapter, the text augmentation wrapper functions use ML models to generate new text for training other ML models. Building these models from scratch is out of scope, but a brief description of each model and its algorithm is necessary. The Python wrapper functions use the following ML models under the hood:
- Tomáš Mikolov published Word2Vec, a neural-network-based NLP algorithm, in 2013. The model can propose synonyms for words in the input text.
- Jeffrey Pennington, Richard Socher, and Christopher D. Manning created the Global Vectors for Word Representation (GloVe) algorithm in 2014. It is an unsupervised learning NLP algorithm that represents words as vectors. The resulting vector space has a linear structure in which semantically related words appear as close neighbors.
- Wiki-news-300d-1M is a pre-trained ML model built with the fastText open source library. It contains 1 million word vectors trained on Wikipedia 2017 articles, the UMBC...
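All three models above share the same core idea: each word is mapped to a dense vector, and words whose vectors point in similar directions are treated as related, which is how synonym candidates are proposed. The following is a minimal, self-contained sketch of that lookup using tiny hand-made vectors and cosine similarity; the vector values and the `propose_synonyms` helper are illustrative assumptions, not part of any of the libraries named above, where real models learn vectors with hundreds of dimensions from large corpora.

```python
import math

# Toy word vectors (hypothetical values, for illustration only).
# Real models such as Word2Vec, GloVe, or fastText learn these
# from large corpora; here "happy" and "glad" are deliberately
# given nearly parallel vectors.
vectors = {
    "happy": [0.90, 0.80, 0.10],
    "glad":  [0.85, 0.75, 0.15],
    "sad":   [-0.80, -0.70, 0.20],
    "table": [0.10, -0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity: 1.0 = same direction, -1.0 = opposite."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def propose_synonyms(word, topn=2):
    """Rank every other word by similarity to `word` (hypothetical helper)."""
    ranked = sorted(
        ((other, cosine(vectors[word], vec))
         for other, vec in vectors.items() if other != word),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return ranked[:topn]

# "glad" ranks first as the closest neighbor of "happy".
print(propose_synonyms("happy"))
```

A library such as gensim performs the same ranking over its full pre-trained vocabulary, which is what lets an augmentation wrapper swap a word for a near neighbor.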