References
[1] Vectors: https://en.wikipedia.org/wiki/Euclidean_vector
[2] Scalability of Semantic Analysis in Natural Language Processing: https://radimrehurek.com/phd_rehurek.pdf
[3] Latent Dirichlet allocation: https://en.wikipedia.org/wiki/Latent_Dirichlet_allocation
[4] Latent semantic indexing: https://en.wikipedia.org/wiki/Latent_semantic_analysis#Latent_semantic_indexing
[5] Random Projection: https://en.wikipedia.org/wiki/Random_projection
[6] Stanford TMT: https://nlp.stanford.edu/software/tmt/tmt-0.4/
[7] Gensim notebooks: https://github.com/RaRe-Technologies/gensim/tree/develop/docs/notebooks
[8] Jupyter Notebooks: http://jupyter-notebook.readthedocs.io/en/stable/notebook.html
[9] Vector Space Models: https://en.wikipedia.org/wiki/Vector_space_model
[10] Bayesian Probability: https://en.wikipedia.org/wiki/Bayesian_probability
[11] TF-IDF: https://en.wikipedia.org/wiki/Tf-idf
[12] The amazing power of word vectors: https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/
[13...