Machine Learning for Emotion Analysis in Python

By: Allan Ramsay, Tariq Ahmad
Overview of this book

Artificial intelligence and machine learning are the technologies of the future, and this is the perfect time to tap into their potential and add value to your business. Machine Learning for Emotion Analysis in Python helps you employ these cutting-edge technologies in your customer feedback system and, in turn, grow your business. With this book, you’ll take your foundational data science skills and extend them into the exciting realm of emotion analysis. By following a practical approach, you’ll turn customer feedback into meaningful insights, helping you make smart, data-driven business decisions. The book will show you how to preprocess data, build a serviceable dataset, and ensure top-notch data quality. Once you’re set up for success, you’ll explore advanced ML techniques, uncovering the concepts behind deep neural networks, support vector machines, conditional probabilities, and more. Finally, you’ll gain practical knowledge through in-depth use cases that show how experimental results can be turned into real-life applications and how emotion mining can help track short- and long-term changes in public opinion. By the end of this book, you’ll be well equipped to use emotion mining and analysis to drive business decisions.
Table of Contents (18 chapters)

Part 1: Essentials
Part 2: Building and Using a Dataset
Part 3: Approaches
Part 4: Case Study

References

To learn more about the topics that were covered in this chapter, take a look at the following resources:

  • Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł. and Polosukhin, I., 2017. Attention Is All You Need. Advances in Neural Information Processing Systems, 30.
  • Rothman, D., 2021. Transformers for Natural Language Processing: Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more. Packt Publishing Ltd.
  • Devlin, J., Chang, M. W., Lee, K. and Toutanova, K., 2018. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
  • Clark, K., Luong, M. T., Le, Q. V. and Manning, C. D., 2020. ELECTRA: Pre-training text encoders as discriminators rather than generators. arXiv preprint arXiv:2003.10555.
  • Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J. D., Dhariwal, P., Neelakantan, A., Shyam, P., et al., 2020. Language Models Are Few-Shot Learners. Advances in Neural Information Processing Systems, 33.
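
If you want to experiment with the transformer models cited above, the following is a minimal sketch, not taken from the book, that uses the Hugging Face transformers library to load a BERT-style encoder for emotion classification. The checkpoint name, the six-label setup, and the example sentence are illustrative assumptions; the classification head is randomly initialised until you fine-tune it on an emotion dataset.

# A minimal sketch (not from the book): score a sentence with a BERT-style
# sequence-classification model from the Hugging Face transformers library.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "bert-base-uncased"  # placeholder checkpoint; any fine-tuned emotion model works
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=6 assumes a six-way emotion scheme (e.g., anger, fear, joy, love, sadness, surprise);
# this head is randomly initialised and only becomes meaningful after fine-tuning.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=6)

inputs = tokenizer("I am thrilled with the new release!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)  # one probability per emotion label
print(probs)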