Machine Learning, Data Science and Generative AI with Python [Video]

By: Frank Kane

Overview of this course

This course begins with a Python crash course and then guides you through setting up machine learning environments on Windows PCs, Linux desktops, and Macs. After the setup, we delve into machine learning, AI, and data mining techniques, including deep learning and neural networks with TensorFlow and Keras; generative models with variational autoencoders and generative adversarial networks; data visualization in Python with Matplotlib and Seaborn; transfer learning, sentiment analysis, image recognition, and classification; and regression analysis, K-Means Clustering, Principal Component Analysis, training/testing and cross-validation, Bayesian methods, decision trees, and random forests. Additionally, we cover multiple regression, multilevel models, support vector machines, reinforcement learning, collaborative filtering, K-Nearest Neighbors, the bias/variance tradeoff, ensemble learning, term frequency/inverse document frequency, experimental design and A/B testing, feature engineering, hyperparameter tuning, and much more!

A dedicated section on machine learning with Apache Spark shows how to scale these techniques up to "big data" analyzed on a computing cluster. The course also covers the Transformer architecture, examines the role of self-attention in AI, explores GPT applications, and includes hands-on practice fine-tuning Transformers for tasks such as movie review analysis. Finally, we look at integrating the OpenAI API for ChatGPT, creating images with DALL-E, understanding embeddings, and leveraging audio-to-text to enhance AI applications with real-world data and moderation.
Table of Contents (15 chapters)
Chapter 15
You Made It!
Chapter 12
Generative AI: GPT, ChatGPT, Transformers, Self-Attention Based Neural Networks
Section 9
[Activity] Masked, Multi-Headed Self Attention with BERT, BERTViz, and exBERT
In a hands-on Google Colab notebook demonstration, we step through the process of self-attention within Transformers using the BERT model. Through BertViz, this interactive session shows how context determines the significance of each word, providing a practical understanding of the technology powering modern AI language models.
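To make the activity concrete, here is a minimal sketch of the kind of visualization this lesson involves, assuming a Colab or Jupyter environment with the bertviz and transformers packages installed; the model checkpoint and example sentence below are illustrative assumptions, not necessarily the exact ones used in the video.

```python
# Minimal sketch: visualizing BERT's multi-headed self-attention with BertViz.
# Assumes a Colab/Jupyter environment with `pip install bertviz transformers` done.
from transformers import BertTokenizer, BertModel
from bertviz import head_view

model_name = "bert-base-uncased"  # assumed checkpoint, for illustration only
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name, output_attentions=True)

# An ambiguous pronoun makes the role of context easy to see:
# which earlier tokens does "it" attend to?
sentence = "The animal didn't cross the street because it was too tired."
inputs = tokenizer(sentence, return_tensors="pt")
outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer,
# each shaped [batch, num_heads, num_tokens, num_tokens].
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
head_view(outputs.attentions, tokens)  # renders an interactive attention diagram
```

In the rendered view, switching between layers and heads shows that different heads learn different relationships; some, for example, connect the pronoun "it" back to its likely referent, which is exactly the context-dependence the activity explores.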