Machine Learning Engineering with Python - Second Edition

By Andrew P. McMahon
Overview of this book

The Second Edition of Machine Learning Engineering with Python is the practical guide that MLOps and ML engineers need to build solutions to real-world problems. It will provide you with the skills you need to stay ahead in this rapidly evolving field. The book takes an examples-based approach to help you develop your skills and covers the technical concepts, implementation patterns, and development methodologies you need.

You'll explore the key steps of the ML development lifecycle and create your own standardized "model factory" for training and retraining models. You'll learn to employ concepts like CI/CD and how to detect different types of drift. Get hands-on with the latest in deployment architectures and discover methods for scaling up your solutions.

This edition goes deeper into all aspects of ML engineering and MLOps, with an emphasis on the latest open-source and cloud-based technologies, including a completely revamped approach to advanced pipelining and orchestration techniques. With a new chapter on deep learning, generative AI, and LLMOps, you will learn to use tools like LangChain, PyTorch, and Hugging Face to leverage LLMs for supercharged analysis. You will explore AI assistants like GitHub Copilot to become more productive, then dive deep into the engineering considerations of working with deep learning.

Living it large with LLMs

At the time of writing, GPT-4 had been released only a few months previously, in March 2023, by OpenAI. This model is potentially the largest ML model ever developed, with a reported one trillion parameters, although OpenAI has not confirmed the exact number. Since then, Microsoft and Google have announced advanced chat capabilities using similarly large models in their product suites, and a raft of open-source packages and toolkits has been released. All of these solutions leverage some of the largest neural network models ever developed: LLMs. LLMs are part of an even wider class of models known as foundation models, which span not just text applications but video and audio as well. These models are roughly classified by the author as being too large for most organizations to consider training from scratch. This means organizations will either consume these models as third-party services or host and then fine-tune existing models. Solving this integration...
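The first of these integration patterns, consuming an LLM as a third-party service, usually amounts to sending a structured request to a hosted API. As a minimal sketch, the helper below builds a chat-completions payload in the style popularized by OpenAI's API; the model name, system prompt, and settings shown are illustrative placeholders, not recommendations from the book.

```python
import json


def build_chat_request(prompt: str,
                       model: str = "gpt-4",
                       temperature: float = 0.2) -> dict:
    """Build a minimal OpenAI-style chat-completions payload.

    The payload shape (model, messages, temperature) follows the
    widely used chat-completions convention; the specific model name
    and system prompt here are hypothetical examples.
    """
    return {
        "model": model,
        "messages": [
            # A system message sets the assistant's behavior;
            # the user message carries the actual query.
            {"role": "system", "content": "You are a helpful analyst."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
    }


# In a real integration this dict would be POSTed (with an API key)
# to the provider's chat-completions endpoint.
payload = build_chat_request("Summarize last quarter's churn figures.")
print(json.dumps(payload, indent=2))
```

Keeping payload construction in a small, testable function like this makes it easier to swap providers later, which matters given how quickly the hosted-LLM landscape is changing.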