
Hyper-V 2016 Best Practices

By: Romain Serre, Benedict Berger
Overview of this book

Hyper-V Server and Windows Server 2016 with Hyper-V provide best-in-class virtualization capabilities. Hyper-V is a Windows-based, very cost-effective virtualization solution with easy-to-use and well-known administrative consoles. This book will assist you in designing, implementing, and managing highly effective and highly available Hyper-V infrastructures. With an example-oriented approach, it covers tips and suggestions for configuring Hyper-V and provides readers with proven, real-world solutions. The book begins with deploying single clusters of highly available Hyper-V systems, including the new Nano Server. This is followed by steps to configure the Hyper-V infrastructure components such as storage and network. It also touches on necessary processes such as backup and disaster recovery for optimal configuration. The book not only shows you what to do and how to plan for the different scenarios, but also provides in-depth configuration options. These scalable and automated configurations are then optimized via performance tuning and central management, ensuring that your applications always perform at their best.

Data deduplication


Windows Server 2016 with Hyper-V offers built-in deduplication at no extra charge. It's a great way to reduce your storage capacity footprint with very little configuration. However, data deduplication still comes at a price: it requires additional I/O capacity. Therefore, on a general-use file server, deduplication does not touch hot data until files have reached a configured minimum age. Besides the I/O hit, volumes with active deduplication fragment more easily, causing single-file operations to take longer on deduplicated volumes. The deduplication engine takes some precautions to avoid a big performance hit; for example, every block referenced more than 100 times is written a second time.
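The behavior described above can be configured with the built-in deduplication cmdlets. A minimal sketch, assuming the virtual disks live on volume E: (the drive letter and the three-day file age are illustrative values, not recommendations from this book):

```powershell
# Install the deduplication feature (assumes a full Windows Server 2016 install)
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on the volume; the HyperV usage type
# applies defaults suited to VDI virtual disk workloads
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Only deduplicate files older than three days, leaving hot data untouched
Set-DedupVolume -Volume "E:" -MinimumFileAgeDays 3

# Review space savings after the first optimization job has run
Get-DedupStatus -Volume "E:" | Format-List
```

The `-MinimumFileAgeDays` setting is what keeps hot data out of deduplication's reach, as described above.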

Real-life experience tells us that the overall gain in saved space outweighs the performance cost on file servers, library servers, and VDI Hyper-V hosts. Windows data deduplication against running VMs with server workloads is not supported. Before using deduplication, you can test how much space saving dedup...
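For estimating savings before enabling the feature, Windows Server ships an evaluation tool, DDPEval.exe, alongside the deduplication feature; it analyzes a volume or folder without changing any data. A minimal sketch (the E:\VMs path is an example, not a path from this book):

```powershell
# Estimate potential deduplication savings for a folder of virtual disks
& "$env:SystemRoot\System32\DDPEval.exe" E:\VMs
```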