Cloud Forensics Demystified

By: Ganesh Ramakrishnan, Mansoor Haqanee

Overview of this book

As organizations embrace cloud-centric environments, it becomes imperative for security professionals to master the skills of effective cloud investigation. Cloud Forensics Demystified addresses this pressing need, explaining how to use cloud-native tools and logs together with traditional digital forensic techniques for a thorough cloud investigation. The book begins by giving you an overview of cloud services, followed by a detailed exploration of the tools and techniques used to investigate popular cloud platforms such as Amazon Web Services (AWS), Azure, and Google Cloud Platform (GCP). Progressing through the chapters, you’ll learn how to investigate Microsoft 365, Google Workspace, and containerized environments such as Kubernetes. Throughout, the chapters emphasize the significance of the cloud, explaining which tools and logs need to be enabled for investigative purposes and demonstrating how to integrate them with traditional digital forensic tools and techniques to respond to cloud security incidents. By the end of this book, you’ll be well-equipped to handle security breaches in cloud-based environments and have a comprehensive understanding of the essential cloud-based logs vital to your investigations. This knowledge will enable you to swiftly acquire and scrutinize artifacts of interest in cloud security incidents.
Table of Contents (18 chapters)

Part 1: Cloud Fundamentals
Part 2: Forensic Readiness: Tools, Techniques, and Preparation for Cloud Forensics
Part 3: Cloud Forensic Analysis – Responding to an Incident in the Cloud

Logging Dataflow pipelines

Dataflow pipelines provide streaming and batch data processing capabilities at scale. GCP’s Dataflow service is based on Apache Beam. Using Dataflow applications, logs can be streamed at variable volumes in near real time.
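To make this concrete, the following is a minimal sketch of an Apache Beam pipeline of the kind Dataflow executes. The project ID, region, and bucket paths are hypothetical placeholders, not values from the book; running it with the DataflowRunner is what causes GCP to provision the worker resources and emit the logs discussed in this section.

```python
# Minimal Apache Beam pipeline sketch (hypothetical project and bucket names).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",                   # execute on GCP Dataflow
    project="forensics-case-001",              # hypothetical project ID
    region="us-central1",
    temp_location="gs://example-bucket/tmp",   # hypothetical staging bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.log")
        | "CountLines" >> beam.combiners.Count.Globally()
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/line_count")
    )
```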

Any actions performed on GCP Dataflow are recorded by default and can be reviewed in Logs Explorer. Through Logs Explorer, investigators can detect changes to Dataflow parameters and determine whether unauthorized users altered the pipeline.
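As an illustration, here is a minimal sketch, using the google-cloud-logging Python client, of how that Dataflow activity could be pulled programmatically rather than through the Logs Explorer UI. The project ID and the exact filter values (the dataflow_step resource type and the Admin Activity audit log name) are assumptions to adapt to the environment under investigation.

```python
# Sketch: list recent Dataflow Admin Activity audit log entries.
from google.cloud import logging

client = logging.Client(project="forensics-case-001")  # hypothetical project ID

# Admin Activity entries for Dataflow jobs, e.g. job creation, parameter
# changes, or cancellations performed by any principal.
dataflow_filter = (
    'resource.type="dataflow_step" AND '
    'logName:"cloudaudit.googleapis.com%2Factivity"'
)

for entry in client.list_entries(filter_=dataflow_filter,
                                 order_by=logging.DESCENDING,
                                 max_results=50):
    print(entry.timestamp, entry.log_name, entry.payload)
```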

Note that a Docker instance forms the base of any Dataflow pipeline’s operations. Therefore, investigators must also examine the logs emitted by the GKE cluster and the GCE Instance Group Manager. GCP relies on the Instance Group Manager to automatically create and deploy the managed VMs that run the containers (GKE), handling instance resourcing.
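A similar query can be broadened to that supporting infrastructure. The sketch below is an assumption-laden example: the resource types (Instance Group Manager, worker VMs, and container logs) and the time window are placeholders that should be scoped to the specific incident.

```python
# Sketch: pull logs from the worker infrastructure backing a Dataflow job.
from google.cloud import logging

client = logging.Client(project="forensics-case-001")  # hypothetical project ID

worker_filter = (
    '(resource.type="gce_instance_group_manager" '
    'OR resource.type="gce_instance" '
    'OR resource.type="k8s_container") '
    'AND timestamp>="2024-01-01T00:00:00Z"'  # scope to the incident window
)

for entry in client.list_entries(filter_=worker_filter,
                                 order_by=logging.DESCENDING,
                                 max_results=100):
    print(entry.timestamp, entry.resource.type, entry.payload)
```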

The following figure outlines some sample resources required for successful Dataflow pipeline execution. Like Syslog, Dataflow events are tagged...