Two other approaches merit consideration when collecting HDFS data: custom-developed Java programs and third-party collection tools. The first approach collects HDFS data using the Hadoop Java API and standard Java methods. Collecting HDFS data through shell commands has two drawbacks: the Java Virtual Machine (JVM) must start and shut down for every copy command, which considerably slows the copy process, and MD5 hashes can only be computed after a file has been copied. The Hadoop Java API provides methods for calculating the MD5 of files inside HDFS, and a custom program can perform all copies within a single JVM session. The drawback is that custom-developed Java solutions require significant testing, and some Hadoop Java API methods are still under development. For this reason, the investigator should carefully develop and test any program used to perform the collection to avoid unintended behavior.
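The single-JVM, hash-while-copying pattern can be sketched as follows. This is a minimal illustration using only the Java standard library: the `HashingCopy` class and `copyWithMd5` method are hypothetical names, and for brevity the streams come from `java.nio.file.Files` rather than HDFS. In an actual collection, the input stream would instead be obtained from Hadoop's `org.apache.hadoop.fs.FileSystem.open(Path)`, and all files would be copied in a loop inside one JVM session; the digest-during-copy technique is identical.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.DigestInputStream;
import java.security.MessageDigest;

// Sketch: copy a file while computing its MD5 in the same pass,
// so the hash is available the moment the copy completes.
public class HashingCopy {

    public static String copyWithMd5(Path src, Path dst) throws Exception {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        // DigestInputStream updates the digest as bytes stream through.
        // In an HDFS collection, replace Files.newInputStream(src) with
        // a stream from FileSystem.open(...) (assumption for this sketch).
        try (InputStream in = new DigestInputStream(Files.newInputStream(src), md5);
             OutputStream out = Files.newOutputStream(dst)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n); // each chunk is hashed and written once
            }
        }
        // Render the 16-byte digest as a lowercase hex string.
        StringBuilder hex = new StringBuilder();
        for (byte b : md5.digest()) {
            hex.append(String.format("%02x", b & 0xff));
        }
        return hex.toString();
    }
}
```

Because the digest is computed from the same bytes being written out, no second read of the evidence file is needed, and the whole collection runs in one JVM rather than paying JVM startup costs per file as shell-based copying does.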
Big Data Forensics: Learning Hadoop Investigations