Big Data Forensics: Learning Hadoop Investigations

Other HDFS collection approaches


Two other approaches warrant consideration when collecting HDFS data: custom-developed Java programs and third-party collections. The first approach collects HDFS data through the Hadoop Java API and standard Java methods. Collecting HDFS data through shell commands has two drawbacks: the Java Virtual Machine (JVM) must start and shut down for every copy command, which considerably slows the copy process, and MD5 values can only be computed after a file has been copied. Hadoop's Java API provides methods for calculating the MD5 of files inside HDFS, and a custom program can perform all copies within a single JVM session. The drawback is that custom-developed Java solutions require significant testing, and some Hadoop Java API methods are still under development. For this reason, the investigator should carefully develop and test any program used to perform the collection to avoid unintended behavior...
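
The following listing is a minimal sketch of what such a single-JVM collection program might look like, assuming the Hadoop client libraries are on the classpath and the cluster configuration files (core-site.xml and hdfs-site.xml) are reachable. The class name HdfsCollector, the command-line arguments, and the flat output format are illustrative assumptions, not part of any particular tool; each file is streamed once so that the copy and the MD5 of its contents are produced in the same pass, without restarting the JVM per file.

// Illustrative sketch only: a single-JVM HDFS collection pass that copies
// files to a local evidence directory and hashes them while streaming.
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCollector {
    public static void main(String[] args) throws Exception {
        // Loads the cluster configuration from the classpath to locate the NameNode.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path sourceDir = new Path(args[0]);                  // HDFS directory to collect (assumed argument)
        java.nio.file.Path targetDir = Paths.get(args[1]);   // local evidence directory (assumed argument)

        for (FileStatus status : fs.listStatus(sourceDir)) {
            if (!status.isFile()) {
                continue; // a full collector would recurse into subdirectories
            }
            MessageDigest md5 = MessageDigest.getInstance("MD5");
            java.nio.file.Path target = targetDir.resolve(status.getPath().getName());

            // Stream the file once: hash and copy in the same pass, all within one JVM session.
            try (InputStream in = fs.open(status.getPath());
                 OutputStream out = Files.newOutputStream(target)) {
                byte[] buffer = new byte[64 * 1024];
                int read;
                while ((read = in.read(buffer)) > 0) {
                    md5.update(buffer, 0, read);
                    out.write(buffer, 0, read);
                }
            }

            // Record the content MD5 alongside the source path for later verification.
            StringBuilder hex = new StringBuilder();
            for (byte b : md5.digest()) {
                hex.append(String.format("%02x", b));
            }
            System.out.println(hex + "  " + status.getPath());
        }
        fs.close();
    }
}

A production collection program would also need to handle recursion, permissions, logging of collection metadata, and verification of the local copies; the sketch above only demonstrates why the single-session approach avoids the per-file JVM startup and post-copy hashing costs described earlier.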