Article ID Journal Published Year Pages File Type
6884452 Digital Investigation 2018 14 Pages PDF
Abstract
Today, the pervasiveness and low cost of storage disk drives have made digital forensics a cumbersome, slow, and expensive task. Since storage drives are huge reservoirs of digital evidence, their examination requires an enormous amount of analysis time and computing resources. To efficiently examine large data volumes, random sector sampling, a subpart of forensic triage, has been used in the literature to obtain admissible investigation outcomes. Conventionally, random sampling imposes the primary requirement of extensive seek and read requests. This paper presents a unique framework that efficiently combines sector hashing and random sampling to investigate the existence of target data traces by independently exploiting regions of the suspected storage drive. No prior work has specifically quantified the number of random samples required to hit desired target data traces in storage drives. In addition, a standard percentage of random samples, which may be necessary and sufficient to validate the existence of target data on a drive, is analyzed and proposed. Several experiments were devised to evaluate the method using storage media and target data of different capacities and sizes. It was observed that the size of the target data is an important factor in determining the percentage of sector samples required to effectively examine a storage disk drive. Finally, in view of the quantified percentage of random samples, a case study is presented to evaluate the adequacy of the derived metrics.
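The combination of random sector sampling and sector hashing described in the abstract can be illustrated with a short sketch. This is not the authors' implementation: the helper names, the 512-byte sector size, and the use of SHA-256 are assumptions for illustration. The miss-probability formula is the standard without-replacement calculation for random sampling: with N total sectors, T target sectors, and n samples, P(miss) = C(N-T, n) / C(N, n).

```python
import hashlib
import math
import random

SECTOR_SIZE = 512  # bytes; a common logical sector size (assumption)

def sample_and_hash(drive, num_sectors, num_samples, known_hashes, rng=random):
    """Hypothetical helper: read randomly chosen sectors from a seekable
    drive image and report offsets whose hashes match a set of known
    target-sector hashes."""
    hits = []
    for offset in rng.sample(range(num_sectors), num_samples):
        drive.seek(offset * SECTOR_SIZE)
        digest = hashlib.sha256(drive.read(SECTOR_SIZE)).hexdigest()
        if digest in known_hashes:
            hits.append(offset)
    return hits

def miss_probability(total_sectors, target_sectors, num_samples):
    """Probability that no sampled sector belongs to the target when
    sampling without replacement: C(N - T, n) / C(N, n)."""
    return (math.comb(total_sectors - target_sectors, num_samples)
            / math.comb(total_sectors, num_samples))
```

Under this model, larger targets occupy more sectors, so fewer samples are needed for a likely hit, which is consistent with the abstract's observation that target-data size drives the required sampling percentage.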
Related Topics
Physical Sciences and Engineering; Computer Science; Computer Networks and Communications