We present an example implementation of the previously published Malware Analysis Tool Evaluation Framework (MATEF) to explore whether a systematic basis for trusted practice can be established for evaluating the malware artefact detection tools used within a forensic investigation. The application of the framework is demonstrated through a case study that presents the design of two example experiments, which consider the hypotheses: (1) Is there an optimal length of time for which to execute malware for analysis? and (2) Is there any observable difference between tools when observing malware behaviour? The experiments used a sample of 4,800 files known to produce network artefacts, selected at random from a library of over 350,000 malware binaries. The tools Process Monitor and TCPVCon, popular in the digital forensic community, were chosen as the subjects for investigating these two questions. The results indicate that it is possible to use the MATEF to identify an optimal execution time for a software tool used to monitor activity generated by malware.