Intensive care units (ICUs) are complex and data-rich environments. Data routinely collected in the ICU provide tremendous opportunities for machine learning, but their use comes with significant challenges. Complex problems may require additional input from humans, which can be provided through a process of data annotation. Annotation is a complex, time-consuming process that requires domain expertise and technical proficiency, and existing data annotation tools fail to provide an effective solution to this problem. In this study, we investigated clinicians' approach to the annotation task. We focused on establishing the characteristics of the annotation process in the context of clinical data and on identifying differences in the annotation workflow between staff roles. The overall goal was to elicit requirements for a software tool that could facilitate effective and time-efficient data annotation. We conducted an experiment in which ICU clinicians annotated printed sheets of data. The participants were observed during the task, and their actions were analysed in the context of Norman's Interaction Cycle to establish the requirements for the digital tool. The annotation process followed a constant loop of annotation and evaluation, during which participants incrementally analysed and annotated the data. No distinguishable differences were identified in how different staff roles annotate data. We observed preferences for different methods of applying annotations, which varied between participants and admissions. We established 11 requirements for a digital data annotation tool for the healthcare setting. In summary, this manual data annotation activity allowed us to characterise the clinicians' approach to annotation and to elicit 11 key requirements for effective data annotation software.