Automatic medical text simplification can assist providers with patient-friendly communication and make medical texts more accessible, thereby improving health literacy. However, curating a high-quality corpus for this task requires the supervision of medical experts. In this work, we present $\textbf{Med-EASi}$ ($\underline{\textbf{Med}}$ical dataset for $\underline{\textbf{E}}$laborative and $\underline{\textbf{A}}$bstractive $\underline{\textbf{Si}}$mplification), a uniquely crowdsourced and finely annotated dataset for supervised simplification of short medical texts. Its $\textit{expert-layman-AI collaborative}$ annotations facilitate $\textit{controllability}$ over text simplification by marking four kinds of textual transformations: elaboration, replacement, deletion, and insertion. To learn medical text simplification, we fine-tune T5-large with four different styles of input-output combinations, leading to two control-free and two controllable versions of the model. We add two types of $\textit{controllability}$ to text simplification using a multi-angle training approach: $\textit{position-aware}$, which uses in-place annotated inputs and outputs, and $\textit{position-agnostic}$, where the model only knows the contents to be edited, but not their positions. Our results show that our fine-grained annotations improve learning compared to the unannotated baseline. Furthermore, $\textit{position-aware}$ control generates better simplifications than $\textit{position-agnostic}$ control. The data and code are available at https://github.com/Chandrayee/CTRL-SIMP.
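As a rough illustration of the training setup described above, the sketch below shows one fine-tuning step of T5-large on a position-aware example, where edit spans are marked in place in both the expert source and the simplified target. This is not the authors' released code: the task prefix and the tag markup (e.g., `<repl>`) are hypothetical stand-ins for Med-EASi's actual annotation format, which is documented in the repository linked above.

```python
# Minimal sketch, assuming Hugging Face Transformers and a hypothetical
# in-place tag format for the "replacement" transformation.
from transformers import T5TokenizerFast, T5ForConditionalGeneration

tokenizer = T5TokenizerFast.from_pretrained("t5-large")
model = T5ForConditionalGeneration.from_pretrained("t5-large")

# Position-aware: the span to be edited is tagged in place in the source,
# and the corresponding simplified span is tagged in place in the target.
source = "simplify: Patients with <repl>dyspnea</repl> should seek immediate care."
target = "Patients with <repl>shortness of breath</repl> should seek immediate care."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

# One supervised step; in practice this runs inside a Trainer/optimizer loop.
loss = model(**inputs, labels=labels).loss
loss.backward()
```

Under this framing, the position-agnostic variant would instead list the contents to be edited (e.g., the term "dyspnea") alongside the untagged source, leaving the model to locate the span itself; the control-free variants drop the annotations entirely.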