Recent research has highlighted data sonification as a promising complement to visual representation, benefiting both data perception and interpretation. We present herakoi, a novel open-source software package that uses machine learning to enable real-time image sonification, with a focus on astronomical data. By tracking hand movements via a webcam and mapping them to image coordinates, herakoi translates the visual properties at the touched locations into sound, enabling users to "hear" images. Its swift responsiveness allows users to access the information encoded in astronomical images after only brief training, with high reliability and effectiveness. The software has shown promise in educational and outreach settings, making complex astronomical concepts more engaging and accessible to diverse audiences, including blind and visually impaired people. We also discuss future developments, such as the integration of large language and vision models to create a more interactive experience for interpreting astronomical data.
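To make the webcam-to-sound pipeline concrete, the following is a minimal sketch of the kind of sonification loop described above, not herakoi's actual implementation: it assumes MediaPipe Hands for fingertip tracking and an illustrative brightness-to-pitch mapping, and the image filename, frequency range, and mapping are all hypothetical.

```python
# Illustrative sketch of a webcam-driven image-sonification loop; this is
# NOT herakoi's actual code. It assumes MediaPipe Hands for fingertip
# tracking and maps pixel brightness to pitch, printing the would-be tone
# instead of synthesizing audio to keep dependencies minimal.
import cv2
import mediapipe as mp
import numpy as np

image = cv2.imread("galaxy.png", cv2.IMREAD_GRAYSCALE)  # placeholder image
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():  # stop with Ctrl+C
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Landmark 8 is the index fingertip, in normalized [0, 1] coords.
        tip = result.multi_hand_landmarks[0].landmark[8]
        # Map the webcam-space fingertip onto the sonified image's pixels.
        x = int(np.clip(tip.x, 0.0, 1.0) * (image.shape[1] - 1))
        y = int(np.clip(tip.y, 0.0, 1.0) * (image.shape[0] - 1))
        brightness = image[y, x] / 255.0
        # Hypothetical mapping: brighter pixel -> higher pitch (220-880 Hz).
        freq = 220.0 + 660.0 * brightness
        print(f"pixel ({x}, {y}), brightness {brightness:.2f} -> {freq:.0f} Hz")

cap.release()
```

In a full implementation the mapped value would drive an audio synthesizer rather than a print statement, and other visual properties (e.g. color) could plausibly modulate timbre or loudness.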